910 results for Time delays
Abstract:
The Internet presents a constantly evolving frontier for criminology and policing, especially in relation to online predators: paedophiles operating on the Internet for safer access to children, child pornography and networking opportunities with other online predators. The goals of this qualitative study are to undertake behavioural research, identifying personality types and archetypes of online predators and comparing and contrasting them with behavioural profiles and other psychological research on offline paedophiles and sex offenders. It is also an endeavour to gather intelligence on the technology used by online predators and to conduct observational research on the social structures of online predator communities. These goals were achieved through the covert monitoring and logging of public activity within four Internet Relay Chat (IRC) chatrooms themed around child sexual abuse and located on the Undernet network. Five days of monitoring were conducted on these four chatrooms, from Wednesday 1 April to Sunday 5 April 2009; the raw data were collated and analysed. The analysis identified four personality types (the gentleman predator, the sadist, the businessman and the pretender) and eight archetypes consisting of the groomers, dealers, negotiators, roleplayers, networkers, chat requestors, posters and travellers. The characteristics and traits of these personality types and archetypes, which were extracted from the literature dealing with offline paedophiles and sex offenders, are detailed and contrasted against the online sexual predators identified within the chatrooms, revealing many similarities and interesting differences, particularly with the businessman and pretender personality types.
These personality types and archetypes were illustrated by selecting users who displayed the appropriate characteristics and tracking them through the four chatrooms, revealing intelligence data on the use of proxy servers (especially via the Tor software) and other security strategies such as Undernet's host masking service. Name and age changes, which are used as a potential sexual grooming tactic, were also revealed through the use of Analyst's Notebook software, and ISP information revealed the likelihood that many online predators were not using any safety mechanisms and were relying on the anonymity of the Internet. The activities of these online predators were analysed, especially in regard to child sexual grooming and the 'posting' of child pornography, revealing some of the methods by which online predators utilised new Internet technologies to sexually groom and abuse children (using technologies such as instant messengers, webcams and microphones) as well as to store and disseminate illegal materials on image sharing websites and via peer-to-peer software such as Gigatribe. Analysis of the social structures of the chatrooms was also carried out, and the community functions and characteristics of each chatroom explored. The findings of this research have indicated several opportunities for further research. As a result of this research, recommendations are given on policy, prevention and response strategies with regard to online predators.
Abstract:
This paper presents a method of voice activity detection (VAD) suitable for high-noise scenarios, based on the fusion of two complementary systems. The first system uses a proposed non-Gaussianity score (NGS) feature based on normal probability testing. The second system employs a histogram distance score (HDS) feature that detects changes in the signal by conducting a template-based similarity measure between adjacent frames. The decision outputs of the two systems are then merged using an opening-by-reconstruction fusion stage. The accuracy of the proposed method was compared to several baseline VAD methods on a database created using real recordings of a variety of high-noise environments.
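As a rough illustration of the NGS idea described above, a per-frame non-Gaussianity score can be computed with a standard normality test. The frame length and the choice of the D'Agostino-Pearson statistic below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy import stats

def non_gaussianity_scores(signal, frame_len=512):
    """Score each frame by how strongly its sample distribution departs
    from a normal distribution; noise-only frames tend to look more
    Gaussian than frames containing speech-like structure."""
    n_frames = len(signal) // frame_len
    scores = []
    for i in range(n_frames):
        frame = signal[i * frame_len:(i + 1) * frame_len]
        # D'Agostino-Pearson test statistic: larger => less Gaussian
        stat, _pvalue = stats.normaltest(frame)
        scores.append(stat)
    return np.array(scores)
```

A sinusoidal (strongly non-Gaussian) signal scores far higher than white Gaussian noise, which is the separation a VAD feature of this kind relies on.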
Abstract:
The city of Scottsdale, Arizona, implemented the first fixed-camera photo Speed Enforcement demonstration Program (SEP) on a US freeway in 2006. A comprehensive before-and-after analysis of the impact of the SEP on safety revealed significant reductions in crash frequency and severity, which indicates that the SEP is a promising countermeasure for improving safety. However, there is often a trade-off between safety and mobility when safety investments are considered. As a result, identifying safety countermeasures that both improve safety and reduce Travel Time Variability (TTV) is a desirable goal for traffic safety engineers. This paper reports on the analysis of the mobility impacts of the SEP by simulating the traffic network with and without the SEP, calibrated to real-world conditions. The simulation results show that the SEP decreased the TTV: the risk of unreliable travel was at least 23% higher in the 'without SEP' scenario than in the 'with SEP' scenario. In addition, the total Travel Time Savings (TTS) from the SEP was estimated to be at least 569 vehicle-hours/year. Consequently, the SEP is an efficient countermeasure not only for reducing crashes but also for improving mobility through TTS and reduced TTV.
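Travel Time Variability comparisons of the kind described above rest on a reliability measure. The paper's exact metric is not stated here; one common choice, the buffer index, can be sketched as:

```python
import numpy as np

def buffer_index(travel_times):
    """Buffer index: the extra time a traveller must budget, beyond the
    mean trip time, to arrive on time 95% of the time, expressed as a
    fraction of the mean. Higher values mean less reliable travel.
    (Illustrative measure; not necessarily the one used in the paper.)"""
    tt = np.asarray(travel_times, dtype=float)
    return (np.percentile(tt, 95) - tt.mean()) / tt.mean()
```

A constant travel-time series gives an index of zero; occasional long delays push it upward even when the mean barely moves.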
Abstract:
This paper presents a method of voice activity detection (VAD) for high-noise scenarios, using a noise-robust voiced speech detection feature. The developed method is based on the fusion of two systems. The first system utilises the maximum peak of the normalised time-domain autocorrelation function (MaxPeak). The second system uses a novel combination of the cross-correlation and zero-crossing rate of the normalised autocorrelation to approximate a measure of signal pitch and periodicity (CrossCorr) that is hypothesised to be noise robust. The score outputs of the two systems are then merged using weighted sum fusion to create the proposed autocorrelation zero-crossing rate (AZR) VAD. The accuracy of AZR was compared to state-of-the-art and standardised VAD methods and was shown to outperform the best performing system with an average relative improvement of 24.8% in half-total error rate (HTER) on the QUT-NOISE-TIMIT database, created using real recordings from high-noise environments.
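A minimal sketch of the MaxPeak feature and the weighted-sum fusion stage described above. The lag range, frame length and equal fusion weights are illustrative assumptions, not the calibrated values from the paper.

```python
import numpy as np

def max_autocorr_peak(frame, min_lag=20, max_lag=200):
    """Maximum peak of the normalised autocorrelation within a lag range
    covering plausible pitch periods; voiced (periodic) frames give
    values near 1, unvoiced/noise frames give much smaller values."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode='full')[len(frame) - 1:]
    ac = ac / (ac[0] + 1e-12)          # normalise so lag 0 == 1
    return ac[min_lag:max_lag].max()

def fuse_scores(score_a, score_b, w=0.5):
    """Weighted-sum fusion of two per-frame scores (weights assumed equal)."""
    return w * score_a + (1 - w) * score_b
```

On a 400-sample frame, a pure tone with a 50-sample period yields a peak near 0.9, while white noise stays well below 0.5, so even a simple threshold on the fused score separates the two.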
Abstract:
We assess the performance of an exponential integrator for advancing stiff, semidiscrete formulations of the unsaturated Richards equation in time. The scheme is of second order and explicit in nature, but requires the action of the matrix function φ(A), where φ(z) = [exp(z) - 1]/z, on a suitably defined vector v at each time step. When the matrix A is large and sparse, φ(A)v can be approximated by Krylov subspace methods that require only matrix-vector products with A. We prove that, despite the use of this approximation, the scheme remains second order. Furthermore, we provide a practical variable-stepsize implementation of the integrator by deriving an estimate of the local error that requires only a single additional function evaluation. Numerical experiments performed on two-dimensional test problems demonstrate that this implementation outperforms second-order, variable-stepsize implementations of the backward differentiation formulae.
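For small dense test cases, the action φ(A)v can be evaluated exactly via a standard augmented-matrix identity, which is handy for verifying a Krylov approximation; this is a checking aid under that identity, not the integrator's large-scale implementation.

```python
import numpy as np
from scipy.linalg import expm

def phi_action(A, v):
    """Compute phi(A) @ v with phi(z) = (exp(z) - 1)/z using the
    augmented-matrix identity: the top-right block of
    expm([[A, v], [0, 0]]) equals phi(A) @ v."""
    n = A.shape[0]
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = A
    M[:n, n] = v
    return expm(M)[:n, n]
```

For a diagonal A the result reduces to applying the scalar φ entrywise, e.g. φ(1) = e - 1 and φ(-2) = (1 - e⁻²)/2, which gives an immediate sanity check.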
Abstract:
In total, 782 Escherichia coli strains originating from various host sources have been analyzed in this study by using a highly discriminatory single-nucleotide polymorphism (SNP) approach. A set of eight SNPs, with a discrimination value (Simpson's index of diversity [D]) of 0.96, was determined using the Minimum SNPs software, based on sequences of housekeeping genes from the E. coli multilocus sequence typing (MLST) database. Allele-specific real-time PCR was used to screen 114 E. coli isolates from various fecal sources in Southeast Queensland (SEQ). The combined analysis of both the MLST database and SEQ E. coli isolates using eight high-D SNPs resolved the isolates into 74 SNP profiles. The data obtained suggest that SNP typing is a promising approach for the discrimination of host-specific groups and allows for the identification of human-specific E. coli in environmental samples. However, a more diverse E. coli collection is required to determine animal- and environment-specific E. coli SNP profiles due to the abundance of human E. coli strains (56%) in the MLST database.
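The discrimination value cited above is Simpson's index of diversity; its standard computation over SNP-profile group sizes can be sketched as follows (the example counts in the test are illustrative, not the study's data).

```python
def simpsons_d(counts):
    """Simpson's index of diversity over group sizes: the probability
    that two isolates drawn at random (without replacement) fall into
    different typing groups.  D = 1 - sum(n_j*(n_j-1)) / (N*(N-1))."""
    n = sum(counts)
    return 1 - sum(c * (c - 1) for c in counts) / (n * (n - 1))
```

D approaches 1 when every isolate has its own profile and 0 when all isolates share one profile, which is why a high-D SNP set is sought for discriminatory typing.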
Abstract:
As AITPM National President, I was invited by Queensland’s Premier, Hon. Anna Bligh MP, as an audience guest to People’s Question Time on Wednesday 24 March 2010, which focused on ‘The Challenges and Opportunities of Population Growth in Queensland’. On the panel were: Premier and Minister for the Arts, Anna Bligh; Minister for Climate Change and Sustainability, Kate Jones; Minister for Infrastructure and Planning, Stirling Hinchliffe; Michael Rayner – Growth Management Summit Advisory Panel, Principal Director, Cox Rayner Architects; and Greg Hallam – Executive Director, Local Government Association of Queensland. The moderator for this session was Law Academic Erin O’Brien, of Queensland University of Technology.
Abstract:
The common approach to estimating bus dwell time at a BRT station is to apply the traditional dwell time methodology derived for suburban bus stops. Although sensitive to boarding and alighting passenger numbers and, to some extent, to the fare collection media, these traditional dwell time models do not account for platform crowding. Moreover, they fall short in accounting for the effects of passengers walking along a relatively long BRT platform. Using the experience from Brisbane busway (BRT) stations, a new variable, Bus Lost Time (LT), is introduced into the traditional dwell time model. The bus lost time variable captures the impact of passenger walking and platform crowding on bus dwell time, two characteristics which differentiate a BRT station from a bus stop. This paper reports the development of a methodology to estimate the bus lost time experienced by buses at a BRT platform. Results were compared with the Transit Capacity and Quality of Service Manual (TCQSM) approach to dwell time and station capacity estimation. When the bus lost time was used in dwell time calculations, it was found that the BRT station platform capacity was reduced by 10.1%.
Abstract:
The common approach to estimating bus dwell time at a BRT station platform is to apply the traditional dwell time methodology derived for suburban bus stops. Current dwell time models are sensitive to bus type and fare collection policy, along with the number of boarding and alighting passengers. However, they fall short in accounting for the effects of passengers walking along a relatively long BRT station platform. Analysis presented in this paper shows that the average walking time of a passenger at a BRT platform is 10 times that at a bus stop. The requirement of walking to the bus entry door at the BRT station platform may lead to the bus experiencing a higher dwell time. This paper presents a theory for a BRT network which explains the loss of station capacity during peak period operation. It also highlights shortcomings of presently available bus dwell time models suggested for the analysis of BRT operation.
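The extended dwell-time idea described in this and the preceding abstract can be sketched as a conventional linear dwell model with an additive lost-time term. The per-passenger service times and door-cycle time below are illustrative assumptions, not the calibrated Brisbane values.

```python
def dwell_time(boarding, alighting, t_board=3.0, t_alight=2.0,
               t_open_close=4.0, lost_time=0.0):
    """Sketch of a dwell-time model extended with a Bus Lost Time term
    capturing platform walking and crowding at a BRT station.
    Passenger service at the busiest door governs (max of boarding and
    alighting flows); all coefficients are in seconds and illustrative."""
    service = max(boarding * t_board, alighting * t_alight)
    return t_open_close + service + lost_time
```

The point of the extension is visible directly: with a non-zero lost-time term, each stop consumes more platform time, which is what drives the reported reduction in estimated station capacity.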
Abstract:
Background: The 2003 Bureau of Labor Statistics American Time Use Survey (ATUS) contains 438 distinct primary activity variables that can be analyzed with regard to how time is spent by Americans. The Compendium of Physical Activities is used to code physical activities derived from various surveys, logs, diaries, etc., to facilitate comparison of coded intensity levels across studies.
Methods: This paper describes the methods, challenges, and rationale for linking Compendium estimates of physical activity intensity (METs, metabolic equivalents) with all activities reported in the 2003 ATUS.
Results: The assigned ATUS intensity levels are not intended to compute the energy costs of physical activity in individuals. Instead, they are intended to be used to identify time spent in activities broadly classified by type and intensity. This function will complement public health surveillance systems and aid in policy and health-promotion activities. For example, at least one of the future projects of this process is the descriptive epidemiology of time spent in common physical activity intensity categories.
Conclusions: The process of metabolic coding of the ATUS by linking it with the Compendium of Physical Activities can make important contributions to our understanding of Americans' time spent in health-related physical activity.
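Linking a time-use diary to Compendium intensity values is essentially a join on activity codes. A minimal sketch follows, with hypothetical activity labels and MET values standing in for the real ATUS activity codes and Compendium entries.

```python
# Hypothetical MET values keyed by hypothetical activity labels; a real
# linkage joins ATUS activity codes to Compendium MET assignments.
MET_BY_ACTIVITY = {"walking": 3.5, "gardening": 3.8, "watching_tv": 1.0}

def met_minutes(diary):
    """Sum MET-minutes over a day's (activity, minutes) diary records,
    weighting each activity's duration by its assigned intensity."""
    return sum(MET_BY_ACTIVITY[activity] * minutes
               for activity, minutes in diary)
```

Aggregating such MET-minute totals across respondents is the kind of descriptive surveillance use the abstract describes, rather than an individual energy-cost calculation.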
Abstract:
Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance. Capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes while providing for the assessment of performance through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to accurately portray the mechanisms of freeway operations at the specific locations under consideration; to be calibrated using data acquired at those locations; and to have their outputs validated against data acquired at the same sites, so that the outputs would be truly descriptive of the performance of the facility. The form of the models needed to rest on a theoretical basis rather than on empiricism, as is the case for the macroscopic models currently used, and the models needed to be adaptable to variable operating conditions so that they could be applied, where possible, to other similar systems and facilities. It was not possible, in this single study, to produce a stand-alone model applicable to all facilities and locations; however, the scene has been set for the application of the models to a much broader range of operating conditions.
Opportunities for further development of the models were identified, and procedures provided for the calibration and validation of the models over a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations due to variability in road rules and driving cultures. Not all manoeuvres evident were modelled; some unusual manoeuvres were considered unwarranted to model. However, the models developed contain the principal processes of freeway operations: merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this activity. Kerb lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate, which excludes lane changers. Cowan's M3 model was calibrated for both streams; on-ramp and total upstream flow are required as input. Relationships between the proportion of headways greater than 1 s and flow differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections. Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995).
The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows. Pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delays, which reach infinity at capacity. Minor stream delays were shown to be less when unsignalised intersections, rather than signalised intersections, are located upstream of on-ramps, and less still when ramp metering is installed. Smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model. Merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration is required of the traffic inputs, critical gap and minimum follow-on time, for both merging and lane changing. A general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models to assess performance, and to provide further insight into the nature of operations.
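Cowan's M3 headway model used above can be sketched as a mixture: a proportion of free vehicles with shifted-exponential headways above the minimum, and the remainder bunched at exactly the minimum headway. The flow, bunching proportion and 1 s minimum headway below are illustrative parameters, not the calibrated Brisbane values.

```python
import numpy as np

def sample_m3_headways(n, q=0.2, alpha=0.6, delta=1.0, seed=0):
    """Draw n headways from Cowan's M3 model: a fraction (1 - alpha) of
    vehicles are bunched at the minimum headway delta; the free fraction
    alpha has shifted-exponential headways with decay parameter lam
    chosen so that the mean headway equals 1/q (q = flow in veh/s)."""
    lam = alpha * q / (1 - delta * q)   # standard M3 decay parameter
    rng = np.random.default_rng(seed)
    free = rng.random(n) < alpha        # which vehicles are free-moving
    headways = np.full(n, delta)        # everyone is at least delta apart
    headways[free] += rng.exponential(1 / lam, free.sum())
    return headways
```

With q = 0.2 veh/s the sample mean headway converges to 1/q = 5 s, and no headway falls below the 1 s minimum; these sampled headways are the raw material for gap acceptance and merge delay simulation of the kind described above.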
Abstract:
Where airports were once the sole responsibility of their governments, the liberalisation of economies has seen administrative interests in airport spaces divested increasingly towards market-led authority. Extant literature suggests that actions in decision spaces can be described under broad idealised forms of governance. However, in examining a sample of 18 different airports, it is apparent that these classic models are insufficient to appreciate the contextual complexity of each case. Issues of institutional arrangements, privatisation, and management focus are reviewed against existing governance modes to produce a model for informing privatisation decisions, based on the contextual needs of the individual airport and region. Expanding governance modes to include emergent airport arrangements both contributes to the existing literature and provides a framework to assist policy makers, and those charged with the operation of airports, to design effective governance models. In progressing this framework, contributions are made to government decision-makers for the development of new privatisation strategies or the review of existing ones, while the private sector can identify the intent and expectations of privatisation initiatives and so make better-informed decisions.
Abstract:
Purpose: The purpose of this paper is to provide a labour process theory interpretation of four case studies within the Australian construction industry. In each case study a working time intervention (a shift to a five-day working week from the industry standard six days) was implemented as an attempt to improve the work-life balance of employees.
Design/methodology/approach: This paper was based on four case studies with mixed methods. Each case study has a variety of data collection methods, which include questionnaires, short and long interviews, and focus groups.
Findings: It was found that the complex mix of wage- and salary-earning staff within the construction industry, along with labour market pressures, means that changing to a five-day working week is quite a radical notion within the industry. However, there are some organisations willing to explore opportunities for change, with mixed experiences.
Practical implications: The practical implications of this research include understanding the complexity within the Australian construction industry, based around hours of work and pay systems. Decision-makers within the construction industry must recognise a range of competing pressures that mean that "preferred" managerial styles might not be appropriate.
Originality/value: This paper shows that construction firms must take an active approach to reducing the culture of long working hours. This can only be achieved by addressing issues of project timelines and budgets and ensuring that take-home pay is not reliant on long hours of overtime.
Abstract:
This article examined the relationship between time structure and Macan's process model of time management. This study proposed that time structure (the appraisal of effective time usage) would be a more parsimonious mediator than perceived control over time in the relationship between time management behaviours and outcome variables, such as job satisfaction and psychological well-being. Alternative structural models were compared using a sample of 111 university students. Model 1 tested Macan's process model of time management with perceived control over time as the mediator. Model 2 replaced perceived control over time with the construct of time structure. Model 3 examined the possibility of perceived control over time and time structure being parallel mediators of the relationships between time management and outcomes. Results of this study showed that Model 1 and Model 2 fitted the data equally well. However, the mediated effects were small and partial in both models. This pattern of results calls for a reassessment of the process model.