108 results for Performance evolution due time
at Queensland University of Technology - ePrints Archive
Abstract:
Drivers' ability to react to unpredictable events deteriorates when exposed to highly predictable and uneventful driving tasks. Highway design reduces the driving task mainly to a lane-keeping manoeuvre. Such a task is monotonous, providing little stimulation, and this contributes to crashes due to inattention. Research has shown that drivers' hypovigilance can be assessed with EEG measurements and that driving performance is impaired during prolonged monotonous driving tasks. This paper aims to show that two dimensions of monotony, namely road design and roadside variability, decrease vigilance and impair driving performance. This is the first study correlating hypovigilance and driver performance in varied monotonous conditions, particularly on a short time scale (a few seconds). We induced vigilance decrement, as assessed with EEG, during a monotonous driving simulator experiment. Road monotony was varied through both road design and roadside variability. The drivers' decrease in vigilance occurred due to both road design and road scenery monotony, and almost independently of the drivers' sensation seeking levels. Such impairment was also correlated to observable measurements from the driver, the car and the environment. During periods of hypovigilance, the driving performance impairment affected lane positioning, time to lane crossing, blink frequency, heart rate variability and non-specific electrodermal response rates. This work lays the foundation for the development of an in-vehicle device preventing hypovigilance crashes on monotonous roads.
Abstract:
Safety concerns in the operation of autonomous aerial systems require that safe-landing protocols be followed during situations where a mission must be aborted due to mechanical or other failure. On-board cameras provide information that can be used in the determination of potential landing sites, which are continually updated and ranked to prevent injury and minimize damage. Pulse Coupled Neural Networks (PCNNs) have been used for the detection of features in images that assist in the classification of vegetation and can be used to minimize damage to the aerial vehicle. However, a significant drawback in the use of PCNNs is that they are computationally expensive and have been better suited to off-line applications on conventional computing architectures. As heterogeneous computing architectures become more common, an OpenCL implementation of a PCNN feature generator is presented and its performance is compared across OpenCL kernels designed for CPU, GPU and FPGA platforms. This comparison examines the compute times required for network convergence under a variety of images obtained during unmanned aerial vehicle trials to determine the plausibility of real-time feature detection.
Abstract:
Safety concerns in the operation of autonomous aerial systems require that safe-landing protocols be followed during situations where the mission must be aborted due to mechanical or other failure. This article presents a pulse-coupled neural network (PCNN) to assist in vegetation classification in a vision-based landing site detection system for an unmanned aircraft. We propose a heterogeneous computing architecture and an OpenCL implementation of a PCNN feature generator. Its performance is compared across OpenCL kernels designed for CPU, GPU, and FPGA platforms. This comparison examines the compute times required for network convergence under a variety of images to determine the plausibility of real-time feature detection.
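The pulse dynamics behind such a feature generator can be sketched in a few lines. The following is a minimal, illustrative PCNN using the standard feeding/linking/dynamic-threshold equations; it records, for each pixel, the iteration at which it first pulses (bright regions fire first, then an autowave spreads to neighbours). All parameter values are placeholders, not those of the paper's OpenCL implementation:

```python
import numpy as np

def pcnn_ignition_map(S, steps=20, beta=0.2, alpha_f=0.1, alpha_l=1.0,
                      alpha_t=0.3, v_f=0.5, v_l=0.2, v_t=20.0):
    """Run a basic PCNN over a normalised grey image S and return,
    per pixel, the iteration at which it first fired (-1 = never)."""
    F = np.zeros_like(S)              # feeding field
    L = np.zeros_like(S)              # linking field
    Y = np.zeros_like(S)              # pulse output
    T = np.ones_like(S)               # dynamic threshold
    first_fire = np.full(S.shape, -1, dtype=int)

    def neighbour_sum(Y):
        # 8-neighbour coupling via zero-padded slicing (no SciPy needed)
        P = np.pad(Y, 1)
        return (P[:-2, :-2] + P[:-2, 1:-1] + P[:-2, 2:] +
                P[1:-1, :-2] +               P[1:-1, 2:] +
                P[2:, :-2]  + P[2:, 1:-1]  + P[2:, 2:])

    for n in range(steps):
        W = neighbour_sum(Y)
        F = np.exp(-alpha_f) * F + v_f * W + S   # feeding: decays, driven by stimulus
        L = np.exp(-alpha_l) * L + v_l * W       # linking: driven by neighbour pulses
        U = F * (1.0 + beta * L)                 # internal activity
        Y = (U > T).astype(S.dtype)              # pulse when activity beats threshold
        T = np.exp(-alpha_t) * T + v_t * Y       # threshold decays, jumps after a pulse
        newly = (first_fire < 0) & (Y > 0)
        first_fire[newly] = n
    return first_fire
```

The ignition map (first-fire iteration per pixel) is the kind of per-pixel feature a vegetation classifier can consume; the convergence the abstract times corresponds to iterating this loop until all pixels have pulsed.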
Abstract:
With increasing investments being made in brand development by destination marketing organisations (DMOs) since the 1990s, including rebranding and repositioning, more research is necessary to enhance understanding of how to effectively monitor destination brand performance over time. This chapter summarises key findings from a study of the brand performance of a competitive set of destinations, in their most important market, between 2003 and 2012. Brand performance was measured from the perspective of consumer perceptions, based on the concept of consumer-based brand equity (CBBE). The results indicated almost no change in perceptions of the five destinations over the 10-year period. Given the commonality of challenges faced by DMOs worldwide, it is suggested that the CBBE hierarchy provides destination marketers with a practical tool for evaluating brand performance over time, in terms of measures of the effectiveness of past marketing communications as well as indicators of future performance.
Abstract:
Entrepreneurship research and practice places emphasis on company growth as a measure of entrepreneurial success. In many cases, there has been a tendency to give growth a very central role, with some researchers even seeing growth as the very essence of entrepreneurship (Cole, 1949; Sexton, 1997; Stevenson & Gumpert, 1991). A large number of empirical studies of the performance of young and/or small firms use growth as the dependent variable (see reviews by Ardishvili, Cardozo, Harmon, & Vadakath, 1998; Delmar, 1997; Wiklund, 1998). By contrast, the two most prominent views of strategic management – strategic positioning (Porter, 1980) and the resource-based view (Barney, 1991; Wernerfelt, 1984) – are both concerned with achieving competitive advantage and regard achieving economic rents and profitability relative to other competitors as the central measures of firm performance. Strategic entrepreneurship integrates these two perspectives and is simultaneously concerned with opportunity seeking and advantage seeking (Hitt, Ireland, Camp, & Sexton, 2002; Ireland, Hitt, & Sirmon, 2003). Consequently, both company growth and relative profitability are together relevant measures of firm performance in the domain of strategic entrepreneurship.
Abstract:
There has been a paucity of research published in relation to the temporal aspect of destination image change over time. Given increasing investments in destination branding, research is needed to enhance understanding of how to monitor destination brand performance, of which destination image is the core construct, over time. This article reports the results of four studies tracking brand performance of a competitive set of five destinations, between 2003 and 2012. Results indicate minimal changes in perceptions held of the five destinations of interest over the 10 years, supporting the assertion of Gartner (1986) and Gartner and Hunt (1987) that destination image change will only occur slowly over time. While undertaken in Australia, the research approach provides DMOs in other parts of the world with a practical tool for evaluating brand performance over time; in terms of measures of effectiveness of past marketing communications, and indicators of future performance.
Abstract:
The driving task requires sustained attention during prolonged periods, and can be performed in highly predictable or repetitive environments. Such conditions could create hypovigilance and impair performance towards critical events. Identifying such impairment in monotonous conditions has been a major subject of research, but no research to date has attempted to predict it in real-time. This pilot study aims to show that performance decrements due to monotonous tasks can be predicted through mathematical modelling taking into account sensation seeking levels. A short vigilance task sensitive to brief lapses in vigilance, the Sustained Attention to Response Task (SART), is used to assess participants' performance. The framework for prediction developed on this task could be extended to a monotonous driving task. A Hidden Markov Model (HMM) is proposed to predict participants' lapses in alertness. The driver's vigilance evolution is modelled as a hidden state and is correlated to a surrogate measure: the participant's reaction times. This experiment shows that the monotony of the task can lead to an important decline in performance in less than five minutes. This impairment can be predicted four minutes in advance with 86% accuracy using HMMs. This experiment showed that mathematical models such as HMMs can efficiently predict hypovigilance through surrogate measures. The presented model could result in the development of an in-vehicle device that detects driver hypovigilance in advance and warns the driver accordingly, thus offering the potential to enhance road safety and prevent road crashes.
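The prediction machinery described above can be sketched compactly: a two-state HMM (alert vs. hypovigilant) is forward-filtered on discretised reaction times, and the transition matrix is then propagated k steps ahead to forecast the upcoming state. The transition and emission probabilities below are illustrative placeholders, not the values fitted in the study:

```python
import numpy as np

# Hidden states: 0 = alert, 1 = hypovigilant.
# Observations (surrogate measure): discretised reaction times,
# 0 = fast response, 1 = slow response, 2 = lapse/miss.
A = np.array([[0.95, 0.05],          # state transition matrix (placeholder values)
              [0.10, 0.90]])
B = np.array([[0.80, 0.15, 0.05],    # alert: mostly fast reactions
              [0.20, 0.40, 0.40]])   # hypovigilant: slow reactions and lapses
pi = np.array([0.9, 0.1])            # initial state distribution

def filter_and_predict(obs, k=1):
    """Forward-filter P(state | observations so far), then propagate
    the chain k steps ahead to predict the future state distribution."""
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # predict one step, weight by emission
        alpha /= alpha.sum()             # normalise (filtered posterior)
    pred = alpha @ np.linalg.matrix_power(A, k)   # k-step-ahead prediction
    return alpha, pred
```

A run of slow responses and lapses drives the filtered probability of the hypovigilant state up, and the k-step prediction is what an in-vehicle device could threshold to warn the driver in advance.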
Abstract:
The design-build (DB) delivery method has been widely used in the United States due to its reputed superior cost and time performance. However, rigorous studies have produced inconclusive support and only in terms of overall results, with few attempts being made to relate project characteristics with performance levels. This paper provides a larger and more finely grained analysis of a set of 418 DB projects from the online project database of the Design-Build Institute of America (DBIA), in terms of the time-overrun rate (TOR), early start rate (ESR), early completion rate (ECR) and cost overrun rate (COR) associated with project type (e.g., commercial/institutional buildings and civil infrastructure projects), owners (e.g., Department of Defense and private corporations), procurement methods (e.g., ‘best value with discussion’ and qualifications-based selection), contract methods (e.g., lump sum and GMP) and LEED levels (e.g., gold and silver). The results show ‘best value with discussion’ to be the dominant procurement method and lump sum the most frequently used contract method. The DB method provides relatively good time performance, with more than 75% of DB projects completed on time or ahead of schedule. However, with more than 50% of DB projects experiencing cost overruns, the DB advantage of cost saving remains uncertain. ANOVA tests indicate that DB projects under different procurement methods have significantly different time performance and that different owner types and contract methods significantly affect cost performance. In addition to contributing to empirical knowledge concerning the cost and time performance of DB projects with new solid evidence from a large sample size, the findings and practical implications of this study are beneficial to owners in understanding the likely schedule and budget implications of their particular project characteristics.
Abstract:
In fisheries managed using individual transferable quotas (ITQs) it is generally assumed that quota markets are well-functioning, allowing quota to flow on either a temporary or permanent basis to those able to make best use of it. However, despite an increasing number of fisheries being managed under ITQs, empirical assessments of the quota markets that have actually evolved in these fisheries remain scarce. The Queensland Coral Reef Fin-Fish Fishery (CRFFF) on the Great Barrier Reef has been managed under a system of ITQs since 2004. Data on individual quota holdings and trades for the period 2004-2012 were used to assess the CRFFF quota market and its evolution through time. Network analysis was applied to assess market structure and the nature of lease-trading relationships. An assessment of market participants’ abilities to balance their quota accounts, i.e., gap analysis, provided insights into market functionality and how this may have changed in the period observed. Trends in ownership and trade were determined, and market participants were identified as belonging to one of seven generalized types. The emergence of groups such as investors and lease-dependent fishers is clear. In 2011-2012, 41% of coral trout quota was owned by participants that did not fish it, and 64% of total coral trout landings were made by fishers that owned only 10% of the quota. Quota brokers emerged whose influence on the market varied with the bioeconomic conditions of the fishery. Throughout the study period some quota was found to remain inactive, implying potential market inefficiencies. Contribution to this inactivity appeared asymmetrical, with most residing in the hands of smaller quota holders. The importance of transaction costs in the operation of the quota market and the inequalities that may result are discussed in light of these findings.
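The gap analysis and participant typology mentioned above can be illustrated with a simplified sketch. The account-balancing rule below (holdings plus leased-in minus leased-out, compared against landings) and the three-label typology are illustrative reductions; the study itself distinguishes seven generalized participant types, which are not reproduced here:

```python
def quota_gap(holdings, leased_in, leased_out, landings):
    """Signed end-of-season gap in one participant's quota account.
    Positive -> quota left unfished (inactive quota);
    negative -> landings exceeded the quota the participant could access."""
    available = holdings + leased_in - leased_out
    return available - landings

def participant_type(holdings, landings):
    """Very coarse typology, an illustrative reduction of the study's
    seven generalized types."""
    if holdings > 0 and landings == 0:
        return "investor"                 # owns quota but does not fish it
    if landings > 0 and holdings == 0:
        return "lease-dependent fisher"   # fishes entirely on leased quota
    return "owner-operator"
```

Applied across all accounts per season, the sum of positive gaps is the inactive quota the abstract flags as a potential inefficiency, and the typology shares track shifts such as the rise of investors and lease-dependent fishers.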
Abstract:
This paper presents the benefits and issues related to travel time prediction on urban networks. Travel time information quantifies congestion and is perhaps the most important network performance measure. Travel time prediction has been an active area of research for the last five decades, and activities related to ITS have increased researchers' attention to better and more accurate real-time prediction of travel time. The majority of the literature on travel time prediction is applicable to freeways where, under non-incident conditions, traffic flow is not affected by external factors such as traffic control signals and opposing traffic flows. In urban environments the problem is more complicated, due to conflict areas (intersections), mid-link sources and sinks, etc., and needs to be addressed.
Abstract:
Widespread adoption by electricity utilities of Non-Conventional Instrument Transformers, such as optical or capacitive transducers, has been limited due to the lack of a standardised interface and multi-vendor interoperability. Low power analogue interfaces are being replaced by IEC 61850-9-2 and IEC 61869-9 digital interfaces that use Ethernet networks for communication. These ‘process bus’ connections achieve significant cost savings by simplifying connections between the switchyard and control rooms; however, the in-service performance when these standards are employed is largely unknown. The performance of real-time Ethernet networks and time synchronisation was assessed using a scale model of a substation automation system. The test bed was constructed from commercially available timing and protection equipment supplied by a range of vendors. Test protocols were developed to thoroughly evaluate the performance of Ethernet networks and network-based time synchronisation. The suitability of IEEE Std 1588 Precision Time Protocol (PTP) as a synchronising system for sampled values was tested in the steady state and under transient conditions. Similarly, the performance of hardened Ethernet switches designed for substation use was assessed under a range of network operating conditions. This paper presents test methods that use a precision Ethernet capture card to accurately measure PTP and network performance. These methods can be used for product selection and to assess ongoing system performance as substations age. Key findings on the behaviour of multi-function process bus networks are presented. System-level tests were performed using a Real Time Digital Simulator and a transformer protection relay with sampled value and Generic Object Oriented Substation Event (GOOSE) capability. These include the interactions between sampled values, PTP and GOOSE messages.
Our research has demonstrated that several protocols can be used on a shared process bus, even with very high network loads. This should provide confidence that this technology is suitable for transmission substations.
Abstract:
Purpose: Multifocal contact lenses (MCLs) have been available for decades. A review of the literature suggests that while, historically, these lenses have been partially successful, they have struggled to compete with monovision (MV). More recent publications suggest that there has been an improvement in the performance of these lenses. This study set out to investigate whether the apparent improved lens performance reported in the literature is reflected in clinical practice. Methods: Data collected over the last five years via the International Contact Lens Prescribing Survey Consortium were reviewed for patients over the age of 45 years. The published reports of clinical trials were reviewed to assess lens performance over the time period. Results: The data review covered 16,680 presbyopic lens fits in 38 countries, of which 29% were fitted with MCLs, 8% MV and 63% single vision (SV). A previous survey conducted in Australia during 1988-89 reported that 9% of presbyopes were fitted with MCLs, 29% MV and 63% SV. The results from our survey for Australia alone were 28% MCLs (MV 13%) vs 9% (MV 29%), suggesting an increase in the usage of MCLs from 1988-89 to 2010. A review of the literature indicates that the reported level of visual acuity with MCLs in comparison to MV has remained equivalent over this time period, yet preference has switched from MV to MCLs. Conclusions: There is evidence that currently more MCLs than MV are being fitted to presbyopes, compared to 1988-89. This increased use is likely due to the improved visual performance of these lenses, which is not demonstrated with acuity measures but is reported by wearers, suggesting that patient-based subjective ratings are currently the best way to measure visual performance.
Abstract:
Nuclei and electrons in condensed matter and/or molecules are usually entangled, due to the prevailing (mainly electromagnetic) interactions. However, the "environment" of a microscopic scattering system (e.g. a proton) causes ultrafast decoherence, thus making atomic and/or nuclear entanglement effects not directly accessible to experiments. Yet our neutron Compton scattering experiments from protons (H-atoms) in condensed systems and molecules have a characteristic collisional time of about 100-1000 attoseconds. The quantum dynamics of an atom in this ultrashort, but finite, time window is governed by non-unitary time evolution due to the aforementioned decoherence. Unexpectedly, recent theoretical investigations have shown that decoherence can also have the following energetic consequences. Disentangling two subsystems A and B of a quantum system AB is tantamount to erasure of quantum phase relations between A and B. This erasure is widely believed to be an innocuous process, which e.g. does not affect the energies of A and B. However, two independent groups proved recently that disentangling two systems, within a sufficiently short time interval, causes an increase of their energies. This is also derivable from the simplest Lindblad-type master equation for one particle subject to pure decoherence. Our neutron-proton scattering experiments with H2 molecules provide for the first time experimental evidence of this effect. Our results reveal that the neutron-proton collision, leading to the cleavage of the H-H bond on the attosecond timescale, is accompanied by a larger energy transfer (by about 2-3%) than conventional theory predicts. Preliminary results from current investigations show qualitatively the same effect in neutron-deuteron Compton scattering from D2 molecules.
We interpret the experimental findings by treating the neutron-proton (or neutron-deuteron) collisional system as an entangled open quantum system subject to fast decoherence caused by its "environment" (i.e., the two electrons plus the second nucleus of H2 or D2). The presented results seem to be of generic nature, and may have considerable consequences for various processes in condensed matter and molecules, e.g. in elementary chemical reactions.
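The "simplest Lindblad-type master equation for one particle subject to pure decoherence" referred to above can be written down explicitly. The following is a generic position-localisation form for a free particle (the rate constant $\Lambda$ and the free Hamiltonian are generic textbook choices, not fitted to the neutron data):

```latex
\frac{\partial \rho}{\partial t}
  = -\frac{i}{\hbar}\left[\frac{\hat p^{2}}{2m},\,\rho\right]
    - \Lambda\,\bigl[\hat x,[\hat x,\rho]\bigr]
```

Taking the trace of $\hat p^{2}$ against the dissipator, and using $[\hat x,[\hat x,\hat p^{2}]] = -2\hbar^{2}$, gives

```latex
\frac{d\langle E\rangle}{dt}
  = \frac{1}{2m}\,\frac{d\langle \hat p^{2}\rangle}{dt}
  = \frac{\Lambda \hbar^{2}}{m} > 0,
```

i.e. pure decoherence alone raises the particle's mean kinetic energy, which is the energetic effect the experiments probe.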
Abstract:
Magnetic behavior of soils can seriously hamper the performance of geophysical sensors. Currently, we have little understanding of the types of minerals responsible for the magnetic behavior, as well as their distribution in space and evolution through time. This study investigated the magnetic characteristics and mineralogy of Fe-rich soils developed on basaltic substrate in Hawaii. We measured the spatial distribution of magnetic susceptibility (χlf) and frequency dependence (χfd%) across three test areas in a well-developed eroded soil on Kaho'olawe and in two young soils on the Big Island of Hawaii. X-ray diffraction, X-ray fluorescence spectroscopy (XRF), chemical dissolution, thermal analysis, and temperature-dependent magnetic studies were used to characterize soil development and mineralogy for samples from soil pits on Kaho'olawe, surface samples from all three test areas, and unweathered basalt from the Big Island of Hawaii. The measurements show a general increase in magnetic properties with increasing soil development. The XRF Fe data ranged from 13% for fresh basalt and young soils on the Big Island to 58% for material from the B horizon of Kaho'olawe soils. Dithionite-extractable and oxalate-extractable Fe percentages increase with soil development and correlate with χlf and χfd%, respectively. Results from the temperature-dependent susceptibility measurements show that the high soil magnetic properties observed in geophysical surveys on Kaho'olawe are entirely due to neoformed minerals. The results of our studies have implications for the existing soil survey of Kaho'olawe and help identify methods to characterize magnetic minerals in tropical soils.
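The χfd% quantity used above is not defined in the abstract; the conventional definition in environmental magnetism is the percentage drop in mass-specific susceptibility between low-frequency and high-frequency measurement, which is sensitive to fine (superparamagnetic) neoformed grains:

```python
def chi_fd_percent(chi_lf, chi_hf):
    """Frequency dependence of magnetic susceptibility (conventional
    definition, assumed here rather than quoted from the paper):
    percentage loss of low-frequency susceptibility chi_lf when
    re-measured at high frequency (chi_hf)."""
    if chi_lf == 0:
        raise ValueError("chi_lf must be non-zero")
    return 100.0 * (chi_lf - chi_hf) / chi_lf
```

High χfd% values flag pedogenic (neoformed) fine-grained magnetic minerals, consistent with the paper's finding that soil development drives the magnetic signal.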
Abstract:
This study uses the reverse salient methodology to contrast subsystems in video game consoles in order to discover, characterize, and forecast the most significant technology gap. We build on the current methodologies (Performance Gap and Time Gap) for measuring the magnitude of Reverse Salience, by showing the effectiveness of Performance Gap Ratio (PGR). The three subject subsystems in this analysis are the CPU Score, GPU core frequency, and video memory bandwidth. CPU Score is a metric developed for this project, which is the product of the core frequency, number of parallel cores, and instruction size. We measure the Performance Gap of each subsystem against concurrently available PC hardware on the market. Using PGR, we normalize the evolution of these technologies for comparative analysis. The results indicate that while CPU performance has historically been the Reverse Salient, video memory bandwidth has taken over as the quickest growing technology gap in the current generation. Finally, we create a technology forecasting model that shows how much the video RAM bandwidth gap will grow through 2019 should the current trend continue. This analysis can assist console developers in assigning resources to the next generation of platforms, which will ultimately result in longer hardware life cycles.
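The metrics above are simple enough to state directly. The CPU Score product is as defined in the abstract; the exact normalisation used for PGR is not spelled out there, so the gap-over-benchmark form below is an assumption for illustration:

```python
def cpu_score(core_freq_ghz, n_cores, instruction_bits):
    """CPU Score as defined in the study: the product of core frequency,
    number of parallel cores, and instruction size."""
    return core_freq_ghz * n_cores * instruction_bits

def performance_gap(console, pc):
    """Absolute Performance Gap of a console subsystem against the
    concurrently available PC hardware benchmark."""
    return pc - console

def performance_gap_ratio(console, pc):
    """PGR: the gap normalised by the PC benchmark so that subsystems
    with different units (CPU Score, GPU clock, memory bandwidth) can be
    compared on one scale. This normalisation form is an assumption,
    not quoted from the paper."""
    return (pc - console) / pc
```

With all three subsystems expressed as ratios, the Reverse Salient at any point in time is simply the subsystem with the largest PGR, and forecasting amounts to extrapolating each ratio's trend.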