976 results for Handling time
Abstract:
BACKGROUND: Hot and cold temperatures have been associated with childhood asthma. However, the relationship between daily temperature variation and childhood asthma is not well understood. This study aimed to examine the relationship between diurnal temperature range (DTR) and childhood asthma. METHODS: A Poisson generalized linear model combined with a distributed lag non-linear model was used to examine the relationship between DTR and emergency department admissions for childhood asthma in Brisbane, from January 1st 2003 to December 31st 2009. RESULTS: There was a statistically significant relationship between DTR and childhood asthma. The DTR effect on childhood asthma increased above a DTR of 10°C. The effect of DTR on childhood asthma was greatest for lags of 0–9 days, with a 31% (95% confidence interval: 11%–58%) increase in emergency department admissions per 5°C increment of DTR. Male children and children aged 5–9 years appeared to be more vulnerable to the DTR effect than others. CONCLUSIONS: A large DTR may trigger childhood asthma. Future measures to control and prevent childhood asthma should take temperature variability into account. More protective measures should be taken after a day with a DTR above 10°C.
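As a rough illustration of the modelling idea (not the authors' actual model code), the sketch below builds a distributed-lag design matrix over lags 0–9 days and converts hypothetical per-lag coefficients into the cumulative relative risk per 5°C DTR increment; the DTR series and coefficients are invented:

```python
import numpy as np

def lag_matrix(x, max_lag):
    """Distributed-lag design matrix: column k holds x lagged by k days.
    Rows without a complete lag history are dropped."""
    n = len(x)
    cols = [x[max_lag - k : n - k] for k in range(max_lag + 1)]
    return np.column_stack(cols)

# Hypothetical daily DTR series (degrees C)
rng = np.random.default_rng(0)
dtr = rng.uniform(4, 16, size=200)
X = lag_matrix(dtr, max_lag=9)            # lags 0..9, as in the abstract

# Toy per-degree, per-lag coefficients chosen so the cumulative effect of a
# 5 degree C increment over lags 0-9 matches the reported 31% increase
beta = np.full(10, np.log(1.31) / 5 / 10)
rr_5c = np.exp(5 * beta.sum())            # cumulative relative risk
```

In the real study the lag structure and the non-linear DTR response would be estimated jointly from the admission counts; here both are fixed by hand purely to show how per-lag coefficients aggregate into a cumulative relative risk.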
Abstract:
Topic recommendation can help users deal with information overload in micro-blogging communities. This paper proposes to use the implicit information network formed by the multiple relationships among users, topics, and micro-blogs, together with the temporal information of micro-blogs, to find semantically and temporally relevant topics for each topic, and to profile users' time-drifting topic interests. Content-based, nearest-neighborhood-based, and matrix factorization models are used to make personalized recommendations. The effectiveness of the proposed approaches is demonstrated in experiments conducted on a real-world dataset collected from Twitter.com.
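A minimal sketch of the matrix factorization component, using an invented toy user-topic interest matrix (the paper's actual models, features, and dataset are not reproduced here):

```python
import numpy as np

def mf_sgd(R, mask, k=2, steps=300, lr=0.05, reg=0.01, seed=0):
    """Factorise a user-topic interest matrix R (observed where mask is True)
    into user factors U and topic factors V by stochastic gradient descent."""
    rng = np.random.default_rng(seed)
    n_users, n_topics = R.shape
    U = 0.1 * rng.standard_normal((n_users, k))
    V = 0.1 * rng.standard_normal((n_topics, k))
    rows, cols = np.where(mask)
    for _ in range(steps):
        for u, t in zip(rows, cols):
            err = R[u, t] - U[u] @ V[t]
            U[u] += lr * (err * V[t] - reg * U[u])
            V[t] += lr * (err * U[u] - reg * V[t])
    return U, V

# Toy observations: 1.0 = user engaged with the topic; some entries unobserved
R = np.array([[1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
mask = np.array([[True, True, True],
                 [True, True, False],
                 [False, True, True]])
U, V = mf_sgd(R, mask)
scores = U @ V.T   # predicted interest, used to rank unseen topics per user
```

The unobserved entries of `scores` are the candidate recommendations; the paper additionally exploits temporal drift, which this static sketch omits.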
Abstract:
Goethite and Al-substituted goethite were synthesized from the reaction between ferric nitrate and/or aluminum nitrate and potassium hydroxide. XRF, XRD, and TEM with EDS were used to characterize the chemical composition, phase and lattice parameters, and morphology of the synthesized products. The results show that d(020) decreases from 4.953 to 4.949 Å and the b dimension decreases from 9.951 Å to 9.906 Å as the aging time increases from 6 days to 42 days for 9.09 mol% Al-substituted goethite. A sample with 9.09 mol% Al substitution was also prepared by a rapid co-precipitation method. In this sample, the TEM images and EDS show 13.45 mol%, 12.31 mol%, and 5.85 mol% Al substitution with crystal sizes of 163, 131, and 45 nm, respectively. The crystal size of goethite is positively related to the degree of Al substitution according to the TEM and EDS results. Thus, this methodology proves effective in distinguishing the morphology of goethite and Al-substituted goethite.
Abstract:
Modern mobile computing devices are versatile but bring the burden of constantly adjusting settings to the current conditions of the environment. Until now, this task has had to be accomplished by the human user; however, the variety of sensors usually deployed in such a handset provides enough data for autonomous self-configuration by a learning, adaptive system. This data, though, is not fully available at certain points in time, or can contain false values. Handling potentially incomplete sensor data to detect context changes without a semantic layer represents a scientific challenge, which we address with our approach. A novel machine learning technique is presented - the Missing-Values-SOM - which solves this problem by predicting setting adjustments based on context information. Our method is centered around a self-organizing map, extending it to provide a means of handling missing values. We demonstrate the performance of our approach on mobile context snapshots, as well as on classical machine learning datasets.
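One common way to make an SOM tolerate incomplete inputs is to restrict the distance computation to the observed dimensions when searching for the best-matching unit. The sketch below illustrates that idea with invented map weights; it is a simplification, not the published Missing-Values-SOM:

```python
import numpy as np

def bmu_with_missing(som, x):
    """Best-matching unit of a SOM for an input that may contain NaNs:
    distances are computed only over the observed dimensions."""
    obs = ~np.isnan(x)
    if not obs.any():
        raise ValueError("input has no observed dimensions")
    diff = som[:, :, obs] - x[obs]          # compare observed dims only
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    return np.unravel_index(np.argmin(dist), dist.shape)

# Toy 4x4 map over a 3-dimensional context space
rng = np.random.default_rng(1)
som = rng.random((4, 4, 3))
x = np.array([0.2, np.nan, 0.9])            # one sensor reading missing
i, j = bmu_with_missing(som, x)
```

Once the winning unit is found, its stored value for the missing dimension can serve as the prediction, which is the intuition behind using an SOM for imputation and setting prediction.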
Abstract:
Current diagnostic methods for assessing the severity of articular cartilage degenerative conditions, such as osteoarthritis, are inadequate. There is also a lack of techniques that can be used for real-time evaluation of the tissue during surgery to inform treatment decisions and eliminate subjectivity. This book, derived from Dr Afara’s doctoral research, presents a scientific framework based on near infrared (NIR) spectroscopy for facilitating the non-destructive evaluation of articular cartilage health relative to its structural, functional, and mechanical properties. This development is a component of the ongoing research on advanced endoscopic diagnostic techniques in the Articular Cartilage Biomechanics Research Laboratory of Professor Adekunle Oloyede at Queensland University of Technology (QUT), Brisbane, Australia.
Abstract:
The GameFlow model strives to be a general model of player enjoyment, applicable to all game genres and platforms. Derived from a general set of heuristics for creating enjoyable player experiences, the GameFlow model has been widely used in evaluating many types of games, as well as non-game applications. However, we recognize that more specific, low-level, and implementable criteria are potentially more useful for designing and evaluating video games. Consequently, the research reported in this paper aims to provide detailed heuristics for designing and evaluating one specific game genre, real-time strategy games. In order to develop these heuristics, we conducted a grounded theoretical analysis on a set of professional game reviews and structured the resulting heuristics using the GameFlow model. The resulting 165 heuristics for designing and evaluating real-time strategy games are presented and discussed in this paper.
Abstract:
Internet services are an important part of daily activities for most of us. These services come with sophisticated authentication requirements that may not be manageable for average Internet users. The management of secure passwords, for example, creates an extra overhead which is often neglected for usability reasons. Furthermore, password-based approaches apply only to initial logins and do not protect against unlocked-workstation attacks. In this paper, we provide a non-intrusive identity verification scheme based on behavioral biometrics, where keystroke dynamics on free text is used continuously to verify the identity of a user in real time. We improve existing keystroke-dynamics-based verification schemes in four respects. First, we improve scalability by using a constant number of users, instead of the whole user space, to verify the identity of the target user. Second, we provide an adaptive user model which enables our solution to take changes in user behavior into account in the verification decision. Third, we identify a new distance measure which enables us to verify the identity of a user with shorter text. Fourth, we decrease the number of false results. Our solution is evaluated on a data set collected from users while they were interacting with their mailboxes during their daily activities.
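Free-text keystroke verification typically compares timing profiles of key-pair (digraph) latencies. The sketch below builds such profiles and compares them with an assumed mean-absolute-difference distance over shared digraphs; the paper's actual distance measure is not specified here, and all data is invented:

```python
import numpy as np

def digraph_profile(events):
    """events: list of (digraph, latency_ms) pairs from typed free text.
    Returns the mean latency per digraph."""
    prof = {}
    for dg, ms in events:
        prof.setdefault(dg, []).append(ms)
    return {dg: float(np.mean(v)) for dg, v in prof.items()}

def profile_distance(p, q):
    """Mean absolute latency difference over digraphs shared by both profiles.
    A real system would also require a minimum number of shared digraphs
    before issuing a verification decision."""
    shared = p.keys() & q.keys()
    if not shared:
        return float("inf")
    return sum(abs(p[d] - q[d]) for d in shared) / len(shared)

enrol = digraph_profile([("th", 110), ("he", 95), ("th", 120), ("in", 130)])
probe = digraph_profile([("th", 118), ("he", 99), ("er", 150)])
d = profile_distance(enrol, probe)   # small distance suggests the same typist
```

Continuous verification would recompute `probe` over a sliding window of recent keystrokes and compare it against the enrolled profile as the user types.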
Abstract:
This paper presents a methodology for real-time estimation of exit movement-specific average travel time on urban routes by integrating real-time cumulative plots, probe vehicles, and historic cumulative plots. Two approaches, component based and extreme based, are discussed for route travel time estimation. The methodology is tested with simulation and is validated with real data from Lucerne, Switzerland, that demonstrate its potential for accurate estimation. Both approaches provide similar results. The component-based approach is more reliable, with a greater chance of obtaining a probe vehicle in each interval, although additional data from each component is required. The extreme-based approach is simple and requires only data from upstream and downstream of the route, but the chances of obtaining a probe that traverses the entire route might be low. The performance of the methodology is also compared with a probe-only method. The proposed methodology requires only a few probes for accurate estimation; the probe-only method requires significantly more probes.
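The extreme-based approach can be pictured as reading average travel time off the horizontal gap between the upstream and downstream cumulative vehicle-count curves. The sketch below illustrates that reading with invented count data (function names and numbers are not from the paper):

```python
import numpy as np

def travel_time_from_cumulatives(t_up, n_up, t_down, n_down):
    """For each cumulative count level, find when that level was reached
    upstream (entry) and downstream (exit); the average horizontal gap
    between the two curves is the average travel time."""
    levels = np.arange(1, min(n_up[-1], n_down[-1]) + 1)
    t_enter = np.interp(levels, n_up, t_up)
    t_exit = np.interp(levels, n_down, t_down)
    return float(np.mean(t_exit - t_enter))

# Toy data: (time in s, cumulative count) at route entry and exit;
# the exit curve is the entry curve shifted right by a 5 s travel time
t_up = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
n_up = np.array([0, 2, 4, 6, 8])
t_down = t_up + 5.0
n_down = n_up.copy()
tt = travel_time_from_cumulatives(t_up, n_up, t_down, n_down)
```

This picture assumes first-in-first-out traffic and no counting drift between detectors, which is why the paper anchors the relative cumulative plots with probe vehicles and historic data.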
Abstract:
This article proposes an approach for real-time monitoring of risks in executable business process models. The approach considers risks in all phases of the business process management lifecycle, from process design, where risks are defined on top of process models, through to process diagnosis, where risks are detected during process execution. The approach has been realized via a distributed, sensor-based architecture. At design time, sensors are defined to specify risk conditions which, when fulfilled, indicate that negative process states (faults) are likely to eventuate. Both historical and current process execution data can be used to compose such conditions. At run time, each sensor independently notifies a sensor manager when a risk is detected. In turn, the sensor manager interacts with the monitoring component of a business process management system to present the results to process administrators, who may take remedial actions. The proposed architecture has been implemented on top of the YAWL system and evaluated through performance measurements and usability tests with students. The results show that risk conditions can be computed efficiently and that the approach is perceived as useful by the participants in the tests.
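The sensor-manager interaction can be sketched as a simple observer pattern: each sensor evaluates its own risk condition over execution data and notifies a shared manager, which forwards the alert to the monitoring front end. All names, the condition, and the threshold below are hypothetical, not the YAWL implementation:

```python
class SensorManager:
    """Collects notifications from independent risk sensors and forwards
    them to a monitoring callback."""
    def __init__(self, notify):
        self.notify = notify

    def risk_detected(self, sensor_name, likelihood):
        self.notify(f"risk '{sensor_name}' detected (likelihood {likelihood:.2f})")

class RiskSensor:
    """Evaluates one risk condition over process execution data and
    notifies the manager when the condition holds."""
    def __init__(self, name, condition, manager):
        self.name, self.condition, self.manager = name, condition, manager

    def evaluate(self, execution_data):
        likelihood = self.condition(execution_data)
        if likelihood > 0.5:                 # hypothetical alert threshold
            self.manager.risk_detected(self.name, likelihood)

alerts = []
manager = SensorManager(alerts.append)
overdue = RiskSensor("order overdue",
                     lambda d: min(1.0, d["elapsed_h"] / d["deadline_h"]),
                     manager)
overdue.evaluate({"elapsed_h": 20, "deadline_h": 24})   # likely fault: alert
```

Because sensors are independent objects holding only a reference to the manager, they can be distributed and evaluated on separate schedules, which mirrors the distributed architecture the article describes.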
Abstract:
Theoretical foundations of higher order spectral analysis are revisited to examine the use of time-varying bicoherence on non-stationary signals using a classical short-time Fourier approach. A methodology is developed to apply this to evoked EEG responses where a stimulus-locked time reference is available. Short-time windowed ensembles of the response at the same offset from the reference are considered as ergodic cyclostationary processes within a non-stationary random process. Bicoherence can be estimated reliably, with known levels at which it is significantly different from zero, and can be tracked as a function of offset from the stimulus. When this methodology is applied to multi-channel EEG, it is possible to obtain information about phase synchronization at different regions of the brain as the neural response develops. The methodology is applied to analyze evoked EEG responses to flash visual stimuli to the left and right eye separately. The EEG electrode array is segmented based on bicoherence evolution with time using the mean absolute difference as a measure of dissimilarity. Segment maps confirm the importance of the occipital region in visual processing and demonstrate a link between the frontal and occipital regions during the response. Maps are constructed using bicoherence at bifrequencies that include the alpha band frequency of 8 Hz as well as 4 and 20 Hz. Differences are observed between responses from the left eye and the right eye, and also between subjects. The methodology shows potential as a neurological functional imaging technique that can be further developed for diagnosis and monitoring using scalp EEG, which is less invasive and less expensive than magnetic resonance imaging.
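A minimal ensemble bicoherence estimator in the standard normalised-bispectrum form, demonstrated on synthetic quadratically phase-coupled trials. This illustrates only the core estimator, not the paper's full short-time, stimulus-locked methodology; frequencies are given as FFT bin indices and all data is synthetic:

```python
import numpy as np

def bicoherence(trials, f1, f2):
    """Bicoherence at bifrequency (f1, f2), in FFT bins, from an ensemble of
    equal-length trials (e.g. stimulus-locked epochs): the normalised
    magnitude of the ensemble-averaged triple product X(f1)X(f2)X*(f1+f2)."""
    X = np.fft.rfft(trials * np.hanning(trials.shape[1]), axis=1)
    triple = X[:, f1] * X[:, f2] * np.conj(X[:, f1 + f2])
    num = np.abs(triple.mean()) ** 2
    den = ((np.abs(X[:, f1] * X[:, f2]) ** 2).mean()
           * (np.abs(X[:, f1 + f2]) ** 2).mean())
    return num / den

# Synthetic ensemble: components at bins 8, 12, and 20; when the phase of the
# 20-bin component equals the sum of the other two phases (quadratic phase
# coupling), bicoherence at (8, 12) is high; with random phases it is low.
rng = np.random.default_rng(2)
n_trials, n = 200, 256
t = np.arange(n)

def make_trials(coupled):
    p1 = rng.uniform(0, 2 * np.pi, n_trials)
    p2 = rng.uniform(0, 2 * np.pi, n_trials)
    p3 = p1 + p2 if coupled else rng.uniform(0, 2 * np.pi, n_trials)
    return (np.cos(2 * np.pi * 8 * t[None] / n + p1[:, None])
            + np.cos(2 * np.pi * 12 * t[None] / n + p2[:, None])
            + np.cos(2 * np.pi * 20 * t[None] / n + p3[:, None]))

b_coupled = bicoherence(make_trials(True), 8, 12)
b_random = bicoherence(make_trials(False), 8, 12)
```

The stimulus-locked scheme in the paper applies such an estimator to windowed ensembles at each offset from the stimulus, so bicoherence can be tracked as the response develops.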
Abstract:
An optical system which performs the multiplication of binary numbers is described and proof-of-principle experiments are performed. The simultaneous generation of all partial products, optical regrouping of bit products, and optical carry look-ahead addition are novel features of the proposed scheme, which takes advantage of the parallel operations capability of optical computers. The proposed processor uses liquid crystal light valves (LCLVs). By space-sharing the LCLVs, one such system could function as an array of multipliers. Together with the optical carry look-ahead adders described, this would constitute an optical matrix-vector multiplier.
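The carry look-ahead principle behind the optical adders can be illustrated in software: generate and propagate signals are formed for every bit pair at once, and each carry then follows from them and the input carry rather than rippling bit by bit. This is a plain software sketch of the logic, not the optical implementation (and the loop below stands in for logic that the hardware evaluates in parallel):

```python
def cla_add(a_bits, b_bits):
    """Carry look-ahead addition of two equal-length binary numbers,
    bits given LSB-first. Returns sum bits plus the carry-out bit."""
    n = len(a_bits)
    g = [a & b for a, b in zip(a_bits, b_bits)]   # generate signals
    p = [a | b for a, b in zip(a_bits, b_bits)]   # propagate signals
    carries = [0]
    for i in range(n):                            # c[i+1] = g[i] | (p[i] & c[i])
        carries.append(g[i] | (p[i] & carries[i]))
    s = [a ^ b ^ c for a, b, c in zip(a_bits, b_bits, carries)]
    return s + [carries[n]]

# 6 (110) + 7 (111), LSB first -> 13 (1101)
result = cla_add([0, 1, 1], [1, 1, 1])
```

In a multiplier, all partial products are generated simultaneously and regrouped, and a final carry look-ahead addition of this kind produces the product, which is exactly the decomposition the proposed optical scheme parallelizes.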
Abstract:
Background: When experiencing sleep problems for the first time, consumers may often approach community pharmacists for advice as they are easily accessible health care professionals in the community. In Australian community pharmacies there are no specific tools available for use by pharmacists to assist with the assessment and handling of consumers with sleep enquiries. Objective: To assess the feasibility of improving the detection of sleep disorders within the community through the pilot of a newly developed Community Pharmacy Sleep Assessment Tool (COP-SAT). Method: The COP-SAT was designed to incorporate elements from a number of existing, standardized, and validated clinical screening measures. The COP-SAT was trialed in four Australian community pharmacies over a 4-week period. Key findings: A total of 241 community pharmacy consumers were assessed using the COP-SAT. A total of 74 (30.7%) were assessed as being at risk of insomnia, 26 (10.7%) were at risk of daytime sleepiness, 19 (7.9%) were at risk of obstructive sleep apnea, and 121 (50.2%) were regular snorers. A total of 116 (48.1%) participants indicated that they consume caffeine before bedtime, of which 55 (47%) had associated symptoms of sleep onset insomnia. Moreover, 85 (35%) consumed alcohol before bedtime, of which 50 (58%) experienced fragmented sleep, 50 (58%) were regular snorers, and nine (10.6%) had apnea symptoms. The COP-SAT was feasible in the community pharmacy setting. The prevalence of sleep disorders in the sampled population was high, but generally consistent with previous studies on the general population. Conclusion: A large proportion of participants reported sleep disorder symptoms, and a link was found between the consumption of alcohol and caffeine substances at bedtime and associated symptoms. 
While larger studies are needed to assess the clinical properties of the tool, the results of this feasibility study have demonstrated that the COP-SAT may be a practical tool for the identification of patients at risk of developing sleep disorders in the community.
Abstract:
We found that scientists in Australia spent more than five centuries' worth of time preparing research-grant proposals for consideration by the largest funding scheme of 2012. Because just 20.5% of these applications were successful, the equivalent of some four centuries of effort returned no immediate benefit to researchers and wasted valuable research time. The system needs reforming and alternative funding processes should be investigated...
Abstract:
Widespread adoption by electricity utilities of Non-Conventional Instrument Transformers, such as optical or capacitive transducers, has been limited due to the lack of a standardised interface and multi-vendor interoperability. Low power analogue interfaces are being replaced by IEC 61850-9-2 and IEC 61869-9 digital interfaces that use Ethernet networks for communication. These ‘process bus’ connections achieve significant cost savings by simplifying connections between switchyard and control rooms; however, the in-service performance when these standards are employed is largely unknown. The performance of real-time Ethernet networks and time synchronisation was assessed using a scale model of a substation automation system. The test bed was constructed from commercially available timing and protection equipment supplied by a range of vendors. Test protocols have been developed to thoroughly evaluate the performance of Ethernet networks and network-based time synchronisation. The suitability of IEEE Std 1588 Precision Time Protocol (PTP) as a synchronising system for sampled values was tested in the steady state and under transient conditions. Similarly, the performance of hardened Ethernet switches designed for substation use was assessed under a range of network operating conditions. This paper presents test methods that use a precision Ethernet capture card to accurately measure PTP and network performance. These methods can be used for product selection and to assess ongoing system performance as substations age. Key findings on the behaviour of multi-function process bus networks are presented. System level tests were performed using a Real Time Digital Simulator and a transformer protection relay with sampled value and Generic Object Oriented Substation Events (GOOSE) capability. These include the interactions between sampled values, PTP and GOOSE messages.
Our research has demonstrated that several protocols can be used on a shared process bus, even with very high network loads. This should provide confidence that this technology is suitable for transmission substations.
Abstract:
Bus travel time estimation and prediction are two important modelling approaches that could assist transit users in using, and transit providers in managing, the public transport network. Bus travel time estimation could assist transit operators in understanding and improving the reliability of their systems and attracting more public transport users. Bus travel time prediction, on the other hand, is an important component of a traveller information system which could reduce anxiety and stress for travellers. This paper provides insight into the characteristics of buses in traffic and the factors that influence bus travel time. A critical overview of the state of the art in bus travel time estimation and prediction is provided, and the needs for research in this important area are highlighted. The possibility of using Vehicle Identification Data (VID) for studying the relationship between bus and car travel times is also explored.