770 results for time inconsistency
Abstract:
The rapid development of the World Wide Web has created a massive volume of information, leading to the problem of information overload. Under this circumstance, personalization techniques have been developed to help users find content that meets their personal interests or needs amid this ever-increasing information. User profiling techniques play the core role in this research. Traditionally, most user profiling techniques create user representations in a static way. However, in real-world applications user interests change over time. In this research we develop algorithms for mining user interests by integrating time-decay mechanisms into topic-based user interest profiling. Time-forgetting functions are integrated into the calculation of topic interest measurements at an in-depth level. The experimental study shows that accounting for the temporal effects of user interests by integrating time-forgetting mechanisms yields better recommendation performance.
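The abstract does not specify the decay function used; as a minimal sketch, the snippet below applies an exponential forgetting function to a user's topic-level interaction history. The half-life value and the `interactions` structure are hypothetical illustrations, not the paper's parameters.

```python
import math
import time

HALF_LIFE_DAYS = 30.0  # hypothetical half-life for interest decay
LAMBDA = math.log(2) / (HALF_LIFE_DAYS * 86400)  # decay rate per second

def decayed_topic_interest(interactions, now=None):
    """Compute time-decayed interest scores per topic.

    `interactions` is a list of (topic, weight, timestamp) tuples, where
    `timestamp` is seconds since the epoch and `weight` is the raw
    interest signal (e.g. a click or rating).
    """
    now = now or time.time()
    scores = {}
    for topic, weight, ts in interactions:
        age = max(0.0, now - ts)
        # exponential forgetting: older interactions contribute less
        scores[topic] = scores.get(topic, 0.0) + weight * math.exp(-LAMBDA * age)
    return scores

# Example: two old clicks on "sports", one recent click on "finance"
day = 86400
now = time.time()
history = [("sports", 1.0, now - 90 * day),
           ("sports", 1.0, now - 60 * day),
           ("finance", 1.0, now - 1 * day)]
print(decayed_topic_interest(history, now))
```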
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information; be that information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or information which can give an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are preformed, that is they are built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders.

The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders, for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source. Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection.

The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate.

The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect and make the tweets available for exploration and analysis.
A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement's presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study.

The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
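The panel describes content scoring and keyword filtering only at a high level. Below is a minimal sketch of how an incoming tweet stream might be triaged for responders, assuming a hypothetical tracked-keyword list and an additive urgency score; the weights, fields, and threshold are illustrative, not the panel's actual method.

```python
# Minimal triage sketch: score incoming tweets against a tracked keyword
# list and flag high-urgency items for human responders. The keyword
# weights and threshold below are illustrative placeholders.
TRACKED_TERMS = {"#flood": 3, "trapped": 5, "help": 2, "evacuation": 4}
URGENCY_THRESHOLD = 6

def score_tweet(tweet: dict) -> int:
    """Additive content score based on tracked terms and simple signals."""
    text = tweet["text"].lower()
    score = sum(w for term, w in TRACKED_TERMS.items() if term in text)
    if tweet.get("has_location"):         # geotagged reports are more actionable
        score += 2
    if tweet.get("author_is_authority"):  # known responder/amplifier accounts
        score += 3
    return score

def triage(stream):
    """Yield only tweets urgent enough to place in front of responders."""
    for tweet in stream:
        if score_tweet(tweet) >= URGENCY_THRESHOLD:
            yield tweet

sample = [{"text": "Family trapped by #flood near bridge", "has_location": True},
          {"text": "Lovely weather today"}]
print([t["text"] for t in triage(sample)])
```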
Abstract:
This paper uses data from interviews with representatives of national and state organisations that have a policy interest in student-working in Australia. The interviewees included representatives from employer bodies and trade unions as well as government organisations. The data are used to discuss these stakeholders’ perceptions of the main advantages and disadvantages of work undertaken by young full-time students, and the ways in which organisations in the business and educational sectors have adapted their policies and practices to student-working. The analysis is then used to inform a discussion about whether this is a legitimate area for public policy formulation and, if so, what principles might underpin such policy and what some policies might look like.
Abstract:
Background: Transmission of Plasmodium vivax malaria is dependent on vector availability, biting rates and parasite development. In turn, each of these is influenced by climatic conditions. Correlations have previously been detected between seasonal rainfall, temperature and malaria incidence patterns in various settings. An understanding of seasonal patterns of malaria, and their weather drivers, can provide vital information for control and elimination activities. This research aimed to describe temporal patterns in malaria, rainfall and temperature, and to examine the relationships between these variables within four counties of Yunnan Province, China. Methods: Plasmodium vivax malaria surveillance data (1991–2006) and average monthly temperature and rainfall were acquired. Seasonal trend decomposition was used to examine secular trends and seasonal patterns in malaria. Distributed lag non-linear models were used to estimate the weather drivers of malaria seasonality, including the lag periods between weather conditions and malaria incidence. Results: There was a declining trend in malaria incidence in all four counties. Increasing temperature resulted in increased malaria risk in all four areas, while increasing rainfall resulted in increased malaria risk in one area and decreased risk in another. The lag times for these associations varied between areas. Conclusions: The differences detected between the four counties highlight the need for a local understanding of the seasonal patterns of malaria and their climatic drivers.
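The abstract names two analysis steps: seasonal trend decomposition and distributed lag non-linear models. A minimal sketch of the first step is shown below using the STL decomposition from `statsmodels` on a monthly incidence series; the synthetic data are illustrative only (the DLNM step is typically fitted with the R `dlnm` package and is not sketched here).

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Synthetic monthly malaria incidence for illustration only:
# a declining secular trend plus an annual seasonal cycle plus noise.
rng = np.random.default_rng(0)
months = pd.date_range("1991-01", "2006-12", freq="MS")
t = np.arange(len(months))
incidence = (50 - 0.15 * t                      # secular decline
             + 10 * np.sin(2 * np.pi * t / 12)  # annual seasonality
             + rng.normal(0, 2, len(months)))
series = pd.Series(incidence, index=months)

# Seasonal-trend decomposition (LOESS-based) with a 12-month period
result = STL(series, period=12).fit()
print(result.trend.head())     # secular trend component
print(result.seasonal.head())  # seasonal component
```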
Abstract:
Objective: To examine the space-time clustering of dengue fever (DF) transmission in Bangladesh using a geographical information system and spatial scan statistics (SaTScan). Methods: We obtained data on monthly suspected DF cases and deaths by district in Bangladesh for the period 2000–2009 from the Directorate General of Health Services. Population and district boundary data for each district were collected from the national census managed by the Bangladesh Bureau of Statistics. To identify space-time clusters of DF transmission, a discrete Poisson model was fitted using the SaTScan software. Results: The space-time distribution of DF transmission was clustered during three periods: 2000–2002, 2003–2005 and 2006–2009. Dhaka was the most likely cluster for DF in all three periods. Several other districts were significant secondary clusters. However, the geographical range of DF transmission appears to have declined in Bangladesh over the last decade. Conclusion: There were significant space-time clusters of DF in Bangladesh over the last decade. Our results should prompt future studies to explore how social and ecological factors may affect DF transmission, and should also be useful for improving DF control and prevention programs in Bangladesh.
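SaTScan's discrete Poisson model evaluates candidate space-time cylinders by a likelihood ratio. The sketch below is a deliberately simplified illustration (single-district cylinders only, no Monte Carlo significance testing, unlike the full SaTScan procedure); the toy counts are invented.

```python
import math

def poisson_llr(c, e, C):
    """Kulldorff log-likelihood ratio for a candidate cluster under the
    discrete Poisson model: c cases observed and e expected inside the
    cluster, C cases in total."""
    if c <= e:
        return 0.0          # only elevated-risk clusters are of interest
    if c == C:
        return c * math.log(c / e)
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

def scan(cases, pop, n_months, max_window=12):
    """Scan single-district time cylinders. `cases[d]` is a list of monthly
    counts for district d and `pop[d]` its population. Returns the
    highest-LLR cluster as (llr, (district, start_month, width))."""
    C = sum(sum(v) for v in cases.values())
    P = sum(pop.values())
    best = (0.0, None)
    for d, series in cases.items():
        for start in range(n_months):
            for width in range(1, min(max_window, n_months - start) + 1):
                c = sum(series[start:start + width])
                # expected cases: the cylinder's share of total person-time
                e = C * (pop[d] * width) / (P * n_months)
                llr = poisson_llr(c, e, C)
                if llr > best[0]:
                    best = (llr, (d, start, width))
    return best

# Toy example; real SaTScan use adds multi-district circles and Monte
# Carlo replication to attach p-values to the best cluster.
cases = {"Dhaka": [30, 45, 80, 60, 20, 15], "Khulna": [5, 6, 4, 5, 6, 5]}
pop = {"Dhaka": 9_000_000, "Khulna": 2_000_000}
print(scan(cases, pop, n_months=6, max_window=3))
```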
Abstract:
Currently, GNSS computing modes fall into two classes: network-based data processing and user receiver-based processing. A GNSS reference receiver station essentially contributes raw measurement data, in either the RINEX file format or as real-time data streams in the RTCM format, and very little computation is carried out by the reference station itself. The existing network-based processing modes, regardless of whether they are executed in real time or post-processed, are centralised or sequential. This paper describes a distributed GNSS computing framework that incorporates three GNSS modes: reference station-based, user receiver-based and network-based data processing. Raw data streams from each GNSS reference receiver station are processed in a distributed manner, i.e., either at the station itself or at a hosting data server/processor, to generate station-based solutions, or reference receiver-specific parameters. These may include the precise receiver clock, zenith tropospheric delay, differential code biases, ambiguity parameters and ionospheric delays, as well as line-of-sight information such as azimuth and elevation angles. Covariance information for the estimated parameters may also optionally be provided. In such a mode, nearby precise point positioning (PPP) or real-time kinematic (RTK) users can directly use the corrections from all or some of the stations for real-time precise positioning via a data server. At the user receiver, PPP and RTK techniques are unified under the same observation models, and the distinction lies in how the user receiver software deals with corrections from the reference station solutions and with ambiguity estimation in the observation equations. Numerical tests demonstrate good convergence behaviour for differential code bias and ambiguity estimates derived individually with single reference stations. With station-based solutions from three reference stations within distances of 22–103 km, the user receiver positioning results, under various schemes, show an accuracy improvement of the proposed station-augmented PPP and ambiguity-fixed PPP solutions with respect to standard float PPP solutions without station augmentation and ambiguity resolution. Overall, the proposed reference station-based GNSS computing mode can support PPP and RTK positioning services as a simpler alternative to the existing network-based RTK or regionally augmented PPP systems.
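A station-based solution, as described above, is essentially a bundle of receiver-specific parameters delivered to nearby PPP/RTK users. The dataclass below is a minimal sketch of such a message; the field set follows the parameters listed in the abstract, but the exact structure, names, and units are assumptions rather than a published format.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class SatelliteLineOfSight:
    """Per-satellite line-of-sight and ambiguity terms from one station."""
    prn: str                     # satellite identifier, e.g. "G05"
    azimuth_deg: float
    elevation_deg: float
    slant_iono_delay_m: float    # line-of-sight ionospheric delay
    ambiguity_cycles: Optional[float] = None  # carrier-phase ambiguity, if estimated/fixed

@dataclass
class StationSolution:
    """Station-based solution generated at (or for) one reference receiver."""
    station_id: str
    epoch_gps_seconds: float
    receiver_clock_bias_m: float          # precise receiver clock (metres)
    zenith_trop_delay_m: float            # zenith tropospheric delay
    code_biases_m: Dict[str, float]       # differential code biases per signal
    satellites: List[SatelliteLineOfSight] = field(default_factory=list)
    covariance: Optional[List[List[float]]] = None  # optional covariance of estimates
```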
Abstract:
Travel time estimation and prediction on motorways has long been a topic of research. Prediction modeling generally assumes that the estimation is perfect; yet no matter how good the prediction model is, errors in estimation can significantly deteriorate the accuracy and reliability of the prediction. Models have been proposed to estimate travel time from loop detector data. Generally, detectors are closely spaced (say, 500 m) and travel time can be estimated accurately. However, detectors are not always perfect, and even during normal operation a few detectors malfunction, increasing the spacing between the functional detectors. Under such conditions, the error in travel time estimation is significantly large and generally unacceptable. This research evaluates the in-practice travel time estimation model under different traffic conditions. It is observed that the existing models fail to accurately estimate travel time when detector spacing is large and during congestion shoulder periods. Addressing this issue, an innovative Hybrid model that only requires loop data for travel time estimation is proposed. The model is tested using simulation and is validated with real Bluetooth data from the Pacific Motorway, Brisbane. Results indicate that during non-free-flow conditions and larger detector spacing, the Hybrid model provides a significant improvement in the accuracy of travel time estimation.
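The abstract does not name the in-practice estimation model it evaluates; a common loop-detector baseline is the mid-point (half-distance) method sketched below, which also illustrates why accuracy degrades as detector spacing grows: each half-segment is assumed to be traversed at a single spot speed.

```python
def midpoint_travel_time(spacing_m, v_up_kmh, v_down_kmh):
    """Classic mid-point (half-distance) travel time estimate for one
    segment between two loop detectors: each half of the segment is
    assumed to be traversed at the nearest detector's spot speed."""
    v_up = max(v_up_kmh, 1.0) / 3.6     # km/h -> m/s, floored to avoid divide-by-zero
    v_down = max(v_down_kmh, 1.0) / 3.6
    half = spacing_m / 2.0
    return half / v_up + half / v_down  # seconds

# Example: 500 m spacing, free flow upstream, congestion downstream
print(midpoint_travel_time(500, 100, 30))  # ~39 s
```

With a 2 km gap caused by a failed detector, the same single-speed assumption has to hold over a much longer stretch, which is exactly where the method breaks down during the build-up and dissipation of congestion.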
Abstract:
The usage of the mobile Internet has increased tremendously within the last couple of years, and the vision of accessing information anytime, anywhere has thereby become more realistic and a dominant design principle for providing content. However, this study challenges this paradigm of unlimited and unrestricted access, and explores whether constraints and restrictions can positively influence the motivation and enticement of mobile users to engage with location-specific content. Restrictions, such as a particular time or location that gives a user access to content, may be used to foster participation and engagement, as well as to support content production and to enhance the user’s experience. In order to explore this, a Mobile Narrative and a Narrative Map were created. For the former, access to individual chapters of the story was restricted: authors can specify constraints, such as a location or time, which the reader must meet in order to read the story. This concept allows the creative writers of the story to exploit the fact that the reader’s context is known, intensifying the user experience by integrating this knowledge into the writing process. The latter, the Narrative Map, provides users with extracts from stories or information snippets about authors at relevant locations. In both concepts, a feedback channel was also integrated, on which location, time, and size constraints were imposed. In a user-centred design process involving authors and potential readers, these concepts were implemented, followed by an evaluation comprising four user studies. The results show that restrictions and constraints can indeed lead to more enticing and engaging user experiences, and that restricted contribution opportunities can lead to a higher motivation to participate as well as to an improved quality of submissions. These findings are relevant for future developments in the area of mobile narratives and creative writing, as well as for common mobile services that aim for enticing user experiences.
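The study's implementation details are not given in the abstract; as an illustrative sketch, a reader's context could be checked against an author-specified location radius and time-of-day window as follows (the constraint format and example coordinates are hypothetical).

```python
import math
from datetime import datetime, time as dtime

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points."""
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def chapter_unlocked(chapter, reader_lat, reader_lon, now=None):
    """True if the reader's context satisfies the author's constraints."""
    now = now or datetime.now()
    loc = chapter.get("location")        # (lat, lon, radius_m) or None
    if loc:
        lat, lon, radius = loc
        if haversine_m(reader_lat, reader_lon, lat, lon) > radius:
            return False
    window = chapter.get("time_window")  # (start, end) times of day, or None
    if window:
        start, end = window
        if not (start <= now.time() <= end):
            return False
    return True

# Example: a chapter readable only within 200 m of a hypothetical spot, in the evening
chapter = {"location": (-27.47, 153.02, 200),
           "time_window": (dtime(18, 0), dtime(23, 59))}
print(chapter_unlocked(chapter, -27.4705, 153.0201))
```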
Abstract:
Guerrilla theatre tends, by its very definition, to pop up unpredictably – it interrupts what people might see as the proper or typical flow of time, place and space. The subversive tenor of such work means that questions about ‘what has happened’ tend to the decidedly less polite form of ‘WTF’ as passersby struggle to make sense of, and move on from, moments in which accustomed narratives of action and interaction no longer apply. In this paper I examine examples of guerrilla theatre by performers with disabilities in terms of these ruptures in time, and the way they prompt reflection, reconfigure relations, or recede into traditional relations again, focusing particularly on comedian Laurence Clark. Many performers with disabilities – Bill Shannon, Katherine Araniello, Aaron Williamson, Ju Gosling, and others – find guerrilla-style interventions in public places apposite to their aesthetic and political agendas. They prompt passersby to reflect on their relationship to people with disabilities. They can be recorded for later dissection and display, teaching people something about the way social performers, social spectators and society as a whole deal with disability. In this paper, as I unpack Clark's work, I note that the embarrassment that characterises these encounters can be a flag of an ethical process taking place for passersby. Caught between two moments in which time, roles and relationships suddenly fail to flow along the smooth routes of socially determined habits, passersby’s frowns, gasps and giggles flag difficulties in dealing with questions about their attitude to disabled people that they do not now know how to answer. I consider the productivity, politics and performerly ethics of drawing passersby into such a process – a chaotic, challenging interstitial time in which a passerby’s choices become fodder for public consumption – in such a wholly public way.
Abstract:
The numerical solution in one space dimension of advection-reaction-diffusion systems with nonlinear source terms may incur a high computational cost when presently available methods are used. Numerous examples of finite volume schemes with high-order spatial discretisations, together with various techniques for approximating the advection term, can be found in the literature. Almost all such techniques result in a nonlinear system of equations as a consequence of the finite volume discretisation, especially when there are nonlinear source terms in the associated partial differential equation models. This work introduces a new technique that avoids having such nonlinear systems of equations generated by the spatial discretisation process when the nonlinear source terms in the model equations can be expanded in positive powers of the dependent function of interest. The basis of this method is a new linearisation technique for the temporal integration of the nonlinear source terms, as a supplementation of a more typical finite volume method. The resulting linear system of equations is shown to be both accurate and significantly faster than methods that necessitate the use of solvers for nonlinear systems of equations.
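The abstract does not state the exact linearisation; one standard way to realise the idea it describes is to lag all but one factor of each power of the unknown, as sketched below (an assumed form, for illustration only).

```latex
% Sketch of a lagged-coefficient linearisation (assumed form; the abstract
% does not specify the exact scheme). For a source term with a positive
% power series expansion
\[
  s(u) \;=\; \sum_{k \ge 1} a_k\, u^{k},
\]
% each power is lagged so that only one factor sits at the new time level
% over the step $t_m \to t_{m+1}$:
\[
  u^{k}\big|_{t_{m+1}} \;\approx\; \bigl(u^{(m)}\bigr)^{k-1} u^{(m+1)},
  \qquad\text{so}\qquad
  s\bigl(u^{(m+1)}\bigr) \;\approx\;
  \Bigl(\sum_{k \ge 1} a_k \bigl(u^{(m)}\bigr)^{k-1}\Bigr)\, u^{(m+1)},
\]
% which leaves the fully discretised finite volume equations linear in the
% unknowns $u^{(m+1)}$ at each time step.
```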
Abstract:
The acceptance of broadband ultrasound attenuation for the assessment of osteoporosis suffers from a limited understanding of ultrasound wave propagation through cancellous bone. It has recently been proposed that ultrasound wave propagation can be described by a concept of parallel sonic rays. This concept approximates the detected transmission signal as the superposition of all sonic rays that travel directly from the transmitting to the receiving transducer. The transit time of each ray is defined by the proportions of bone and marrow it propagates through. An ultrasound transit time spectrum describes the proportion of sonic rays having a particular transit time, effectively describing the lateral inhomogeneity of transit times over the surface of the receiving ultrasound transducer. The aim of this study was to provide proof of concept that a transit time spectrum may be derived from digital deconvolution of input and output ultrasound signals. We applied the active-set method deconvolution algorithm to determine the ultrasound transit time spectra in the three orthogonal directions of four cancellous bone replica samples, and compared the experimental data with predictions from computer simulation. The agreement between experimental and predicted ultrasound transit time spectrum analyses, derived from Bland–Altman analysis, ranged from 92% to 99%, thereby supporting the concept of parallel sonic rays for ultrasound propagation in cancellous bone. In addition to further validating the parallel sonic ray concept, this technique offers the opportunity for quantitative characterisation of the material and structural properties of cancellous bone not previously available using ultrasound.
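The deconvolution pipeline is not spelled out in the abstract; the snippet below is a minimal sketch of non-negative deconvolution using SciPy's `nnls` solver (the Lawson-Hanson active-set algorithm) on a synthetic input/output pair. The signal shapes and spike positions are invented for illustration.

```python
import numpy as np
from scipy.linalg import toeplitz
from scipy.optimize import nnls

def transit_time_spectrum(input_sig, output_sig):
    """Recover a non-negative transit time spectrum h, where the output
    signal is the convolution of the input signal with h, via
    non-negative least squares (an active-set method)."""
    n = len(output_sig)
    col = np.zeros(n)
    m = min(n, len(input_sig))
    col[:m] = input_sig[:m]
    A = toeplitz(col, np.zeros(n))   # column j = input delayed by j samples
    h, _ = nnls(A, np.asarray(output_sig, dtype=float))
    return h

# Synthetic check: a decaying input pulse and two transit-time components
x = np.exp(-np.arange(40) / 4.0)   # toy input signal
true_h = np.zeros(40)
true_h[5], true_h[12] = 1.0, 0.5   # spikes at 5 and 12 samples
y = np.convolve(x, true_h)[:40]    # simulated output signal
h = transit_time_spectrum(x, y)
print(np.argsort(h)[-2:])          # recovers indices 12 and 5
```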
Abstract:
The approach adopted for investigating the relationship between rainfall characteristics and the pollutant wash-off process is commonly based on parameters which represent the entire rainfall event. This does not permit investigating the influence of rainfall characteristics on different sectors of the wash-off process, such as the first flush, where there is a high pollutant wash-off load at the initial stage of the runoff event. This research study analysed the influence of rainfall characteristics on the pollutant wash-off process using two sets of innovative parameters created by partitioning wash-off and rainfall characteristics. It was found that the initial 10% of the wash-off process is closely linked to runoff-volume-related rainfall parameters, including rainfall depth and rainfall duration, while the remaining part of the wash-off process is primarily influenced by kinetic-energy-related rainfall parameters, namely rainfall intensity. These outcomes demonstrate that different sectors of the wash-off process are influenced by different segments of a rainfall event.
Abstract:
WHENEVER I talk to my students about the requisites for writing, I always tell them that they need at least two things: space and time. Time, which we frequently describe through verbs of motion such as ‘flow’ or ‘flux’, and space, which we usually view as emptiness or the absence of matter. That is, these two co-dependent dimensions are not only features of the physical world but mental constructs that are elementary to the faculty of cognition...
Abstract:
Real-time image analysis and classification onboard robotic marine vehicles, such as AUVs, is a key step in the realisation of adaptive mission planning for large-scale habitat mapping in previously unexplored environments. This paper describes a novel technique to train, process, and classify images collected onboard an AUV operating in relatively shallow waters with poor visibility and non-uniform lighting. The approach utilises Förstner feature detectors and Laws texture energy masks for image characterisation, and a bag-of-words approach for feature recognition. To improve classification performance we propose a usefulness gain that learns the importance of each histogram component for each class. Experimental results illustrate the performance of the system in characterising a variety of marine habitats and its ability to run on an AUV's main processor, making it suitable for real-time mission planning.
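The usefulness gain is described only at a high level; the sketch below shows one plausible reading, in which each visual-word histogram bin receives a per-class weight reflecting how distinctive that bin is for the class. The weighting rule, names, and toy data are hypothetical, not the paper's formulation.

```python
import numpy as np

def learn_usefulness(train_histograms, labels, n_classes, eps=1e-6):
    """Per-class bin weights: bins that are frequent in a class relative
    to the global average are treated as more useful for that class."""
    H = np.asarray(train_histograms, dtype=float)
    H /= H.sum(axis=1, keepdims=True) + eps      # normalise histograms
    global_mean = H.mean(axis=0) + eps
    gains = np.zeros((n_classes, H.shape[1]))
    for c in range(n_classes):
        class_mean = H[np.asarray(labels) == c].mean(axis=0) + eps
        gains[c] = class_mean / global_mean      # >1 where a bin is distinctive
    return gains

def classify(histogram, gains, class_prototypes):
    """Score each class with its own gain-weighted histogram similarity."""
    h = np.asarray(histogram, dtype=float)
    h /= h.sum() + 1e-6
    scores = [np.dot(gains[c] * h, class_prototypes[c])
              for c in range(len(class_prototypes))]
    return int(np.argmax(scores))

# Toy usage with 3 visual words and 2 habitat classes (illustrative only)
train = [[8, 1, 1], [7, 2, 1], [1, 8, 1], [2, 7, 1]]
labels = [0, 0, 1, 1]
gains = learn_usefulness(train, labels, n_classes=2)
protos = [np.mean([t for t, l in zip(train, labels) if l == c], axis=0)
          for c in range(2)]
protos = [p / p.sum() for p in protos]
print(classify([6, 2, 2], gains, protos))  # -> 0
```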
A finite volume method for solving the two-sided time-space fractional advection-dispersion equation
Abstract:
We present a finite volume method to solve the time-space two-sided fractional advection-dispersion equation on a one-dimensional domain. The spatial discretisation employs fractionally-shifted Grünwald formulas to discretise the Riemann-Liouville fractional derivatives at control volume faces in terms of function values at the nodes. We demonstrate how the finite volume formulation provides a natural, convenient and accurate means of discretising this equation in conservative form, compared to using a conventional finite difference approach. Results of numerical experiments are presented to demonstrate the effectiveness of the approach.
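The shifted Grünwald discretisation referred to above is standard; for the left-sided Riemann-Liouville derivative of order α ∈ (1, 2] with unit shift it reads as follows (the right-sided derivative is handled by mirror symmetry).

```latex
% Standard shifted Grunwald-Letnikov approximation of the left-sided
% Riemann-Liouville derivative of order $\alpha \in (1,2]$, shift $p = 1$:
\[
  \frac{\partial^{\alpha} u}{\partial x^{\alpha}}\bigg|_{x_i}
  \;\approx\;
  \frac{1}{h^{\alpha}} \sum_{k=0}^{i+1} g_k^{(\alpha)}\, u_{i-k+1},
  \qquad
  g_k^{(\alpha)} = (-1)^{k} \binom{\alpha}{k},
\]
% with the weights computed recursively as
\[
  g_0^{(\alpha)} = 1, \qquad
  g_k^{(\alpha)} = \Bigl(1 - \frac{\alpha + 1}{k}\Bigr) g_{k-1}^{(\alpha)} .
\]
```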