53 results for Information Science
Abstract:
To provide in-time reactions to a large volume of surveillance data, uncertainty-enabled event reasoning frameworks for CCTV- and sensor-based intelligent surveillance systems have been integrated to model and infer events of interest. However, most existing works do not consider decision making under uncertainty, which is important for surveillance operators. In this paper, we extend an event reasoning framework for decision support, enabling it to predict, rank and alarm threats from multiple heterogeneous sources.
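The abstract only outlines the decision-support extension, so the following is a minimal Python sketch of the kind of threat ranking and alarming it describes: candidate events are ordered by an expected-severity score and alarmed above a threshold. The event names, probabilities, severity weights and threshold are invented for illustration and are not taken from the paper's framework.

```python
# Minimal sketch (not the paper's framework): rank candidate threat events by
# expected severity, where probabilities come from uncertain event reasoning
# over multiple sources. All names and numbers below are illustrative.
from dataclasses import dataclass

@dataclass
class ThreatEvent:
    name: str
    probability: float  # fused belief that the event is occurring (0..1)
    severity: float     # operator-assigned impact weight (0..1)

def rank_threats(events, alarm_threshold=0.5):
    """Order events by expected severity and flag those above the threshold."""
    ranked = sorted(events, key=lambda e: e.probability * e.severity, reverse=True)
    alarms = [e for e in ranked if e.probability * e.severity >= alarm_threshold]
    return ranked, alarms

ranked, alarms = rank_threats([
    ThreatEvent("unattended luggage", 0.7, 0.9),
    ThreatEvent("loitering near exit", 0.4, 0.3),
])
print([e.name for e in ranked], [e.name for e in alarms])
```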
Abstract:
In modern semiconductor manufacturing facilities, maintenance strategies are increasingly shifting from traditional preventive maintenance (PM) approaches to more efficient and sustainable predictive maintenance (PdM) approaches. This paper describes the development of such an online PdM module for the endpoint detection system of an ion beam etch tool in semiconductor manufacturing. The developed system uses optical emission spectroscopy (OES) data from the endpoint detection system to estimate the remaining useful life (RUL) of lenses, a key detector component that degrades over time. Simulation studies on historical data for the use case demonstrate the effectiveness of the proposed PdM solution and the potential for improved sustainability that it affords.
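The abstract does not state how the RUL is estimated from the OES data; a common baseline for this kind of problem is to track a degradation index over time and extrapolate its trend to a failure threshold. The sketch below illustrates that baseline with a synthetic degradation index; the linear trend, threshold and data are assumptions, not the paper's method.

```python
# Hedged sketch: RUL estimation by extrapolating a degradation trend to a
# failure threshold. The degradation index, threshold and data are synthetic;
# this is a generic baseline, not the paper's OES-based model.
import numpy as np

def estimate_rul(times, degradation, failure_threshold):
    """Fit a linear trend to the degradation index and return the time
    remaining until it is predicted to cross the failure threshold."""
    slope, intercept = np.polyfit(times, degradation, deg=1)
    if slope <= 0:
        return np.inf  # no measurable degradation trend yet
    t_fail = (failure_threshold - intercept) / slope
    return max(t_fail - times[-1], 0.0)

# Illustrative run: a lens transmission index degrading over process cycles.
t = np.arange(0, 200, 10.0)
d = 0.02 * t + np.random.normal(0, 0.1, t.size)
print("Estimated RUL (cycles):", estimate_rul(t, d, failure_threshold=6.0))
```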
Abstract:
There is now a strong body of research suggesting that the form of the built environment can influence levels of physical activity, leading to increasing interest in incorporating health objectives into spatial planning and regeneration policies and projects. There have been a number of strands to this research, one of which has sought to develop “objective” measurements of the built environment using Geographic Information Science (GIS), involving measures of connectivity and proximity to compare the relative “walkability” of different neighbourhoods. The “walkability index” (e.g. Leslie et al. 2007; Frank et al. 2010) has become a popular indicator of the spatial distribution of those features of the built environment considered to have the greatest positive influence on levels of physical activity. The success of this measure rests on its ability to succinctly capture built environment correlates of physical activity using routinely available spatial data, including the use of road centre lines as the basis of a proxy for connectivity.
This paper discusses two key aspects of the walkability index. First, it follows the suggestion of Chin et al. (2008) that the use of a footpath network (where available), rather than road centre lines, may be far more effective in evaluating walkability. This may be particularly important for assessing changes in walkability arising from pedestrian-focused infrastructure projects, such as greenways. Second, the paper explores the implications of this choice for how connectivity can be measured. The paper takes six different measures of connectivity, first analyses the relationships between them, and then tests their correlation with actual levels of physical activity of local residents in Belfast, Northern Ireland. The analysis finds that intersection density and metric reach appear to be the best measures, and uses this finding to discuss the implications for developing tools that may better support decision-making in spatial planning.
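As a rough illustration of two of the steps described above (computing a connectivity measure per neighbourhood and testing its correlation with residents' physical activity), the sketch below computes intersection density and a Pearson correlation. The neighbourhood counts, areas and activity minutes are invented, and this is not the study's GIS workflow.

```python
# Hedged sketch: intersection density per neighbourhood and its correlation
# with measured physical activity. Data are invented; a real analysis would
# derive intersections from a footpath or road network in a GIS.
from statistics import correlation  # Python 3.10+

neighbourhoods = [
    # (intersection count, area in km^2, mean weekly minutes of walking)
    (120, 1.5, 210),
    (45,  2.0, 90),
    (200, 1.8, 260),
    (60,  2.5, 110),
]

densities = [count / area for count, area, _ in neighbourhoods]
activity = [minutes for _, _, minutes in neighbourhoods]

print("Intersection densities (per km^2):", densities)
print("Pearson r with walking minutes:", correlation(densities, activity))
```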
Abstract:
The increasing popularity of the social networking service Twitter has made it more involved in day-to-day communication, strengthening social relationships and information dissemination. Conversations on Twitter are now being explored as indicators within early warning systems to alert of imminent natural disasters such as earthquakes, and to aid prompt emergency responses to crime. Producers have near-limitless access to market perception from consumer comments on social media and microblogs. Targeted advertising can be made more effective based on user profile information such as demography, interests and location. While these applications have proven beneficial, the ability to effectively infer the location of Twitter users is of even greater value. However, accurately identifying where a message originated, or the author’s location, remains a challenge, and this continues to drive research in the area. In this paper, we survey a range of techniques applied to infer the location of Twitter users, from early work to the state of the art. We find significant improvements over time in granularity and accuracy, driven by refinements to algorithms and the inclusion of more spatial features.
Abstract:
Cloud data centres are implemented as large-scale clusters with demanding requirements for service performance, availability and cost of operation. As a result of scale and complexity, data centres typically exhibit large numbers of system anomalies resulting from operator error, resource over/under provisioning, hardware or software failures and security issues. These anomalies are inherently difficult to identify and resolve promptly via human inspection. Therefore, it is vital in a cloud system to have automatic system monitoring that detects potential anomalies and identifies their source. In this paper we present a lightweight anomaly detection tool (LADT) for Cloud data centres which combines extended log analysis with rigorous correlation of system metrics, implemented by an efficient correlation algorithm that requires neither training nor complex infrastructure setup. The LADT algorithm is based on the premise that there is a strong correlation between node-level and VM-level metrics in a cloud system. This correlation will drop significantly in the event of any performance anomaly at the node level, and a continuous drop in the correlation can indicate the presence of a true anomaly in the node. The log analysis in LADT assists in determining whether the correlation drop could be caused by naturally occurring cloud management activity such as VM migration, creation, suspension, termination or resizing. In this way, any potential anomaly alerts are reasoned about to prevent false positives that could be caused by the cloud operator’s activity. We demonstrate LADT with log analysis in a Cloud environment to show how the log analysis is combined with the correlation of system metrics to achieve accurate anomaly detection.
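The premise described above can be sketched concretely: compute a windowed correlation between a node-level metric and the aggregate of its VMs' metrics, and treat a window as a candidate anomaly when the correlation drops and the parsed logs show no management activity that would explain it. The window size, threshold and log-event representation below are assumptions for illustration, not details of LADT.

```python
# Hedged sketch of the correlation-drop idea attributed to LADT in the abstract:
# flag windows where node-level and aggregated VM-level metrics stop correlating,
# unless log analysis shows cloud management activity (migration, resize, ...)
# that would explain the drop. Window size, threshold and log format are assumed.
import numpy as np

MANAGEMENT_EVENTS = {"vm_migrate", "vm_create", "vm_suspend", "vm_terminate", "vm_resize"}

def candidate_anomalies(node_metric, vm_metrics, log_events, window=30, threshold=0.5):
    """node_metric: 1-D array (e.g. node CPU); vm_metrics: 2-D array, one row per VM;
    log_events: dict mapping window index -> set of event names from log analysis."""
    vm_aggregate = vm_metrics.sum(axis=0)
    anomalies = []
    for i in range(0, len(node_metric) - window + 1, window):
        r = np.corrcoef(node_metric[i:i + window], vm_aggregate[i:i + window])[0, 1]
        explained = log_events.get(i // window, set()) & MANAGEMENT_EVENTS
        if r < threshold and not explained:
            anomalies.append((i, r))  # correlation drop with no explaining activity
    return anomalies
```

Keeping the management-event check separate from the correlation test mirrors the two-stage reasoning the abstract describes: the metric correlation raises candidates, and the log analysis rules out the ones the operator's own activity can explain.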
Abstract:
Online information seeking has become normative practice among both academics and the general population. This study appraised the performance of eight databases in retrieving research pertaining to the influence of social networking sites on the mental health of young people. A total of 43 empirical studies on young people’s use of social networking sites and the mental health implications were retrieved. Scopus and SSCI had the highest sensitivity, while PsycINFO had the highest precision. Effective searching requires large generic databases, supplemented by subject-specific catalogues. The methodology developed here may provide inexperienced searchers, such as undergraduate students, with a framework to define a realistic scale of searching to undertake for a particular literature review or similar project.
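Sensitivity and precision here follow their usual definitions in systematic search evaluation: sensitivity is the fraction of all relevant studies that a database retrieves, and precision is the fraction of a database's retrieved records that are relevant. The short worked example below uses the 43 relevant studies mentioned in the abstract but invented per-database counts.

```python
# Hedged sketch: sensitivity and precision of a single database search, using
# the standard definitions; the per-database counts are invented, not the
# study's results (only the total of 43 relevant studies comes from the abstract).
def sensitivity(relevant_retrieved, total_relevant):
    return relevant_retrieved / total_relevant

def precision(relevant_retrieved, total_retrieved):
    return relevant_retrieved / total_retrieved

# Example: a database returns 500 records, 30 of which are among the 43
# relevant studies identified across all databases.
print(f"sensitivity = {sensitivity(30, 43):.2f}")  # ~0.70
print(f"precision   = {precision(30, 500):.2f}")   # 0.06
```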