949 results for Geo-statistical model


Relevance: 30.00%

Abstract:

Background elimination models are widely used in motion tracking systems. Our aim is to develop a system that performs reliably under adverse lighting conditions. In particular, this includes indoor scenes lit partly or entirely by diffuse natural light. We present a modified "median value" model in which the detection threshold adapts to global changes in illumination. The responses of several models are compared, demonstrating the effectiveness of the new model.
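To make the adaptive-threshold idea concrete, here is a minimal Python sketch of an approximate-median background model whose detection threshold scales with global brightness; the update rule, the gain-based adaptation, and all names are illustrative assumptions rather than the paper's exact algorithm:

```python
import numpy as np

def update_background(bg, frame, step=1.0):
    # Approximate running median: nudge each background pixel toward the
    # frame; the background converges to the per-pixel median over time.
    return bg + step * np.sign(frame - bg)

def foreground_mask(bg, frame, base_thresh=25.0):
    # Assumed adaptation rule: scale the detection threshold by the ratio
    # of global brightness, so diffuse illumination changes are not flagged.
    gain = frame.mean() / max(bg.mean(), 1e-6)
    diff = np.abs(frame.astype(float) - gain * bg.astype(float))
    return diff > base_thresh * gain
```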

Relevance: 30.00%

Abstract:

Nitric oxide is implicated in the pathogenesis of various neuropathologies characterized by oxidative stress. Although nitric oxide has been reported to be involved in the exacerbation of oxidative stress observed in several neuropathologies, existing data fail to provide a holistic description of how nitrergic pathobiology elicits neuronal injury. Here we provide a comprehensive description of the mechanisms contributing to nitric oxide-induced neuronal injury by global transcriptomic profiling. Microarray analyses were undertaken on RNA from murine primary cortical neurons treated with the nitric oxide generator DETA-NONOate (NOC-18, 0.5 mM) for 8–24 hrs. Biological pathway analysis focused on 3672 gene probes that showed at least a ±1.5-fold change in expression at a minimum of one of three time-points and passed statistical analysis (one-way ANOVA, P < 0.05). Numerous enriched processes potentially determining nitric oxide-mediated neuronal injury were identified from the transcriptomic profile: cell death, developmental growth and survival, cell cycle, calcium ion homeostasis, endoplasmic reticulum stress, oxidative stress, mitochondrial homeostasis, ubiquitin-mediated proteolysis, and GSH and nitric oxide metabolism. Our detailed time-course study of nitric oxide-induced neuronal injury allowed us to provide, for the first time, a holistic description of the temporal sequence of cellular events contributing to nitrergic injury. These data form a foundation for the development of screening platforms and define targets for intervention in neuropathologies where nitric oxide-mediated injury is causative.
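As a hedged sketch of the probe-selection step described above (a ±1.5-fold change at one or more time points plus one-way ANOVA at P < 0.05); the matrix layout and the use of three treated groups against a control are assumptions:

```python
import numpy as np
from scipy import stats

def select_probes(control, t_a, t_b, t_c, fold=1.5, alpha=0.05):
    """Each argument is a probes-x-replicates expression matrix; t_a..t_c
    are the three time points (hypothetical layout)."""
    keep = []
    for i in range(control.shape[0]):
        base = control[i].mean()
        ratios = [t[i].mean() / base for t in (t_a, t_b, t_c)]
        # Fold-change of at least +/-1.5 at a minimum of one time point
        fold_ok = any(r >= fold or r <= 1.0 / fold for r in ratios)
        _, p = stats.f_oneway(control[i], t_a[i], t_b[i], t_c[i])
        if fold_ok and p < alpha:
            keep.append(i)
    return keep
```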

Relevance: 30.00%

Abstract:

We propose a novel framework for large-scale scene understanding in static camera surveillance. Our techniques combine fast rank-1 constrained robust PCA to compute the foreground with non-parametric Bayesian models for inference. Clusters are extracted from foreground patterns using a joint multinomial+Gaussian Dirichlet process mixture model (DPM). Since the multinomial distribution is normalized, the Gaussian mixture distinguishes between similar spatial patterns with different activity levels (e.g., car vs. bike). We propose a modification of the decayed MCMC technique for incremental inference, providing the ability to discover a theoretically unlimited number of patterns in unbounded video streams. A promising by-product of our framework is online abnormal activity detection. A benchmark video and two surveillance videos, the longest being 140 hours, are used in our experiments. The patterns discovered are as informative as those from existing scene understanding algorithms. However, unlike existing work, we achieve near real-time execution and encouraging performance in abnormal activity detection.
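A minimal sketch of the rank-1 constrained robust PCA idea (a rank-1 background plus a sparse foreground), via naive alternating minimization; the paper's fast solver is not reproduced here, and the shrinkage scheme is an assumption:

```python
import numpy as np

def rank1_rpca(D, lam=0.05, iters=50):
    """D: pixels x frames matrix. Returns a rank-1 background L = s * u v^T
    and a sparse foreground S with D ~ L + S."""
    S = np.zeros_like(D, dtype=float)
    for _ in range(iters):
        # Best rank-1 fit to D - S: keep only the top singular pair.
        U, s, Vt = np.linalg.svd(D - S, full_matrices=False)
        L = s[0] * np.outer(U[:, 0], Vt[0])
        # Soft-threshold the residual to obtain the sparse foreground.
        R = D - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam, 0.0)
    return L, S
```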

Relevance: 30.00%

Abstract:

Previous experience and research indicate that the Pareto Principle (the 80/20 rule) has been widely used in many industries to achieve more with less. The study described in this paper concurs that this principle can be applied to improve estimating accuracy and efficiency, especially in the design development stage of projects. In fact, establishing an effective cost estimating model to improve accuracy and efficiency in the design development stage has attracted considerable research attention over several decades. For almost 40 years, research studies have indicated that using the 80/20 Principle is one such approach. However, most of these studies were based on assumptions, theoretical analysis, or questionnaire surveys. The objective of this research is to explore a logical and systematic method for establishing a cost estimating model based on the Pareto Principle. This paper includes an extensive literature review on cost estimating accuracy and efficiency in the construction industry, which points out the current gaps in knowledge and understanding of the topic. The review helps develop the direction of the research and explores a potential methodology for using the Pareto Principle in the new cost estimating model. The findings of this paper suggest that combining the Pareto Principle with statistical analysis could be used as a technique to improve the accuracy and efficiency of current estimating methods in the design development stage.
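One plausible building block for such a model is cost-significant-item selection: keep the few items that account for roughly 80% of total cost, then estimate the remainder statistically. A hedged sketch (the threshold and names are illustrative, not the paper's method):

```python
def cost_significant_items(item_costs, share=0.8):
    """item_costs: dict of item -> cost. Returns the smallest set of items
    whose cumulative cost reaches `share` of the total (Pareto cut-off)."""
    ranked = sorted(item_costs.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(cost for _, cost in ranked)
    chosen, running = [], 0.0
    for name, cost in ranked:
        chosen.append(name)
        running += cost
        if running >= share * total:
            break
    return chosen
```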

Relevance: 30.00%

Abstract:

This article describes the implementation of machine learning techniques that assist cycling experts in the crucial decision-making processes of athlete selection and strategic planning in the track cycling omnium. The omnium is a multi-event competition that was included in the Olympic Games for the first time in 2012. Presently, selectors and cycling coaches make decisions based on experience and opinion. They rarely have access to knowledge that helps predict athletic performance. The omnium presents a unique and complex decision-making challenge, as it is not clear what type of athlete is best suited to the omnium (e.g., sprint or endurance specialist), and tactical decisions made by the coach and athlete during the event have significant effects on the athlete's overall performance. In the present work, a variety of machine learning techniques were used to analyze omnium competition data from the World Championships since 2007. The analysis indicates that sprint events have slightly more influence than endurance-based events in determining the medalists. Using a probabilistic analysis, we created a performance prediction model that provides an unprecedented level of supporting information to assist coaches with strategic and tactical decisions during the omnium.
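The abstract does not specify the model family; one simple way to probe the relative influence of sprint versus endurance events is a logistic model over per-event finishing ranks. The feature names and layout below are hypothetical:

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical column layout: one finishing rank per omnium event.
EVENTS = ["flying_lap", "time_trial", "points_race",
          "scratch_race", "pursuit", "elimination"]

def fit_medal_model(X_ranks, y_medalist):
    model = LogisticRegression().fit(X_ranks, y_medalist)
    # A larger |coefficient| suggests that event's rank matters more
    # for the medal outcome (sprint vs. endurance comparison).
    for name, coef in zip(EVENTS, model.coef_[0]):
        print(f"{name:14s} {coef:+.3f}")
    return model
```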

Relevance: 30.00%

Abstract:

A study of pedestrians' steering behaviour through a built environment under normal circumstances is presented in this paper. The study focuses on the relationship between the environment and the pedestrian's walking trajectory. Owing to the ambiguity and vagueness of the relationship between pedestrians and the surrounding environment, a genetic fuzzy system is proposed for modelling and simulating the pedestrian's walking trajectory in response to environmental stimuli. We apply a genetic algorithm to search for the optimum membership function parameters of the fuzzy model. The proposed system receives the pedestrian's perceived stimuli from the environment as inputs and provides the angular change of direction at each step as the output. The environmental stimuli are quantified using the Helbing social force model. Attractive and repulsive forces within the environment represent the various environmental stimuli that influence the pedestrian's walking trajectory at each point in space. To evaluate the effectiveness of the proposed model, three experiments are conducted. The first experiment's results are validated against real walking trajectories of participants within a corridor. The second and third experiments' results are validated against simulated walking trajectories collected from the AnyLogic® software. Analysis and statistical measurement of the results indicate that the genetic fuzzy system with optimised membership functions produces more accurate and stable predictions of heterogeneous pedestrians' walking trajectories than the original fuzzy model.
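A compact sketch of the genetic-fuzzy idea: a toy two-rule fuzzy controller maps a perceived stimulus to an angular change, and a simple genetic algorithm tunes the triangular membership parameters against observed turns. The rule base, parameter ranges, and GA operators are all simplifying assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def tri(x, a, b, c):
    # Triangular membership function with corners a <= b <= c.
    return max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_turn(stimulus, params):
    # Two rules: "stimulus is LEFT -> turn -15 deg", "RIGHT -> +15 deg".
    w_left, w_right = tri(stimulus, *params[:3]), tri(stimulus, *params[3:])
    return (w_left * -15.0 + w_right * 15.0) / (w_left + w_right + 1e-9)

def sort_triples(P):
    # Keep each (a, b, c) triple ordered so memberships stay valid.
    return np.sort(P.reshape(-1, 2, 3), axis=2).reshape(-1, 6)

def evolve(stimuli, observed_turns, pop=40, gens=100):
    observed = np.asarray(observed_turns, float)
    def error(p):
        pred = np.array([fuzzy_turn(s, p) for s in stimuli])
        return np.mean((pred - observed) ** 2)
    P = sort_triples(rng.uniform(-1, 1, (pop, 6)))
    for _ in range(gens):
        fitness = np.array([error(p) for p in P])
        parents = P[np.argsort(fitness)[: pop // 2]]             # selection
        children = parents + rng.normal(0, 0.05, parents.shape)  # mutation
        P = np.vstack([parents, sort_triples(children)])
    return min(P, key=error)
```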

Relevance: 30.00%

Abstract:

Mobile eLearning (mLearning) can create a revolution in eLearning, given the popularity of smart mobile devices and applications. However, content is king in making this revolution happen. Moreover, for an effective mLearning system, analytical aspects such as the quality of content, the quality of results, and the performance of learners need to be addressed. This paper presents a framework for personal mLearning. We use a graph-based model, a bipartite graph, for content authentication and identification of the quality of results. Furthermore, we use a statistical estimation process, based on confidence intervals and hypothesis tests, as an analytical decision tool to establish the trustworthiness of the weights in the bipartite graph.
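A hedged sketch of the two ingredients named above: a learner-content bipartite graph whose edges accumulate observed scores, and a confidence interval on each edge's mean weight as a trust signal. The data layout and the trust rule are assumptions:

```python
import networkx as nx
import numpy as np
from scipy import stats

def build_bipartite(ratings):
    """ratings: iterable of (learner, content, score) triples."""
    G = nx.Graph()
    for learner, content, score in ratings:
        G.add_node(learner, bipartite=0)
        G.add_node(content, bipartite=1)
        scores = G.get_edge_data(learner, content, {"scores": []})["scores"]
        G.add_edge(learner, content, scores=scores + [score])
    return G

def weight_interval(scores, confidence=0.95):
    """t-based CI for the mean edge weight; a narrower interval suggests
    a more trustworthy weight (assumed decision rule)."""
    return stats.t.interval(confidence, len(scores) - 1,
                            loc=np.mean(scores), scale=stats.sem(scores))
```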

Relevance: 30.00%

Abstract:

In this paper, a hybrid model consisting of the fuzzy ARTMAP (FAM) neural network and the classification and regression tree (CART) is formulated. FAM is useful for tackling the stability–plasticity dilemma pertaining to data-based learning systems, while CART is useful for depicting its learned knowledge explicitly in a tree structure. By combining the benefits of both models, FAM–CART is capable of learning data samples stably and, at the same time, explaining its predictions with a set of decision rules. In other words, FAM–CART possesses two important properties of an intelligent system, i.e., learning in a stable manner (by overcoming the stability–plasticity dilemma) and extracting useful explanatory rules (by overcoming the opaqueness issue). To evaluate the usefulness of FAM–CART, six benchmark medical data sets from the UCI Machine Learning Repository and a real-world medical data classification problem are used. For performance comparison, a number of metrics, including accuracy, specificity, sensitivity, and the area under the receiver operating characteristic curve, are computed. The results are quantified with statistical indicators and compared with those reported in the literature. The outcomes positively indicate that FAM–CART is effective for undertaking data classification tasks. In addition to producing good results, it provides justifications of its predictions in the form of a decision tree, so that domain users can easily understand the predictions, making it a useful decision support tool.
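The abstract does not spell out how the tree is attached to FAM; one common pattern, sketched here under that assumption, is to fit a CART surrogate to the stable learner's predictions so its behaviour can be read as rules:

```python
from sklearn.tree import DecisionTreeClassifier, export_text

def explain_with_cart(fam_model, X, feature_names, max_depth=4):
    """fam_model stands in for any incrementally trained classifier
    (the FAM role); CART is fitted to its predictions to extract rules."""
    surrogate = DecisionTreeClassifier(max_depth=max_depth)
    surrogate.fit(X, fam_model.predict(X))
    print(export_text(surrogate, feature_names=list(feature_names)))
    return surrogate
```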

Relevance: 30.00%

Abstract:

Educational campaigning has received little attention in the literature. This study investigates long-term, organised urban campaigns that are collectively lobbying the Victorian State Government in Australia for a new public high school to be constructed in their suburb. A public high school is also known as a state school, a government school, or an ordinary comprehensive school. It receives the majority of its funding from the State and Federal Australian Governments and is generally regarded as 'free' education, in comparison to a private school. Whilst the campaigners frame their requests as being for a 'public school', their primary appeal is for a local school in their community. This study questions how collective campaigning for a locale-specific public school is influenced by geography, class and identity. In order to explore these campaigns, I draw on formative studies of middle-class school choice from Australian and United Kingdom perspectives (Campbell, Proctor, & Sherington, 2009; Reay, Crozier, & James, 2011). To think about the role of geography and space in these processes of choice, I apply Harvey's (1973) theory of absolute, relational and relative space. I use Bourdieu (1999b) as a sociological lens that is attentive to "site effects", and it is through this lens that I think about class as a "collection of properties" (Bourdieu, 1984, p. 106), actualised via mechanisms of identity and representation (Hall, 1996; Rose, 1996a, 1996b). This study addresses three distinct gaps in the literature. First, I focus attention on a contemporary middle-class choice strategy: collective campaigning for a public school. Research within this field is significantly under-developed, despite this choice strategy being on the rise. Second, previous research argues that certain middle-class choosers regard the local public school as "inferior" in some way (Reay et al., 2011, p. 111), merely acting as a "safety net" (Campbell et al., 2009, p. 5) and connected to the working-class chooser (Reay & Ball, 1997). The campaigners are characteristic of the middle-class school chooser, but they are purposefully and strategically seeking out the local public school. Therefore, this study builds on work by Reay et al. (2011) in thinking about "against-the-grain school choice", specifically within the Australian context. Third, this study uses visual and graphic methods to examine the influence of geography in the education market (Taylor, 2001). I see the visualisation of space and schooling offered in this dissertation as a key theoretical contribution of this study. I draw on a number of data sets, both qualitative and quantitative, to explore the research questions. I interviewed campaigners and attended campaign meetings as a participant observer; I collected statistical data from fifteen different suburbs and schools and conducted comparative analyses of each, displayed using visual graphs. This study uses maps created by a professional graphic designer and photographs by a professional photographer; I draw on publications by the campaigners themselves, such as surveys, reports and social media, as well as interviews with campaigners published in local or state newspapers. The multiple data sets enable an immersive and rich graphic ethnography. This study contributes by building on understandings of how particular sociological cohorts of choosers are engaging with, and choosing, the urban public school in Australia. It is relevant for policy making, in that it comes at a time of increasing privatisation and a move toward independent public schools. This study identifies cohorts of choosers that are employing individual and collective political strategies to obtain a specific school, and it identifies this cohort via explicit class-based characteristics and school choice behaviours. I use fresh theoretical and methodological approaches that emphasise space and geography, theorising geo-identity and the pseudo-private school.

Relevance: 30.00%

Abstract:

Assessing patterns of fisheries activity at a scale related to resource exploitation has received particular attention in recent times. However, acquiring data about the distribution and spatiotemporal allocation of catch and fishing effort in small-scale benthic fisheries remains challenging. Here, we used GIS-based spatio-statistical models to investigate the footprint of commercial diving events on blacklip abalone (Haliotis rubra) stocks along the south-west coast of Victoria, Australia from 2008 to 2011. Using abalone catch data matched with GPS locations, we found that catch per unit of fishing effort (CPUE) was not uniformly distributed in space or time across the study area. Spatial autocorrelation and hotspot analysis revealed significant spatiotemporal clusters of CPUE (with distance thresholds of hundreds of metres) among years, indicating the presence of CPUE hotspots focused on specific reefs. Cumulative hotspot maps indicated that certain reef complexes were consistently targeted across years, but with varying intensity, and often only a relatively small proportion of the full reef extent was targeted. Integrating CPUE with remotely sensed light detection and ranging (LiDAR) derived bathymetry data using a generalized additive mixed model corroborated that fishing pressure primarily coincided with shallow, rugose and complex components of reef structures. This study demonstrates that a geospatial approach is efficient in detecting patterns and trends in commercial fishing effort and its association with seafloor characteristics.
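A simplified stand-in for the hotspot step (the study's analysis uses Getis-Ord-style statistics over GPS-matched catch records; the radius and z-score form below are assumptions):

```python
import numpy as np

def cpue(catch_kg, effort_hours):
    # Catch per unit effort for each diving event.
    return np.asarray(catch_kg, float) / np.maximum(effort_hours, 1e-9)

def hotspot_z(values, coords, radius=300.0):
    """z-score of mean CPUE within `radius` metres of each dive site
    against the global field (simplified local hotspot statistic)."""
    v = np.asarray(values, float)
    P = np.asarray(coords, float)          # n x 2 projected coordinates
    mu, sd = v.mean(), v.std(ddof=1)
    z = np.empty(len(v))
    for i in range(len(v)):
        local = v[np.linalg.norm(P - P[i], axis=1) <= radius]
        z[i] = (local.mean() - mu) / (sd / np.sqrt(len(local)))
    return z
```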

Relevance: 30.00%

Abstract:

Visual notations are a key aspect of visual languages. They provide a direct mapping between the intended information and a set of graphical symbols. Visual notations are most often implemented using the low-level syntax of programming languages, which is time-consuming, error-prone, difficult to maintain and hardly human-centric. In this paper we describe an alternative approach to generating visual notations using by-example model transformations. In our new approach, a semantic mapping between model and view is implemented using model transformations. The notations resulting from this approach can be reused by mapping a variety of input data to their model and can be composed into different visualizations. Our approach is implemented in the CONVErT framework and has been applied to many visualization examples. Three case studies are presented in this paper: visualizing statistical charts, visualizing traffic data, and reusing the components of a Minard's map visualization. A detailed user study of our approach to reusing notations and generating visualizations is also presented: 80% of the participants agreed that the novel approach to visualization was easy to use, and 87% stated that they quickly learned to use the tool support.
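CONVErT's transformation language is not shown in the abstract; as a rough illustration of the by-example idea, a notation can be declared once as a template (the "example") and reused to render any model that matches it. Everything below is hypothetical:

```python
# A bar notation declared once as an example, then reused for any model.
BAR = '<rect x="{x:.0f}" y="{y:.1f}" width="20" height="{h:.1f}" fill="steelblue"/>'

def render_bar_chart(model, height=100.0):
    """model: list of (label, value) pairs -> SVG fragment."""
    top = max(value for _, value in model)
    bars = [BAR.format(x=30 * i, y=height - height * v / top, h=height * v / top)
            for i, (_, v) in enumerate(model)]
    return f'<svg width="{30 * len(model)}" height="{height:.0f}">{"".join(bars)}</svg>'
```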

Relevance: 30.00%

Abstract:

Visual notations are a key aspect of visual languages. They provide a direct mapping between the intended information and a set of graphical symbols. Visual notations are most often implemented using the low-level syntax of programming languages, which is time-consuming, error-prone, difficult to maintain and hardly human-centric. In this paper we describe an alternative approach to generating visual notations using by-example model transformations. In our new approach, a semantic mapping between model and view is implemented using model transformations. The notations resulting from this approach can be reused by mapping a variety of input data to their model and can be composed into different visualisations. Our approach is implemented in the CONVErT framework and has been applied to many visualisation examples. Two case studies, visualising statistical charts and visualising traffic data, are presented in this paper. A detailed user study of our approach to reusing notations and generating visualisations shows good reusability and general acceptance of the novel approach.

Relevance: 30.00%

Abstract:

BACKGROUND: Depression is widely considered to be an independent and robust predictor of Coronary Heart Disease (CHD); however, it is seldom considered in the context of formal risk assessment. We assessed whether the addition of depression to the Framingham Risk Equation (FRE) improved its accuracy for predicting 10-year CHD in a sample of women.

DESIGN: A prospective, longitudinal design comprising an age-stratified, population-based sample of Australian women collected between 1993 and 2011 (n=862).

METHODS: Clinical depressive disorder was assessed using the Structured Clinical Interview for Diagnostic and Statistical Manual of Mental Disorders (SCID-I/NP), using retrospective age-of-onset data. A composite measure of CHD included non-fatal myocardial infarction, unstable angina, coronary intervention, and cardiac death. Cox proportional-hazards regression models were fitted, and overall accuracy was assessed using area under the receiver operating characteristic (ROC) curve analysis.

RESULTS: ROC curve analyses revealed that adding baseline depression status to the FRE model improved its overall accuracy (AUC: 0.77, specificity: 0.70, sensitivity: 0.75) compared with the original FRE model (AUC: 0.75, specificity: 0.73, sensitivity: 0.67). However, when calibrated against the original model, the number of events predicted by the augmented version marginally over-estimated the true number observed.

CONCLUSIONS: The addition of a depression variable to the FRE improves the overall accuracy of the model for predicting 10-year CHD events in women, though it may over-estimate the number of events that actually occur. This model now requires validation in larger samples, as it could form a new CHD risk equation for women.
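A minimal sketch of the discrimination-plus-calibration comparison reported above, assuming the two risk scores and observed events are available as NumPy arrays (names are illustrative):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def compare_fre_models(risk_original, risk_augmented, observed_events):
    """Compare the original FRE score with the depression-augmented one."""
    auc_orig = roc_auc_score(observed_events, risk_original)
    auc_aug = roc_auc_score(observed_events, risk_augmented)
    print(f"AUC: {auc_orig:.2f} (original) vs {auc_aug:.2f} (augmented)")
    # Crude calibration check: sum of predicted risks vs observed events.
    print(f"predicted events {risk_augmented.sum():.0f} "
          f"vs observed {int(np.sum(observed_events))}")
    return auc_orig, auc_aug
```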

Relevance: 30.00%

Abstract:

Stability in clinical prediction models is crucial for transferability between studies, yet it has received little attention. The problem is paramount in high-dimensional data, which invites sparse models with feature selection capability. We introduce an effective method to stabilize sparse Cox models of time-to-event data using statistical and semantic structures inherent in Electronic Medical Records (EMR). Model estimation is stabilized using three feature graphs built from (i) Jaccard similarity among features; (ii) aggregation of the Jaccard similarity graph and a recently introduced semantic EMR graph; and (iii) Jaccard similarity among features transferred from a related cohort. Our experiments are conducted on two real-world hospital datasets: a heart failure cohort and a diabetes cohort. On two stability measures, the Consistency index and signal-to-noise ratio (SNR), our proposed methods significantly increased feature stability when compared with the baselines.
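A sketch of feature graph (i), Jaccard similarity among binary EMR features; how the graph enters the Cox objective (e.g., as a Laplacian-style penalty tying similar features' coefficients together) is an assumption here:

```python
import numpy as np

def jaccard_feature_graph(X_binary):
    """X_binary: patients x features 0/1 matrix. Returns the feature-by-
    feature Jaccard similarity matrix used as a stabilizing graph."""
    X = np.asarray(X_binary, dtype=bool)
    inter = X.T.astype(int) @ X.astype(int)        # co-occurrence counts
    counts = X.sum(axis=0)
    union = counts[:, None] + counts[None, :] - inter
    return inter / np.maximum(union, 1)
```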

Relevance: 30.00%

Abstract:

Artificial neural network (ANN) models are able to predict future events based on current data. The usefulness of an ANN lies in the capacity of the model to learn and to adjust its weights in response to previous errors during training. In this study, we carefully analyse existing methods in neuronal spike sorting. The current methods use clustering as a basis for establishing ground truths, which requires tedious procedures for feature selection and the evaluation of the selected features; even so, the accuracy of the clusters remains questionable. Here, we develop an ANN model that specifically addresses the present drawbacks and major challenges of neuronal spike sorting. New enhancements are introduced into the conventional backpropagation ANN for determining the network weights, input nodes, target node, and error calculation. Coiflet modelling of noise is employed to enhance the spike shape features and suppress noise. The ANN is used in conjunction with a special spiking-event detection technique to prioritize the targets. The proposed enhancements bolster the training concept and, on the whole, contribute to sorting neuronal spikes with close approximation.
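A hedged sketch of the Coiflet denoising step (the wavelet choice, decomposition level, and universal threshold are assumptions; the paper's exact scheme may differ):

```python
import numpy as np
import pywt

def coiflet_denoise(trace, wavelet="coif3", level=4):
    """Soft-threshold the detail coefficients of a Coiflet decomposition,
    suppressing noise while preserving spike shape features."""
    coeffs = pywt.wavedec(trace, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate
    thresh = sigma * np.sqrt(2 * np.log(len(trace)))    # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                              for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)
```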