821 results for Grid-based clustering approach
Abstract:
To establish itself within the host, Mycobacterium tuberculosis (Mtb) has evolved various means of attacking the host system. One crucial strategy is the exploitation of the host's iron resources. Obtaining and maintaining the required concentration of iron becomes a contest between the host and the pathogen, each pursuing it through complex molecular networks. This complexity makes it important to obtain a systems perspective of the host-pathogen interplay with respect to iron homeostasis. We have reconstructed a systems model comprising 92 components and 85 protein-protein or protein-metabolite interactions, captured as a set of 194 rules. Apart from the interactions, these rules also account for protein synthesis and decay, RBC circulation, and bacterial production and death rates. We have used a rule-based modelling approach, Kappa, to simulate the system separately under infection and non-infection conditions. Various perturbations, including knock-outs and dual perturbations, were carried out to monitor the behavioral changes of important proteins and metabolites. From this, key components, as well as the controlling factors critical for maintaining iron homeostasis, were identified. The model re-establishes the importance of the iron-dependent regulator (IdeR) in Mtb and of transferrin (Tf) in the host. Perturbations that increase iron storage appear to enhance nutritional immunity, and the analysis indicates how they can be harmful to the host. Instead, decreasing the rate of iron uptake by Tf may prove helpful. Simulation and perturbation studies help identify Tf as a possible drug target. Regulating the mycobactin (myB) concentration was also identified as a possible strategy to control bacterial growth.
The simulations thus provide significant insight into iron homeostasis and also for identifying possible drug targets for tuberculosis.
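The Kappa rules themselves are not listed in the abstract. As a minimal sketch of what rule-based stochastic simulation looks like, the snippet below runs a Gillespie-style simulation of two hypothetical rules, free-iron synthesis and bacterial iron uptake; the species names and rate constants are illustrative, not taken from the paper's 194-rule model.

```python
import random

# Gillespie-style simulation of two illustrative "rules":
#   rule 1:  -> free_iron            (host synthesis, rate k_syn)
#   rule 2:  free_iron -> mtb_iron   (pathogen uptake, rate k_uptake * free_iron)
# All species and rates here are hypothetical placeholders.
def simulate(k_syn=5.0, k_uptake=0.05, t_end=10.0, seed=42):
    random.seed(seed)
    t, free_iron, mtb_iron = 0.0, 20, 0
    while t < t_end:
        a1 = k_syn                    # propensity of rule 1
        a2 = k_uptake * free_iron     # propensity of rule 2
        a0 = a1 + a2
        t += random.expovariate(a0)   # waiting time to the next rule firing
        if random.random() * a0 < a1:
            free_iron += 1            # rule 1 fires
        else:
            free_iron -= 1            # rule 2 fires
            mtb_iron += 1
    return free_iron, mtb_iron

free_iron, mtb_iron = simulate()
```

A knock-out perturbation in this toy setting corresponds to setting one rate constant to zero and re-running the simulation.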
Abstract:
Applications in various domains often produce very large and frequently high-dimensional data. Successful algorithms must avoid the curse of dimensionality while remaining computationally efficient. Finding useful patterns in large datasets has attracted considerable interest in recent years. The primary goal of this paper is to implement an efficient hybrid tree-based clustering method that combines the CF-Tree and the KD-Tree, and to couple the clustering with KNN classification. The implementation must address several concerns: good accuracy, low space usage and low running time. We evaluate time and space efficiency, sensitivity to data input order, and clustering quality through several experiments.
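The abstract does not detail the CF-Tree/KD-Tree hybrid, so the sketch below shows only the KD-Tree half combined with KNN majority-vote classification, in plain Python with made-up 2-D points; it is an illustration of the combination, not the paper's algorithm.

```python
from collections import Counter

# Minimal k-d tree: each node is ((point, label), left_subtree, right_subtree).
def build(points, depth=0):
    if not points:
        return None
    axis = depth % len(points[0][0])
    points = sorted(points, key=lambda p: p[0][axis])
    mid = len(points) // 2
    return (points[mid], build(points[:mid], depth + 1), build(points[mid + 1:], depth + 1))

def knn(node, query, k, depth=0, best=None):
    """Collect the k nearest labelled points to `query`, with subtree pruning."""
    if best is None:
        best = []
    if node is None:
        return best
    (point, label), left, right = node
    dist = sum((a - b) ** 2 for a, b in zip(point, query))
    best.append((dist, label))
    best.sort()
    del best[k:]                       # keep only the k closest so far
    axis = depth % len(query)
    diff = query[axis] - point[axis]
    near, far = (left, right) if diff < 0 else (right, left)
    knn(near, query, k, depth + 1, best)
    if len(best) < k or diff ** 2 < best[-1][0]:   # far side may hold closer points
        knn(far, query, k, depth + 1, best)
    return best

def classify(tree, query, k=3):
    votes = [label for _, label in knn(tree, query, k)]
    return Counter(votes).most_common(1)[0][0]

data = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"), ((0.2, 0.1), "a"),
        ((5.0, 5.0), "b"), ((5.1, 4.9), "b"), ((4.9, 5.2), "b")]
tree = build(data)
```

Building once and answering many KNN queries against the tree is what makes the tree-plus-classifier combination attractive for large datasets.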
Abstract:
This paper presents a GPU implementation of normalized cuts for the road-extraction problem using panchromatic satellite imagery. The roads are extracted in three stages: pre-processing, image segmentation and post-processing. Initially, the image is pre-processed to improve tolerance by reducing clutter (mostly buildings, vegetation and fallow regions). The road regions are then extracted using the normalized cuts algorithm, a graph-based partitioning approach that focuses on extracting the global impression (perceptual grouping) of an image rather than local features. The segmented image is post-processed using the morphological operations of erosion and dilation. Finally, the road-extracted image is overlaid on the original image. A GPGPU (General-Purpose Graphics Processing Unit) approach has been adopted to implement the algorithm on the GPU for fast processing. A performance comparison of the proposed GPU implementation of normalized cuts with the earlier CPU implementation is presented. The results show that the computational advantage of the GPU implementation grows with image size. A qualitative and quantitative assessment of the segmentation results is also presented.
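The normalized-cut criterion scores a partition (A, B) of a weighted graph by cut(A,B)/assoc(A,V) + cut(A,B)/assoc(B,V). The sketch below evaluates that score on a toy four-node graph to show why a "perceptually good" partition wins; it is a worked illustration of the criterion, not the paper's GPU segmentation.

```python
# Evaluate the normalized-cut criterion for a partition of an undirected
# weighted graph given as a dict of edge weights.
def ncut(weights, part_a, part_b):
    def w(i, j):
        return weights.get((i, j), weights.get((j, i), 0.0))
    nodes = part_a | part_b
    cut = sum(w(i, j) for i in part_a for j in part_b)
    assoc_a = sum(w(i, j) for i in part_a for j in nodes)  # total edge weight touching A
    assoc_b = sum(w(i, j) for i in part_b for j in nodes)  # total edge weight touching B
    return cut / assoc_a + cut / assoc_b

# Two tight clusters {0,1} and {2,3} joined by one weak edge.
w = {(0, 1): 1.0, (2, 3): 1.0, (1, 2): 0.1}
good = ncut(w, {0, 1}, {2, 3})   # cuts only the weak edge
bad = ncut(w, {0, 2}, {1, 3})    # cuts both strong edges
```

The full algorithm finds a low-Ncut partition via an eigenvector of the normalized graph Laplacian; computing that eigenvector over pixel-affinity graphs is the expensive step that the GPU implementation accelerates.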
Abstract:
Scalable stream processing and continuous dataflow systems are gaining traction with the rise of big data due to the need for processing high-velocity data in near real time. Unlike batch processing systems such as MapReduce and workflows, static scheduling strategies fall short for continuous dataflows due to the variations in the input data rates and the need for sustained throughput. The elastic resource provisioning of cloud infrastructure is valuable for meeting the changing resource needs of such continuous applications. However, multi-tenant cloud resources introduce yet another dimension of performance variability that impacts the application's throughput. In this paper, we propose PLAStiCC, an adaptive scheduling algorithm that balances resource cost and application throughput using a prediction-based lookahead approach. It addresses not only variations in the input data rates but also those of the underlying cloud infrastructure. In addition, we propose several simpler static scheduling heuristics that operate in the absence of an accurate performance prediction model. These static and adaptive heuristics are evaluated through extensive simulations using performance traces obtained from the Amazon AWS IaaS public cloud. Our results show an improvement of up to 20% in the overall profit as compared to the reactive adaptation algorithm.
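PLAStiCC's actual heuristics are not spelled out in the abstract. As a hedged illustration of the prediction-based lookahead idea, the toy allocator below provisions the cheapest VM count whose capacity covers the predicted peak input rate over the lookahead window; the capacity numbers and rates are invented.

```python
# Toy lookahead allocator: given predicted input rates (messages/sec)
# for the next few scheduling intervals, pick the smallest number of
# identical VMs whose combined capacity covers the predicted peak.
# vm_capacity and max_vms are hypothetical parameters.
def plan_vms(predicted_rates, vm_capacity=100.0, max_vms=20):
    peak = max(predicted_rates)
    for n in range(1, max_vms + 1):
        if n * vm_capacity >= peak:
            return n          # cheapest allocation sustaining the peak
    return max_vms            # cap reached: best effort

allocation = plan_vms([120.0, 250.0, 180.0])
```

A reactive scheduler, by contrast, would resize only after observing a current-interval overload; planning against the predicted window is what lets a lookahead scheduler trade cost against sustained throughput ahead of time.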
Abstract:
The paper traces the different management practices adopted for Nigerian inland water bodies from the colonial era to independence. It observes that the full potential of these waters has never been realized over the years due to the absence of effective management. The replacement of traditional fisheries management by a centralized, top-down government approach after independence has not helped matters. Lately, the cooperative/community-based management approach has taken centre stage worldwide. It has been identified as offering the most viable and equitable route to optimum utilization of the fisheries resource. A community that senses security of tenure and enjoys some of the benefits of access control will actively take responsibility for management and enforcement. The paper draws on experiences from water bodies in Bangladesh, the Philippines, Benin Republic and Malawi that show sound management strategies which, if adopted for our small and medium-sized reservoirs and other water bodies, would help optimize, in a sustainable manner, the benefits from those water bodies.
Abstract:
Players cooperate in experiments more than game theory would predict. We introduce the ‘returns-based beliefs’ approach: each strategy is played with probability proportional to its expected returns relative to the total expected returns of all strategies. Using a decision-analytic solution concept, Luce’s (1959) probabilistic choice model, and ‘hyperpriors’ for ambiguity in players’ cooperability, our approach explains empirical observations in various classes of games, including the Prisoner’s and Traveler’s Dilemmas. Testing the closeness of fit of our model on Selten and Chmura’s (2008) data for completely mixed 2 × 2 games shows that, with loss aversion, returns-based beliefs explain the data better than other equilibrium concepts.
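A worked Prisoner's Dilemma example makes the idea concrete. In this simplified sketch the opponent is assumed to cooperate with probability 0.5 (standing in for the paper's hyperprior treatment), and each strategy's choice probability is its expected return over the total, per Luce's rule; the payoff numbers are the textbook T=5, R=3, P=1, S=0.

```python
# Returns-based beliefs for a 2-strategy game: play each strategy with
# probability proportional to its expected return (Luce's choice rule).
# The fixed opponent-cooperation probability is a simplification of the
# paper's hyperprior over cooperability.
def returns_based_beliefs(payoffs, p_opp_cooperates=0.5):
    p = p_opp_cooperates
    expected = {s: p * payoffs[s]["C"] + (1 - p) * payoffs[s]["D"]
                for s in payoffs}
    total = sum(expected.values())
    return {s: expected[s] / total for s in expected}

# Row player's payoffs against opponent playing C or D (T=5, R=3, P=1, S=0).
pd = {"C": {"C": 3, "D": 0}, "D": {"C": 5, "D": 1}}
beliefs = returns_based_beliefs(pd)
```

Here E[C] = 1.5 and E[D] = 3.0, so cooperation gets probability 1/3: strictly positive, unlike the pure-defection Nash prediction, which is the qualitative pattern seen in experiments.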
Abstract:
POMDP algorithms have made significant progress in recent years by allowing practitioners to find good solutions to increasingly large problems. Most approaches (including point-based and policy iteration techniques) operate by refining a lower bound of the optimal value function. Several approaches (e.g., HSVI2, SARSOP, grid-based approaches and online forward search) also refine an upper bound. However, approximating the optimal value function by an upper bound is computationally expensive and therefore tightness is often sacrificed to improve efficiency (e.g., sawtooth approximation). In this paper, we describe a new approach to efficiently compute tighter bounds by i) conducting a prioritized breadth first search over the reachable beliefs, ii) propagating upper bound improvements with an augmented POMDP and iii) using exact linear programming (instead of the sawtooth approximation) for upper bound interpolation. As a result, we can represent the bounds more compactly and significantly reduce the gap between upper and lower bounds on several benchmark problems. Copyright © 2011, Association for the Advancement of Artificial Intelligence. All rights reserved.
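The sawtooth approximation that the paper replaces with exact linear programming can be sketched directly: the upper bound at a belief b is the corner-value interpolation, lowered by each stored belief point's improvement scaled by min_i b(i)/b_j(i). The belief points and values below are invented for illustration.

```python
# Sawtooth upper-bound interpolation for a POMDP value upper bound.
# corner_vals[i] is the bound at the i-th corner belief e_i; `points`
# holds (belief, value) pairs of interior beliefs with tighter values.
def sawtooth(b, corner_vals, points):
    base = sum(bi * v for bi, v in zip(b, corner_vals))   # corner interpolation
    best = base
    for bj, vj in points:
        base_j = sum(bji * v for bji, v in zip(bj, corner_vals))
        # how far b can move toward bj while staying in the simplex
        ratio = min(bi / bji for bi, bji in zip(b, bj) if bji > 0)
        best = min(best, base + (vj - base_j) * ratio)
    return best

# Two states; corners valued 10 and 0; one stored point tightens the bound.
upper_at_point = sawtooth([0.5, 0.5], [10.0, 0.0], [([0.5, 0.5], 2.0)])
upper_nearby = sawtooth([0.25, 0.75], [10.0, 0.0], [([0.5, 0.5], 2.0)])
```

At a stored point the sawtooth bound is exact (2.0 here); away from it the bound loosens, which is the slack that exact LP interpolation over all stored points removes.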
Abstract:
Engineering changes (ECs) are essential in complex product development, and their management is a crucial discipline for engineering industries. Numerous methods have been developed to support EC management (ECM), of which the change prediction method (CPM) is one of the most established. This article contributes a requirements-based benchmarking approach to assess and improve existing methods; the CPM is selected for improvement. First, based on a comprehensive literature survey and insights from industrial case studies, a set of 25 requirements for change management methods is developed. Second, these requirements are used as benchmarking criteria to assess the CPM in comparison with seven other promising methods. Third, the best-in-class solutions for each requirement are investigated to derive improvement suggestions for the CPM. Finally, an enhanced ECM method which implements these improvements is presented. © 2013 The Author(s). Published by Taylor & Francis.
Abstract:
Cinnabar, an important traditional Chinese mineral medicine, has been widely used as an ingredient of Chinese patent medicines for sedative therapy. However, the pharmaceutical and toxicological effects of cinnabar, especially in the whole organism, have received little investigation. In this study, an NMR-based metabolomics approach was applied to investigate the toxicological effects of cinnabar after intragastric administration (dosed at 0.5, 2 and 5 g/kg body weight) in male Wistar rats.
Abstract:
This dissertation presents a series of irregular-grid-based numerical techniques for modeling seismic wave propagation in heterogeneous media. The study involves the generation of the irregular numerical mesh corresponding to the irregular-grid scheme, the discretized version of the equations of motion on the unstructured mesh, and irregular-grid absorbing boundary conditions. The resulting numerical technique has been used to generate synthetic data sets on realistic, complex geologic models for examining migration schemes. The discretization of the equations of motion and the modeling are based on the Grid Method. The key idea is to use the integral equilibrium principle in place of the per-grid operator of the finite-difference scheme and the variational formulation of the finite-element method. The irregular grid of a complex geologic model is generated by the Paving Method, which allows the grid spacing to vary according to meshing constraints. The grids have high quality at domain boundaries and contain equal numbers of nodes at interfaces, which avoids interpolation of parameters and variables. The irregular-grid absorbing boundary conditions are developed by extending the Perfectly Matched Layer (PML) method to rotated local coordinates. The split PML equations of the first-order system are derived using the integral equilibrium principle. The proposed scheme can build a PML boundary of arbitrary geometry in the computational domain, avoiding the special treatment of corners required in a standard PML method and saving considerable memory and computation cost. The numerical implementation demonstrates the desired qualities of the irregular-grid-based modeling technique.
In particular, (1) memory requirements and computational time are reduced by changing the grid spacing according to the local velocity; (2) arbitrary surface and interface topographies are described accurately, removing the artificial reflections that result from the staircase approximation of curved or dipping interfaces; and (3) the computational domain is significantly reduced by flexibly building curved artificial boundaries using the irregular-grid absorbing boundary conditions. The proposed irregular-grid approach is applied to reverse-time migration as the extrapolation algorithm. It can discretize the smoothed velocity model with irregular grids of variable scale, which helps reduce the computation cost. It can also handle data sets of arbitrary topography, and no field correction is needed.
Abstract:
A new approach is proposed for clustering time-series data. The approach can be used to discover groupings of similar object motions that were observed in a video collection. A finite mixture of hidden Markov models (HMMs) is fitted to the motion data using the expectation-maximization (EM) framework. Previous approaches for HMM-based clustering employ a k-means formulation, where each sequence is assigned to only a single HMM. In contrast, the formulation presented in this paper allows each sequence to belong to more than a single HMM with some probability, and the hard decision about the sequence class membership can be deferred until a later time when such a decision is required. Experiments with simulated data demonstrate the benefit of using this EM-based approach when there is more "overlap" in the processes generating the data. Experiments with real data show the promising potential of HMM-based motion clustering in a number of applications.
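The key quantity in the EM formulation is the soft assignment P(component | sequence). To keep the sketch short, the snippet below computes those responsibilities for a mixture of fully observed Markov chains rather than HMMs; the soft assignment (which a k-means-style formulation would harden into a single label) is the same kind of quantity, and the transition tables are invented.

```python
import math

# Log-likelihood of an observed symbol sequence under a first-order
# Markov chain given by a transition-probability table.
def log_likelihood(seq, trans):
    return sum(math.log(trans[a][b]) for a, b in zip(seq, seq[1:]))

# E-step: responsibilities P(component | sequence) for a mixture of
# sequence models, computed stably via the log-sum-exp trick.
def responsibilities(seq, components, weights):
    logs = [math.log(w) + log_likelihood(seq, t)
            for w, t in zip(weights, components)]
    m = max(logs)
    exps = [math.exp(l - m) for l in logs]
    total = sum(exps)
    return [e / total for e in exps]

sticky = {"A": {"A": 0.9, "B": 0.1}, "B": {"A": 0.1, "B": 0.9}}  # rarely switches
jumpy = {"A": {"A": 0.1, "B": 0.9}, "B": {"A": 0.9, "B": 0.1}}   # alternates
r = responsibilities("AAAABBBB", [sticky, jumpy], [0.5, 0.5])
```

An ambiguous sequence would receive responsibilities near 0.5 for each component, which is exactly the "overlap" case where the paper reports the soft EM formulation outperforming hard k-means-style assignment.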
Abstract:
This work proceeds from the assumption that a European environmental information and communication system (EEICS) is already established. In the context of primary users (land-use planners, conservationists, and environmental researchers), we ask what use may be made of the EEICS for building models and tools that are of use in building decision support systems for the land-use planner. The complex task facing the next generation of environmental and forest modellers is described, and a range of relevant modelling approaches is reviewed. These include visualization and GIS; statistical tabulation and database SQL; and MDA and OLAP methods. The major problem of non-comparability of the definitions and measures of forest area and timber volume is introduced, and the possibility of a model-based solution is considered. The possibility of using an ambitious and challenging biogeochemical modelling approach to understanding and managing European forests sustainably is discussed. It is emphasised that all modern methodological disciplines must be brought to bear, and that a heuristic hybrid modelling approach should be used to ensure that the benefits of practical empirical modelling approaches are utilised in addition to the scientifically well-founded and holistic ecosystem and environmental modelling. The data and information system required is likely to end up as a grid-based framework because of the heavy use of computationally intensive model-based facilities.
Abstract:
We address the problem of non-linearity in 2D shape modelling of a particular articulated object: the human body. The issue is partially resolved by applying a different Point Distribution Model (PDM) depending on the viewpoint. The remaining non-linearity is handled using Gaussian Mixture Models (GMMs). A dynamics-based clustering is proposed and carried out in the pose eigenspace. A fundamental question when clustering is how to determine the optimal number of clusters; in our view, the main aspect to evaluate is the mean Gaussianity. The resulting partitioning is then used to fit a GMM to each of the view-based PDMs, derived from a database of silhouettes and skeletons. Dynamic correspondences are then obtained between the Gaussian models of the four mixtures. Finally, we compare this approach with two other methods we previously developed to cope with non-linearity: a Nearest Neighbour (NN) classifier and Independent Component Analysis (ICA).
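The GMM fitting at the heart of the abstract is standard EM. The sketch below runs EM for a two-component one-dimensional mixture on synthetic data; the pose data in the paper are high-dimensional eigenspace coordinates, so this is a deliberately reduced illustration of the fitting step, not the paper's pipeline.

```python
import math, random

def gauss(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# EM for a two-component 1-D Gaussian mixture.
def fit_gmm(xs, iters=50):
    mu = [min(xs), max(xs)]           # crude but effective initialisation
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in xs:
            p = [pi[k] * gauss(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk, 1e-6)
    return mu, var, pi

random.seed(0)
xs = ([random.gauss(0.0, 1.0) for _ in range(200)] +
      [random.gauss(8.0, 1.0) for _ in range(200)])
mu, var, pi = fit_gmm(xs)
```

Refitting with different component counts and scoring each fit (e.g. by how Gaussian the resulting clusters are) is one simple way to approach the cluster-count question the abstract raises.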
Abstract:
We report on the migration of a traditional, single-architecture application to a grid application using heterogeneous resources. We focus on the use of the UK e-Science Level 2 Grid (UKL2G), which provides a heterogeneous collection of resources distributed within the UK. We discuss the solution architecture, the performance of our application and its future development as a grid-based application, and comment on the lessons we have learned in using a grid infrastructure for large-scale numerical problems.
Abstract:
A study was conducted to investigate the sediment health and water quality of the River Sagana, Kenya, as impacted by the local tanning industry. Chemical analysis identified the main chemical pollutants (pentachlorophenols and chromium), while a bioassay addressed pollutant bioavailability. The bioassay, which exploits the luminescence response of a lux-marked bacterial biosensor, was coupled with dehydrogenase and Daphnia magna tests to determine toxicity effects on sediments. Results highlighted the toxicity of the tannery effluent to the sediments at the point of discharge (64% of control bioluminescence), with gradual improvement downstream. There was a significant increase in dehydrogenase downstream, with the enzyme activity peaking at 600 m, also indicating a gradual reduction of toxicity. Biological oxygen demand (19.56 mg L(-1)), dissolved oxygen (3.97 mg L(-1)) and a high lethal dose value (85%) for D. magna also confirmed an initial stress at the point of discharge and recovery downstream. Optical density of surface water demonstrated an increase in suspended particulates and colour after the discharge point, eventually decreasing beyond 400 m. In conclusion, the study highlighted the importance of understanding the biogeochemistry of river systems impacted by industries discharging effluent into them, and the invaluable role of a biosensor-based ecotoxicological approach in addressing effluent hazards, particularly in relation to river sediments.