77 results for Probabilistic logic


Relevance:

20.00%

Publisher:

Abstract:

This paper presents a detailed study of the seismic pattern of the state of Karnataka and quantifies the seismic hazard for the entire state. In the present work, historical and instrumental seismicity data for Karnataka (within 300 km of the Karnataka political boundary) were compiled and hazard analysis was carried out based on these data. Geographically, Karnataka forms a part of peninsular India, which is tectonically identified as an intraplate region of the Indian plate. Due to the convergent movement of the Indian plate with the Eurasian plate, movements are occurring along major intraplate faults, resulting in seismic activity of the region; hence the hazard assessment of this region is very important. Apart from referring to the seismotectonic atlas for identifying faults and fractures, major lineaments in the study area were also mapped using satellite data. The earthquake events reported by various national and international agencies were collected up to 2009. Declustering of earthquake events was done to remove foreshocks and aftershocks. Seismic hazard analysis was carried out for the state of Karnataka using both deterministic and probabilistic approaches incorporating logic tree methodology. The peak ground acceleration (PGA) at rock level was evaluated for the entire state considering a grid size of 0.05° × 0.05°. The attenuation relations proposed for stable continental shield regions were used in evaluating the seismic hazard with appropriate weighting factors. Response spectra at rock level for important Tier II cities and Bangalore were evaluated. Contour maps showing the spatial variation of PGA values at bedrock are presented in this work.
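By way of illustration, a minimal sketch of the logic-tree combination step: PGA estimates from several attenuation relations are averaged with branch weights over a 0.05° grid. The two GMPE forms and all coefficients below are hypothetical placeholders, not the relations used in the study.

```python
import numpy as np

# Hypothetical ground-motion prediction equations (GMPEs): each returns
# ln(PGA in g) for magnitude M and hypocentral distance R (km).
# Coefficients are illustrative placeholders only.
def gmpe_a(M, R):
    return -3.5 + 1.1 * M - 1.3 * np.log(R + 10.0)

def gmpe_b(M, R):
    return -4.0 + 1.2 * M - 1.1 * np.log(R + 15.0)

# Logic-tree branches: (GMPE, weight); weights sum to 1.
branches = [(gmpe_a, 0.6), (gmpe_b, 0.4)]

def weighted_pga(M, R):
    """Combine branch estimates into a single weighted-mean PGA (g)."""
    return sum(w * np.exp(g(M, R)) for g, w in branches)

# Evaluate on a coarse 0.05-degree grid of sites around one hypothetical source.
lons = np.arange(74.0, 78.5, 0.05)
lats = np.arange(11.5, 18.5, 0.05)
src_lon, src_lat, M = 76.5, 13.0, 6.0
grid = np.zeros((lats.size, lons.size))
for i, lat in enumerate(lats):
    for j, lon in enumerate(lons):
        R = 111.0 * np.hypot(lat - src_lat, lon - src_lon)  # rough km conversion
        grid[i, j] = weighted_pga(M, max(R, 1.0))
print("Peak grid PGA (g):", grid.max())
```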

Relevance:

20.00%

Publisher:

Abstract:

Effective sharing of the last-level cache has a significant influence on the overall performance of a multicore system. We observe that existing solutions control cache occupancy at a coarse granularity, do not scale well to large core counts and, in some cases, lack the flexibility to support a variety of performance goals. In this paper, we propose Probabilistic Shared Cache Management (PriSM), a framework to manage the cache occupancy of different cores at cache-block granularity by controlling their eviction probabilities. The proposed framework requires only simple hardware changes to implement, can scale to larger core counts and is flexible enough to support a variety of performance goals. We demonstrate the flexibility of PriSM by computing the eviction probabilities needed to achieve goals like hit maximization, fairness and QoS. PriSM-HitMax improves performance by 18.7% over LRU and 11.8% over previously proposed schemes on a sixteen-core machine. PriSM-Fairness improves fairness over existing solutions by 23.3% along with a performance improvement of 19.0%. PriSM-QOS successfully achieves the desired QoS targets.
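A rough sketch of the core idea, controlling occupancy through eviction probabilities: on each miss, the victim-supplying core is drawn from a probability distribution that favors cores exceeding their target share. The target shares, block counts and the specific probability formula are illustrative assumptions, not the PriSM formulation itself.

```python
import random
from collections import Counter

NUM_BLOCKS = 1024                                       # shared-cache capacity in blocks
targets = {0: 0.40, 1: 0.30, 2: 0.20, 3: 0.10}          # hypothetical target shares

occupancy = Counter({c: NUM_BLOCKS // 4 for c in targets})  # start with equal shares

def eviction_probabilities():
    """Probability of picking each core as victim supplier on the next miss."""
    excess = {c: max(occupancy[c] - targets[c] * NUM_BLOCKS, 0.0) for c in targets}
    total = sum(excess.values())
    if total == 0:   # nobody over target: fall back to current occupancy shares
        return {c: occupancy[c] / NUM_BLOCKS for c in targets}
    return {c: e / total for c, e in excess.items()}

def handle_miss(requesting_core):
    """On a miss, evict a block owned by a core drawn from the eviction distribution."""
    probs = eviction_probabilities()
    victim = random.choices(list(probs), weights=list(probs.values()))[0]
    occupancy[victim] -= 1
    occupancy[requesting_core] += 1

# Drive with a synthetic miss stream biased toward core 0.
for _ in range(20000):
    handle_miss(random.choices([0, 1, 2, 3], weights=[5, 3, 2, 1])[0])
print({c: occupancy[c] / NUM_BLOCKS for c in targets})   # shares drift toward targets
```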

Relevance:

20.00%

Publisher:

Abstract:

In this paper, a method for tuning the membership functions of a Mamdani-type Fuzzy Logic Controller (FLC) using the Clonal Selection Algorithm (CSA), a model of the Artificial Immune System (AIS) paradigm, is examined. FLCs are designed for two problems: first, the linear cart-centering problem and, second, the highly nonlinear inverted-pendulum problem. The FLC tuned by the AIS is compared with the FLC tuned by a GA. In order to check the robustness of the designed FLCs, white noise was added to the system; further, the mass of the cart and the length and mass of the pendulum were changed. The FLCs were also tested in the presence of faulty rules. Finally, the Kruskal-Wallis test was performed to compare the performance of the GA and the AIS. An insight into the algorithms is also given by studying the effect of the important parameters of the GA and the AIS.
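A minimal sketch of clonal-selection tuning, under a stand-in fitness function: in the paper the affinity of each antibody (a vector of membership-function parameters) would come from a closed-loop simulation of the cart or pendulum, which is replaced here by a simple distance-to-ideal cost so the example stays self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each antibody encodes the centres of five triangular membership functions
# for the FLC's error input.  IDEAL is an assumed target placement used only
# to give the sketch a computable affinity.
IDEAL = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])

def affinity(antibody):
    return -np.sum((antibody - IDEAL) ** 2)        # higher is better

POP, CLONES, GENERATIONS = 20, 5, 100
population = rng.uniform(-2.0, 2.0, size=(POP, IDEAL.size))

for _ in range(GENERATIONS):
    fitness = np.array([affinity(a) for a in population])
    population = population[np.argsort(fitness)[::-1]]   # best first
    new_pop = [population[0]]                             # keep the best antibody
    for rank, parent in enumerate(population[:POP // 2]):
        # clone good antibodies; mutate clones more strongly for lower ranks
        sigma = 0.05 * (1 + rank)
        clones = parent + rng.normal(0.0, sigma, size=(CLONES, IDEAL.size))
        new_pop.append(max(clones, key=affinity))
    while len(new_pop) < POP:                              # random newcomers keep diversity
        new_pop.append(rng.uniform(-2.0, 2.0, IDEAL.size))
    population = np.array(new_pop)

print("Tuned membership centres:", np.round(population[0], 3))
```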

Relevance:

20.00%

Publisher:

Abstract:

Hollow microcapsules capable of disintegrating in response to dual biological stimuli have been synthesized from two FDA-approved drug molecules. The capsules, fabricated from protamine and chondroitin sulphate, disintegrate in the presence of either trypsin or hyaluronidase, enzymes which are documented to be simultaneously over-expressed under some pathological conditions.

Relevance:

20.00%

Publisher:

Abstract:

Crop type classification using remote sensing data plays a vital role in planning cultivation activities and in the optimal usage of the available fertile land. A reliable and precise classification of agricultural crops can thus help improve agricultural productivity. Hence, in this paper, a gene expression programming (GEP) based fuzzy logic approach for multiclass crop classification using multispectral satellite imagery is proposed. The purpose of this work is to utilize the optimization capabilities of GEP for tuning the fuzzy membership functions. The capabilities of GEP as a classifier are also studied. The proposed method is compared with Bayesian and maximum likelihood classifiers in terms of performance evaluation. From the results we can conclude that the proposed method is effective for classification.
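A sketch of just the fuzzy classification step, assuming Gaussian membership functions per spectral band: in the paper these membership parameters are tuned by GEP, whereas here they are simply estimated from synthetic training pixels.

```python
import numpy as np

def fit_memberships(X, y):
    """Per-class, per-band Gaussian membership parameters (mean, std)."""
    return {c: (X[y == c].mean(axis=0), X[y == c].std(axis=0) + 1e-6)
            for c in np.unique(y)}

def classify(x, params):
    def combined(c):
        mu, sd = params[c]
        m = np.exp(-0.5 * ((x - mu) / sd) ** 2)   # per-band memberships
        return m.min()                            # fuzzy AND across bands
    return max(params, key=combined)

# Synthetic 4-band pixels for three hypothetical crop classes.
rng = np.random.default_rng(1)
centres = {0: [60, 80, 40, 120], 1: [90, 70, 55, 100], 2: [50, 95, 70, 140]}
X = np.vstack([rng.normal(centres[c], 5.0, size=(100, 4)) for c in centres])
y = np.repeat(list(centres), 100)
params = fit_memberships(X, y)
preds = np.array([classify(x, params) for x in X])
print("Training accuracy:", (preds == y).mean())
```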

Relevance:

20.00%

Publisher:

Abstract:

Latent variable methods, such as Probabilistic Latent Component Analysis (PLCA), have been successfully used for the analysis of non-negative signal representations. In this paper, we formulate Probabilistic Latent Component Segmentation (PLCS), which models each time frame of a spectrogram as a spectral distribution. Given the signal spectrogram, the segmentation boundaries are estimated using a maximum-likelihood approach. For an efficient solution, the algorithm imposes a hard constraint that each segment is modelled by a single latent component. The hard constraint facilitates the solution of ML boundary estimation using dynamic programming. Unlike earlier ML segmentation techniques, the PLCS framework does not impose a parametric assumption. PLCS can be naturally extended to model coarticulation between successive phones. Experiments on the TIMIT corpus show that the proposed technique is promising compared with state-of-the-art speech segmentation algorithms.
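A minimal sketch of boundary estimation by dynamic programming, with the within-segment squared error around the segment mean standing in for the latent-component likelihood of PLCS; the segment count and the synthetic spectrogram are illustrative.

```python
import numpy as np

def segment_cost(frames):
    """Cost of explaining a run of frames with a single distribution (stand-in score)."""
    mu = frames.mean(axis=0)
    return float(((frames - mu) ** 2).sum())

def dp_segment(spectrogram, K):
    """Split T frames into K contiguous segments with minimum total cost."""
    T = spectrogram.shape[0]
    INF = float("inf")
    cost = [[INF] * (T + 1) for _ in range(K + 1)]
    back = [[0] * (T + 1) for _ in range(K + 1)]
    cost[0][0] = 0.0
    for k in range(1, K + 1):
        for t in range(k, T + 1):
            for s in range(k - 1, t):             # last segment covers frames s..t-1
                c = cost[k - 1][s] + segment_cost(spectrogram[s:t])
                if c < cost[k][t]:
                    cost[k][t], back[k][t] = c, s
    bounds, t = [], T
    for k in range(K, 0, -1):                     # backtrace segment start indices
        bounds.append(back[k][t])
        t = back[k][t]
    return sorted(bounds)[1:]                     # interior boundary frame indices

# Synthetic "spectrogram" with three clearly different stationary regions.
rng = np.random.default_rng(2)
spec = np.vstack([rng.normal(m, 0.1, size=(40, 20)) for m in (0.2, 0.8, 0.5)])
print("Estimated boundaries:", dp_segment(spec, K=3))   # expect near [40, 80]
```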

Relevance:

20.00%

Publisher:

Abstract:

Estimating a program's worst-case execution time (WCET) accurately and efficiently is a challenging task. Several programs exhibit phase behavior, wherein cycles per instruction (CPI) varies in phases during execution. Recent work has suggested the use of phases in such programs to estimate WCET with minimal instrumentation. However, the suggested model uses a function of mean CPI that has no probabilistic guarantees. We propose to use Chebyshev's inequality, which can be applied to any arbitrary distribution of CPI samples, to probabilistically bound the CPI of a phase. Applying Chebyshev's inequality to phases that exhibit high CPI variation leads to pessimistic upper bounds. We propose a mechanism that refines such phases into sub-phases based on program counter (PC) signatures collected using profiling, and also allows the user to control the variance of CPI within a sub-phase. We describe a WCET analyzer built on these lines and evaluate it with standard WCET and embedded benchmark suites on two different architectures for three chosen probabilities, p = {0.9, 0.95, 0.99}. For p = 0.99, refinement based on PC signatures alone reduces the average pessimism of the WCET estimate by 36% (77%) on Arch1 (Arch2). Compared to Chronos, an open-source static WCET analyzer, the average improvement in estimates obtained by refinement is 5% (125%) on Arch1 (Arch2). On limiting the variance of CPI within a sub-phase to {50%, 10%, 5%, 1%} of its original value, the average accuracy of the WCET estimate improves further to {9%, 11%, 12%, 13%}, respectively, on Arch1. On Arch2, the average accuracy of WCET improves to 159% when CPI variance is limited to 50% of its original value, and the improvement is marginal beyond that point.
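The bound itself is easy to state: with mean μ and standard deviation σ of the CPI samples, Chebyshev's inequality gives P(CPI ≥ μ + kσ) ≤ 1/k², so choosing k = 1/√(1−p) yields an upper bound on CPI that holds with probability at least p. A small sketch with made-up CPI samples:

```python
import math
import statistics

def cpi_upper_bound(samples, p):
    """Chebyshev-based CPI bound that holds with probability >= p."""
    mu = statistics.mean(samples)
    sigma = statistics.pstdev(samples)
    k = 1.0 / math.sqrt(1.0 - p)
    return mu + k * sigma

# Illustrative CPI samples; in the analyzer they come from profiling a (sub-)phase.
cpi_samples = [1.10, 1.15, 1.08, 1.30, 1.22, 1.12, 1.18, 1.25]
for p in (0.90, 0.95, 0.99):
    print(f"p={p:.2f}: CPI <= {cpi_upper_bound(cpi_samples, p):.3f}")
# The WCET estimate for a phase is then (bounded CPI) * (instruction count of the phase).
```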

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a novel approach to solve the ordinal regression problem using Gaussian processes. The proposed approach, probabilistic least squares ordinal regression (PLSOR), obtains the probability distribution over ordinal labels using a particular likelihood function. It performs model selection (hyperparameter optimization) using the leave-one-out cross-validation (LOO-CV) technique. PLSOR has the conceptual simplicity and ease of implementation of the least squares approach. Unlike the existing Gaussian process ordinal regression (GPOR) approaches, PLSOR does not use any approximation techniques for inference. We compare the proposed approach with the state-of-the-art GPOR approaches on synthetic and benchmark data sets. Experimental results show the competitiveness of the proposed approach.
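A sketch of the model-selection step only, assuming plain GP regression with an RBF kernel and the closed-form leave-one-out residuals; this is not the PLSOR likelihood, just an illustration of selecting a hyperparameter by LOO-CV.

```python
import numpy as np

def rbf_kernel(X, length_scale):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def loo_squared_error(X, y, length_scale, noise=0.1):
    """Mean squared leave-one-out residual of GP regression (closed form)."""
    K = rbf_kernel(X, length_scale) + noise * np.eye(len(y))
    K_inv = np.linalg.inv(K)
    alpha = K_inv @ y
    residuals = alpha / np.diag(K_inv)     # r_i = alpha_i / (K^-1)_ii
    return float((residuals ** 2).mean())

# Synthetic ordinal labels (1..5) treated as real values for the sketch.
rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.clip(np.round(1.5 * np.sin(X[:, 0]) + 3), 1, 5)

scores = {ls: loo_squared_error(X, y, ls) for ls in (0.1, 0.3, 1.0, 3.0)}
print("LOO errors:", {k: round(v, 3) for k, v in scores.items()})
print("Selected length scale:", min(scores, key=scores.get))
```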

Relevance:

20.00%

Publisher:

Abstract:

The yaw rate of a vehicle is highly influenced by the lateral forces generated at the tire contact patch to attain the desired lateral acceleration, and/or by external disturbances resulting from factors such as crosswinds, a flat tire or split-μ braking. The presence of the latter and the insufficiency of the former may lead to undesired yaw motion of a vehicle. This paper proposes a steer-by-wire system based on fuzzy logic as a yaw-stability controller for a four-wheeled road vehicle with active front steering. The dynamics governing the yaw behavior of the vehicle have been modeled in MATLAB/Simulink. The fuzzy controller receives the yaw-rate error of the vehicle and the steering signal given by the driver as inputs and generates an additional steering angle as output, which provides the corrective yaw moment. The results of simulations with various drive input signals show that the yaw-stability controller using fuzzy logic proposed in the current study performs well in situations involving unexpected yaw motion. The yaw-rate errors of a vehicle with the proposed controller are notably smaller than those of an uncontrolled vehicle, and the vehicle with the yaw-stability controller recovers lateral distance and the desired yaw rate more quickly than the uncontrolled vehicle.
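A minimal single-input Mamdani-style sketch (the controller in the paper also takes the driver's steering signal): triangular membership functions, a three-rule base that opposes the yaw-rate error, and centroid defuzzification. All set ranges are illustrative.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

# Input sets over yaw-rate error (rad/s) and output sets over corrective steer (rad).
error_sets = {"NEG": (-0.6, -0.3, 0.0), "ZERO": (-0.3, 0.0, 0.3), "POS": (0.0, 0.3, 0.6)}
steer_sets = {"NEG": (-0.10, -0.05, 0.0), "ZERO": (-0.05, 0.0, 0.05), "POS": (0.0, 0.05, 0.10)}
rules = {"NEG": "POS", "ZERO": "ZERO", "POS": "NEG"}   # oppose the yaw-rate error

def corrective_steer(err):
    u = np.linspace(-0.10, 0.10, 201)                  # output universe of discourse
    aggregated = np.zeros_like(u)
    for e_label, s_label in rules.items():
        strength = tri(np.array([err]), *error_sets[e_label])[0]
        aggregated = np.maximum(aggregated, np.minimum(strength, tri(u, *steer_sets[s_label])))
    if aggregated.sum() == 0:
        return 0.0
    return float((u * aggregated).sum() / aggregated.sum())   # centroid defuzzification

for err in (-0.4, -0.1, 0.0, 0.1, 0.4):
    print(f"yaw-rate error {err:+.1f} rad/s -> additional steer {corrective_steer(err):+.4f} rad")
```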

Relevance:

20.00%

Publisher:

Abstract:

This paper highlights the seismic microzonation carried out for a nuclear power plant site. Nuclear power plants are considered to be among the most important and critical structures, designed to withstand all natural disasters. Seismic microzonation is the process of demarcating a region into individual areas having different levels of various seismic hazards. This helps in identifying regions of high seismic hazard, which is vital for engineering design and land-use planning. The main objective of this paper is to carry out the seismic microzonation of a nuclear power plant site situated on the east coast of South India, based on the spatial distribution of the hazard index value. The hazard index represents the consolidated effect of all major earthquake hazards and hazard-influencing parameters. The present work will provide new directions for assessing the seismic hazards of new power plant sites in the country. The major seismic hazards considered for the evaluation of the hazard index are (1) the intensity of ground shaking at bedrock, (2) site amplification, (3) liquefaction potential and (4) the predominant frequency of the earthquake motion at the surface. The intensity of ground shaking in terms of peak horizontal acceleration (PHA) was estimated for the study area using both deterministic and probabilistic approaches with logic tree methodology. The site characterization of the study area was carried out using the multichannel analysis of surface waves test and available borehole data. One-dimensional ground response analysis was carried out at major locations within the study area to evaluate PHA and spectral accelerations at the ground surface. Based on standard penetration test data, deterministic as well as probabilistic liquefaction hazard analyses were carried out for the entire study area. Finally, all the major earthquake hazards estimated above, along with other significant parameters representing the local geology, were integrated using the analytic hierarchy process (AHP), and a hazard index map for the study area was prepared. Maps showing the spatial variation of the seismic hazards (intensity of ground shaking, liquefaction potential and predominant frequency) and the hazard index are presented in this work.
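A sketch of the final integration step: theme weights derived from an AHP pairwise-comparison matrix via its principal eigenvector, then a weighted sum of normalized hazard layers. The comparison values and layer values are illustrative, not those of the study.

```python
import numpy as np

themes = ["ground shaking", "site amplification", "liquefaction", "predominant frequency"]
# pairwise[i, j] = importance of theme i relative to theme j (Saaty 1-9 scale, assumed values)
pairwise = np.array([[1.0, 2.0, 3.0, 5.0],
                     [1/2, 1.0, 2.0, 3.0],
                     [1/3, 1/2, 1.0, 2.0],
                     [1/5, 1/3, 1/2, 1.0]])

# AHP weights = normalized principal eigenvector of the comparison matrix.
eigvals, eigvecs = np.linalg.eig(pairwise)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()

# Hypothetical normalized (0-1) hazard values at three grid cells.
layers = np.array([[0.8, 0.6, 0.3, 0.5],
                   [0.4, 0.7, 0.9, 0.2],
                   [0.2, 0.3, 0.1, 0.4]])
hazard_index = layers @ weights
for theme, w in zip(themes, weights):
    print(f"{theme:22s} weight = {w:.3f}")
print("Hazard index per cell:", np.round(hazard_index, 3))
```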

Relevance:

20.00%

Publisher:

Abstract:

This study presents the response of a vertically loaded pile in undrained clay considering spatially distributed undrained shear strength. The probabilistic study is performed considering the undrained shear strength as a random variable, and the analysis is conducted using random field theory. The inherent soil variability is considered as the source of variability, and the field is modeled as a two-dimensional non-Gaussian homogeneous random field. The random field is simulated using the Cholesky decomposition technique within a finite difference program, and the Monte Carlo simulation approach is used for the probabilistic analysis. The influence of the variance and spatial correlation of the undrained shear strength on the ultimate capacity of the pile, taken as the sum of the ultimate skin friction and end bearing resistance, is examined. It is observed that the coefficient of variation and the spatial correlation distance are the most important parameters affecting the pile's ultimate capacity.
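A minimal sketch of the random-field Monte Carlo loop, assuming a one-dimensional lognormal field along the pile and a simplified alpha-method skin friction in place of the finite difference model; all soil and pile parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
depths = np.linspace(0.5, 20.0, 40)        # node depths along a 20 m pile (m)
mean_su, cov_su, theta = 50.0, 0.3, 2.0    # mean s_u (kPa), COV, correlation distance (m)

# Lognormal parameters and exponential correlation matrix of the underlying Gaussian field.
sigma_ln = np.sqrt(np.log(1 + cov_su ** 2))
mu_ln = np.log(mean_su) - 0.5 * sigma_ln ** 2
corr = np.exp(-2.0 * np.abs(depths[:, None] - depths[None, :]) / theta)
L = np.linalg.cholesky(corr + 1e-10 * np.eye(len(depths)))

def skin_friction(su, diameter=0.5, alpha=0.5):
    """Simplified alpha-method shaft capacity (kN) from an s_u profile (kPa)."""
    dz = depths[1] - depths[0]
    return float(np.sum(alpha * su * np.pi * diameter * dz))

capacities = []
for _ in range(2000):                        # Monte Carlo simulation
    g = L @ rng.standard_normal(len(depths)) # correlated standard-normal field
    su_field = np.exp(mu_ln + sigma_ln * g)  # correlated lognormal s_u profile
    capacities.append(skin_friction(su_field))
capacities = np.array(capacities)
print(f"Mean capacity = {capacities.mean():.0f} kN, COV = {capacities.std() / capacities.mean():.2f}")
```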

Relevance:

20.00%

Publisher:

Abstract:

The work presented in this paper involves the stochastic finite element analysis of composite-epoxy adhesive lap joints using Monte Carlo simulation. A set of composite adhesive lap joints were prepared and loaded until failure to obtain their strength. The peel and shear strains in the bond-line region at different levels of load were obtained using digital image correlation (DIC). The corresponding stresses were computed assuming a plane strain condition. The finite element model was verified by comparing the numerical and experimental stresses; the stresses exhibited similar behavior and a good correlation was obtained. The finite element model was then used to perform the stochastic analysis using Monte Carlo simulation. The parameters influencing the stress distribution were provided as random input variables and the resulting probabilistic variation of the maximum peel and shear stresses was studied. It was found that the adhesive modulus and bond-line thickness had a significant influence on the maximum stress variation. While the adherend thickness also had a major influence, the effect of variation in the longitudinal and shear moduli on the stresses was found to be small. (C) 2014 Elsevier Ltd. All rights reserved.
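A sketch of the Monte Carlo propagation step, with a classical Volkersen-type shear-stress concentration for a balanced single-lap joint standing in for the verified finite element model; the nominal values and scatter are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 5000
P, width, overlap = 5000.0, 25.0, 25.0          # load (N), joint width and overlap (mm)

def max_shear_stress(G_a, t_a, E_adherend, t_adherend):
    """Volkersen-style peak adhesive shear stress (MPa) for a balanced joint."""
    lam = np.sqrt(2.0 * G_a / (E_adherend * t_adherend * t_a))   # 1/mm
    tau_avg = P / (width * overlap)                              # MPa
    return tau_avg * (lam * overlap / 2.0) / np.tanh(lam * overlap / 2.0)

# Random input variables: normal scatter about assumed nominal values.
G_a   = rng.normal(1200.0, 120.0, N)    # adhesive shear modulus (MPa)
t_a   = rng.normal(0.2, 0.03, N)        # bond-line thickness (mm)
E_adh = rng.normal(45000.0, 2000.0, N)  # adherend longitudinal modulus (MPa)
t_adh = rng.normal(2.0, 0.05, N)        # adherend thickness (mm)

tau_max = max_shear_stress(G_a, t_a, E_adh, t_adh)
print(f"Max shear stress: mean = {tau_max.mean():.2f} MPa, "
      f"COV = {tau_max.std() / tau_max.mean():.3f}")
```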

Relevance:

20.00%

Publisher:

Abstract:

During the early stages of operation, high-tech startups need to overcome the liability of newness and manage a high degree of uncertainty. Several high-tech startups fail due to their inability to deal with skeptical customers, underdeveloped markets and limited resources while selling an offering that has no precedent. This paper leverages the principles of effectuation (a logic of entrepreneurial decision making under uncertainty) to explain the journey from creation to survival of high-tech startups in an emerging economy. Based on the 99tests.com case study, this paper suggests that early-stage high-tech startups in emerging economies can increase their probability of survival by adopting the principles of effectuation.

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a probabilistic prediction based approach for providing Quality of Service (QoS) to delay-sensitive traffic in the Internet of Things (IoT). A joint packet scheduling and dynamic bandwidth allocation scheme is proposed to provide service differentiation and preferential treatment to delay-sensitive traffic. The scheduler focuses on reducing the waiting time of high-priority delay-sensitive services in the queue while simultaneously keeping the waiting time of other services within tolerable limits. The scheme uses the difference between the probabilities of the average queue length of high-priority packets in the previous and current cycles to determine the probability of the average weight required in the current cycle. This offers optimized bandwidth allocation to all services by avoiding the allocation of excess resources to high-priority services while still guaranteeing service for them. The performance of the algorithm is investigated using MPEG-4 traffic traces under different system loads. The results show improved performance with respect to the waiting time for scheduling high-priority packets while simultaneously keeping the waiting time and packet loss of other services within tolerable limits. Crown Copyright (C) 2015 Published by Elsevier B.V.
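A simplified sketch of the adaptation idea, not the paper's probabilistic formulation: each cycle, the weight of the delay-sensitive queue is nudged according to the change in its backlog, bounded so other services keep a minimum share. Queue sizes, cycle length and the adaptation gain are assumed values.

```python
from collections import deque
import random

high_q, low_q = deque(), deque()
weight_high, prev_backlog = 0.5, 0
CYCLE_SLOTS = 100          # packets served per cycle

random.seed(6)
for cycle in range(20):
    # New arrivals this cycle: bursty delay-sensitive traffic plus background traffic.
    high_q.extend(range(random.randint(20, 60)))
    low_q.extend(range(random.randint(30, 50)))

    # Serve packets according to the current weight split.
    for _ in range(int(weight_high * CYCLE_SLOTS)):
        if high_q: high_q.popleft()
    for _ in range(CYCLE_SLOTS - int(weight_high * CYCLE_SLOTS)):
        if low_q: low_q.popleft()

    # Adapt the weight from the change in the high-priority backlog, keeping it bounded.
    backlog = len(high_q)
    weight_high = min(0.9, max(0.1, weight_high + 0.005 * (backlog - prev_backlog)))
    prev_backlog = backlog
    print(f"cycle {cycle:2d}: high backlog={len(high_q):3d} "
          f"low backlog={len(low_q):3d} weight_high={weight_high:.2f}")
```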