950 results for false memories


Relevance: 10.00%

Summary:

Dynamic analysis techniques have been proposed to detect potential deadlocks. Analyzing and comprehending each potential deadlock to determine whether it is feasible in a real execution requires significant programmer effort. Moreover, empirical evidence shows that existing analyses are quite imprecise. This imprecision further wastes the manual effort invested in reasoning about non-existent defects. In this paper, we address the imprecision of existing analyses and the subsequent manual effort needed to reason about deadlocks. We propose a novel approach to deadlock detection by designing a dynamic analysis that intelligently leverages execution traces. To reduce the manual effort, we replay the program, making the execution follow a schedule derived from the observed trace. For a real deadlock, its feasibility is automatically verified if the replay causes the execution to deadlock. We have implemented our approach as part of WOLF and have analyzed many large (up to 160 KLoC) Java programs. Our experimental results show that we are able to identify 74% of the reported defects as true (or false) positives automatically, leaving very few defects for manual analysis. The overhead of our approach is negligible, making it a compelling tool for practical adoption.
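
As a rough illustration of the kind of trace-based reasoning described above, the sketch below builds a lock-order graph from an observed acquire/release trace and flags cycles as potential deadlocks. The trace format, event names, and the simple cycle check are assumptions for illustration; the schedule-derivation and replay step that WOLF uses to confirm feasibility is not reproduced here.

```python
# Minimal sketch (not the WOLF implementation): flag potential deadlocks by
# building a lock-order graph from an observed execution trace and reporting cycles.
from collections import defaultdict

def lock_order_edges(trace):
    """trace: list of (thread_id, op, lock_id) with op in {'acq', 'rel'} (assumed format)."""
    held = defaultdict(list)     # thread -> locks currently held
    edges = set()                # (l1, l2): some thread acquired l2 while holding l1
    for tid, op, lock in trace:
        if op == 'acq':
            for h in held[tid]:
                edges.add((h, lock))
            held[tid].append(lock)
        elif op == 'rel' and lock in held[tid]:
            held[tid].remove(lock)
    return edges

def has_cycle(edges):
    """Iterative-free DFS cycle check on the lock-order graph."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
    WHITE, GREY, BLACK = 0, 1, 2
    color = defaultdict(int)
    def dfs(node):
        color[node] = GREY
        for nxt in graph[node]:
            if color[nxt] == GREY or (color[nxt] == WHITE and dfs(nxt)):
                return True
        color[node] = BLACK
        return False
    return any(color[n] == WHITE and dfs(n) for n in list(graph))

# Classic inconsistent lock order: T1 takes A then B, T2 takes B then A.
trace = [(1, 'acq', 'A'), (1, 'acq', 'B'), (1, 'rel', 'B'), (1, 'rel', 'A'),
         (2, 'acq', 'B'), (2, 'acq', 'A'), (2, 'rel', 'A'), (2, 'rel', 'B')]
print(has_cycle(lock_order_edges(trace)))  # True: a potential deadlock to verify by replay
```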

Relevance: 10.00%

Summary:

This paper presents a low-energy memory decoder architecture for ultra-low-voltage systems containing multiple voltage domains. Due to limitations in the scalability of memory supply voltages, these systems typically contain a core operating at subthreshold voltages and memories operating at a higher voltage. This difference in voltage provides timing slack on the memory path as the core supply is scaled. The paper analyzes the feasibility of, and the trade-offs in, utilizing this timing slack to operate a greater section of the memory decoder circuitry at the lower supply. A 256x16-bit SRAM interface has been designed in a UMC 65 nm low-leakage process to evaluate the above technique, with the core and memory operating at 280 mV and 500 mV, respectively. The technique provides a reduction of up to 20% in the energy/cycle of the row decoder without any penalty in area or system delay.

Relevance: 10.00%

Summary:

In this work, the hypothesis testing problem of spectrum sensing in a cognitive radio is formulated as a goodness-of-fit test against the general class of noise distributions used in most communications-related applications. A simple, general, and powerful spectrum sensing technique based on the number of weighted zero-crossings in the observations is proposed. For the cases of uniform and exponential weights, an expression for computing the near-optimal detection threshold that meets a given false alarm probability constraint is obtained. The proposed detector is shown to be robust to two commonly encountered types of noise uncertainty, namely, noise model uncertainty, where the PDF of the noise process is not completely known, and noise parameter uncertainty, where the parameters associated with the noise PDF are either partially or completely unknown. Simulation results validate our analysis and illustrate the performance benefits of the proposed technique relative to existing methods, especially in the low-SNR regime and in the presence of noise uncertainties.
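
As a minimal, hedged illustration of the idea (uniform weights only; the paper's weighted statistics and near-optimal threshold expressions are not reproduced), the sketch below counts sign changes and tests them against the crossing rate expected under zero-mean i.i.d. symmetric noise, where each consecutive-sample pair flips sign with probability 1/2.

```python
# Sketch of an unweighted zero-crossing-count detector. Under zero-mean i.i.d.
# symmetric noise the crossing count is approximately N((N-1)/2, (N-1)/4).
import numpy as np
from statistics import NormalDist

def zero_crossings(x):
    s = np.sign(x)
    s[s == 0] = 1                       # count exact zeros as positive
    return int(np.sum(s[:-1] != s[1:]))

def detect(x, pfa=0.01):
    n = len(x) - 1                      # number of consecutive-sample pairs
    mean, std = n / 2.0, np.sqrt(n / 4.0)
    z = (zero_crossings(x) - mean) / std
    # Two-sided test: a primary-user signal shifts the crossing rate away
    # from the noise-only value of 1/2 per pair.
    return abs(z) > NormalDist().inv_cdf(1 - pfa / 2)

rng = np.random.default_rng(0)
t = np.arange(4096)
noise = rng.standard_normal(t.size)
signal = 0.5 * np.sin(2 * np.pi * 0.01 * t) + noise   # weak sinusoid buried in noise
print(detect(noise), detect(signal))                   # typically: False True
```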

Relevance: 10.00%

Summary:

The power of X-ray crystal structure analysis as a technique is to 'see where the atoms are'. The results are extensively used by a wide variety of research communities. However, this 'seeing where the atoms are' can give a false sense of security unless the precision of the placement of the atoms has been taken into account. Indeed, the presentation of bond distances and angles to a false precision (i.e. to too many decimal places) is commonplace. This article has three themes. Firstly, a basis for a proper representation of protein crystal structure results is detailed and demonstrated with respect to analyses of Protein Data Bank entries. The basis for establishing the precision of placement of each atom in a protein crystal structure is non-trivial. Secondly, a knowledge base harnessing such a descriptor of precision is presented. It is applied here to the case of salt bridges, i.e. ion pairs, in protein structures; this is the most fundamental place to start with such structure-precision representations, since salt bridges are one of the tenets of protein structure stability. Ion pairs also play a central role in protein oligomerization, molecular recognition of ligands and substrates, allosteric regulation, domain motion and alpha-helix capping. A new knowledge base, SBPS (Salt Bridges in Protein Structures), takes these structural precisions into account and is the first of its kind. The third theme of the article is to indicate natural extensions of the need for such a description of precision, such as those involving metalloproteins and the determination of the protonation states of ionizable amino acids. Overall, it is noted that this work and these examples are also relevant to protein three-dimensional structure molecular graphics software.
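
As a purely illustrative aside (this is not the SBPS precision estimator, which as noted above is non-trivial to derive), the sketch below shows why quoting an interatomic distance to three decimal places can be false precision: assuming independent, isotropic coordinate uncertainties for the two atoms, the distance uncertainty is roughly their quadrature sum, and that sets how many decimals are meaningful. The coordinates and uncertainties below are hypothetical.

```python
# Illustration of false precision: round a reported distance to the precision
# implied by simple first-order error propagation (not the SBPS method).
import math

def distance_with_precision(a, b, sigma_a, sigma_b):
    d = math.dist(a, b)                       # separation in Angstroms
    sigma_d = math.hypot(sigma_a, sigma_b)    # quadrature sum of positional uncertainties
    decimals = max(0, -math.floor(math.log10(sigma_d)))
    return round(d, decimals), sigma_d

# Hypothetical salt-bridge pair: Lys NZ and Asp OD1 coordinates in Angstroms.
print(distance_with_precision((12.345, 8.210, 5.992), (14.102, 9.005, 7.310), 0.12, 0.15))
# -> roughly (2.3, 0.19): quoting this distance to three decimal places would be false precision.
```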

Relevance: 10.00%

Summary:

Breast cancer is one of the leading causes of cancer-related deaths in women, and early detection is crucial for reducing mortality rates. In this paper, we present a novel and fully automated approach based on tissue transition analysis for lesion detection in breast ultrasound images. Every candidate pixel is classified as belonging to the lesion boundary, lesion interior or normal tissue based on its descriptor value. The tissue transitions are modeled using a Markov chain to estimate the likelihood of a candidate lesion region. Experimental evaluation on a clinical dataset of 135 images shows that the proposed approach can achieve high sensitivity (95%) with a modest three false positives per image. The approach achieves very similar results (94% at three false positives) on a completely different clinical dataset of 159 images without retraining, highlighting its robustness.
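
A minimal sketch of the kind of Markov-chain scoring described above: a sequence of tissue labels through a candidate region is scored by a transition matrix, so lesion-like transition patterns (normal to boundary to interior and back) receive a higher likelihood than implausible ones. The states come from the abstract, but the transition probabilities, prior, and example sequences are illustrative assumptions, not the paper's trained model.

```python
# Toy Markov-chain likelihood of a tissue-label sequence along a candidate region.
import numpy as np

STATES = {'normal': 0, 'boundary': 1, 'interior': 2}
# Row = current state, column = next state (assumed toy probabilities).
TRANS = np.array([[0.90, 0.09, 0.01],
                  [0.10, 0.30, 0.60],
                  [0.02, 0.08, 0.90]])
PRIOR = np.array([0.80, 0.15, 0.05])

def log_likelihood(labels):
    idx = [STATES[lab] for lab in labels]
    ll = np.log(PRIOR[idx[0]])
    for a, b in zip(idx[:-1], idx[1:]):
        ll += np.log(TRANS[a, b])
    return ll

# A lesion-like transition pattern scores higher than an implausible one.
lesion_like = ['normal', 'boundary', 'interior', 'interior', 'boundary', 'normal']
implausible = ['normal', 'interior', 'normal', 'interior', 'normal', 'interior']
print(log_likelihood(lesion_like) > log_likelihood(implausible))  # True (for these toy numbers)
```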

Relevance: 10.00%

Summary:

Glycated hemoglobin (HbA1c) is a 'gold standard' biomarker for assessing the glycemic index of an individual. HbA1c is formed by nonenzymatic glycosylation at the N-terminal valine residue of the β-globin chain. Cation exchange based high performance liquid chromatography (CE-HPLC) is most often used to quantify HbA1c in blood samples. A few genetic variants of hemoglobin and post-translationally modified variants of hemoglobin interfere with CE-HPLC-based quantification, resulting in falsely high estimates. Using mass spectrometry, we analyzed a blood sample with an abnormally high HbA1c (52.1%) as measured by the CE-HPLC method. The observed HbA1c did not corroborate the patient's blood glucose level. A mass spectrometry based bottom-up proteomics approach, intact globin chain mass analysis, and chemical modification of the proteolytic peptides identified the presence of Hb Beckman, a genetic variant of hemoglobin, in the experimental sample. A similar surface area to charge ratio between HbA1c and Hb Beckman might have resulted in the coelution of the variant with HbA1c in CE-HPLC. Therefore, in the screening of diabetes mellitus through the estimation of HbA1c, it is important to look for genetic variants of hemoglobin in samples that show an abnormally high glycemic index, and HbA1c must then be estimated using an alternative method.

Relevance: 10.00%

Summary:

This study presents a comprehensive evaluation of five widely used multisatellite precipitation estimates (MPEs) against a 1° x 1° gridded rain gauge data set as ground truth over India. One decade of observations is used to assess the performance of the various MPEs (the Climate Prediction Center (CPC)-South Asia data set, the CPC Morphing Technique (CMORPH), Precipitation Estimation From Remotely Sensed Information Using Artificial Neural Networks, the Tropical Rainfall Measuring Mission's Multisatellite Precipitation Analysis (TMPA-3B42), and the Global Precipitation Climatology Project). All MPEs have high rain detection skill, with a large probability of detection (POD) and small "missing" values. However, the detection sensitivity differs from one product (and also one region) to another. While CMORPH has the lowest sensitivity for detecting rain, CPC shows the highest sensitivity and often overdetects rain, as evidenced by a large POD and false alarm ratio and small missing values. All MPEs show higher rain sensitivity over eastern India than western India. These differential sensitivities are found to alter the biases in rain amount differently. All MPEs show similar spatial patterns of seasonal rain bias and root-mean-square error, but their spatial variability across India is complex and pronounced. The MPEs overestimate rainfall over the dry regions (northwest and southeast India) and severely underestimate it over mountainous regions (the west coast and northeast India), whereas the bias is relatively small over the core monsoon zone. Higher occurrence of virga rain due to subcloud evaporation and the possible missing of small-scale convective events by gauges over the dry regions are the main reasons for the observed overestimation of rain by the MPEs. The decomposed components of the total bias show that the major part of the overestimation is due to false precipitation. The severe underestimation of rain along the west coast is attributed to the predominant occurrence of shallow rain and the underestimation of moderate to heavy rain by the MPEs. The decomposed components suggest that missed precipitation and hit bias are the leading error sources for the total bias along the west coast. All evaluation metrics are found to be nearly equal in the two contrasting monsoon seasons (southwest and northeast), indicating that the performance of the MPEs does not change with the season, at least over southeast India. Among the various MPEs, TMPA is found to perform better than the others, as it reproduces most of the spatial variability exhibited by the reference.
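
For reference, the sketch below computes the standard categorical scores mentioned above (probability of detection, false alarm ratio, and the "missing" or miss rate) from a rain/no-rain contingency table built on paired satellite and gauge grids. The 0.1 mm/day threshold and the toy arrays are assumptions, not the study's settings.

```python
# Categorical skill scores (POD, FAR, miss rate) from paired satellite/gauge grids.
import numpy as np

def categorical_scores(sat, gauge, threshold=0.1):
    """sat, gauge: rainfall arrays in mm/day on the same grid."""
    sat_rain, obs_rain = sat >= threshold, gauge >= threshold
    hits = np.sum(sat_rain & obs_rain)
    false_alarms = np.sum(sat_rain & ~obs_rain)
    misses = np.sum(~sat_rain & obs_rain)
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    miss_rate = misses / (hits + misses)         # the "missing" values in the text
    return pod, far, miss_rate

rng = np.random.default_rng(1)
gauge = rng.gamma(shape=0.5, scale=8.0, size=(40, 40))    # toy "ground truth" field
sat = gauge * rng.lognormal(0.0, 0.5, gauge.shape)        # toy satellite estimate with noise
print(categorical_scores(sat, gauge))
```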

Relevance: 10.00%

Summary:

Image inpainting is the process of filling in an unwanted region of an image marked by the user. It is used for restoring old paintings and photographs, removing red eyes from pictures, etc. In this paper, we propose an efficient inpainting algorithm which takes care of false edge propagation. We use the classical exemplar-based technique to find the priority term for each patch. To ensure that the nearest-neighbor patch, found by minimizing the L2 distance between patches, has similar edge content, we impose an additional constraint that the entropies of the patches be similar; the entropy of a patch acts as a good measure of its edge content. Additionally, we fill the image by considering overlapping patches to ensure smoothness in the output. We use the structural similarity index as the measure of similarity between the ground truth and the inpainted image. The results of the proposed approach on a number of real and synthetic images show the effectiveness of our algorithm in removing objects and thin scratches or text written on the image. It is also shown that the proposed approach is robust to the shape of the manually selected target. Our results compare favorably to those obtained by existing techniques.
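
A rough sketch of the entropy-constrained nearest-neighbor search described above: candidate source patches are first restricted to those whose entropy (a proxy for edge content) is close to the target patch's, and the L2-nearest of those is chosen. The patch size, entropy tolerance, and random toy patches are illustrative assumptions; the exemplar-based priority term and overlapping-patch filling are not shown.

```python
# Entropy-constrained patch matching: filter candidates by similar entropy, then pick min L2.
import numpy as np

def entropy(patch, bins=32):
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def best_match(target, candidates, ent_tol=0.5):
    t_ent = entropy(target)
    # Keep only candidates with similar edge content (similar entropy); fall back
    # to all candidates if the constraint filters everything out.
    keep = [c for c in candidates if abs(entropy(c) - t_ent) <= ent_tol] or candidates
    return min(keep, key=lambda c: np.sum((c - target) ** 2))

rng = np.random.default_rng(2)
target = np.clip(rng.normal(0.5, 0.2, (9, 9)), 0, 1)
candidates = [np.clip(rng.normal(0.5, 0.2, (9, 9)), 0, 1) for _ in range(200)]
match = best_match(target, candidates)
print(np.sum((match - target) ** 2))   # squared L2 distance of the chosen patch
```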

Relevance: 10.00%

Summary:

Analysis inspired by complex systems suggests the hypothesis that financial meltdowns are abrupt critical transitions that occur when the system reaches a tipping point. Theoretical and empirical studies of climatic and ecological dynamical systems have shown that the approach to a tipping point is preceded by a generic phenomenon called critical slowing down, i.e. an increasingly slow response of the system to perturbations. It has therefore been suggested that critical slowing down may be used as an early warning signal of imminent critical transitions. Whether financial markets exhibit critical slowing down prior to meltdowns remains unclear. Here, our analysis reveals that three major US markets (the Dow Jones Index, S&P 500 and NASDAQ) and two European markets (DAX and FTSE) did not exhibit critical slowing down prior to major financial crashes over the last century. However, all markets showed strong trends of rising variability, quantified by the time series variance and the spectral function at low frequencies, prior to crashes. These results suggest that financial crashes are not critical transitions that occur in the vicinity of a tipping point. Using a simple model, we argue that financial crashes are likely to be stochastic transitions which can occur even when the system is far away from the tipping point. Specifically, we show that a gradually increasing strength of stochastic perturbations may have caused abrupt transitions in the financial markets. Broadly, our results highlight the importance of stochastically driven abrupt transitions in real-world scenarios. Our study offers rising variability as a precursor of financial meltdowns, albeit with the limitation that it may signal false alarms.
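
A small sketch of the two indicators contrasted above, computed over a rolling window: variance (found to rise before crashes) and lag-1 autocorrelation (the critical-slowing-down signature, found to be absent). The synthetic return series with growing shock strength and the window length are illustrative assumptions, not market data.

```python
# Rolling variance and lag-1 autocorrelation as candidate early-warning indicators.
import numpy as np

def rolling_indicators(x, window=250):
    var, ac1 = [], []
    for i in range(window, len(x)):
        w = x[i - window:i]
        var.append(np.var(w))
        ac1.append(np.corrcoef(w[:-1], w[1:])[0, 1])
    return np.array(var), np.array(ac1)

rng = np.random.default_rng(3)
n = 2000
# Toy return series whose shock strength grows with time: rising variance,
# no added autocorrelation, mimicking the stochastic-transition scenario.
sigma = np.linspace(0.5, 2.0, n)
returns = sigma * rng.standard_normal(n)
var, ac1 = rolling_indicators(returns)
print(var[0] < var[-1], abs(ac1.mean()) < 0.1)   # variance trends up; autocorrelation stays near zero
```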

Relevance: 10.00%

Summary:

Quantum cellular automata (QCA) is a new technology at the nanometer scale and has been considered one of the alternatives to CMOS technology. In this paper, we describe the design and layout of a serial memory and a parallel memory, showing the layout of the individual memory cells. Assuming that we can fabricate cells separated by 10 nm, memory capacities of over 1.6 Gbit/cm2 can be achieved. Simulations of the proposed memories were carried out using QCADesigner, a layout and simulation tool for QCA. During the design, we tried to reduce the number of cells as well as the area, which is found to be 86.16 sq mm and 0.12 nm2 with the QCA-based memory cell. We also achieved an increase in efficiency of 40%. These circuits are the building blocks of nano processors and help us understand the nano devices of the future.

Relevance: 10.00%

Summary:

We propose clean localization microscopy (a variant of fPALM) using a molecule filtering technique. Localization imaging involves acquiring a large number of images containing single-molecule signatures, followed by one-to-one mapping to render a super-resolution image. In principle, this process can be repeated for other z-planes to construct a 3D image. However, single molecules observed from off-focal planes result in a false representation of their presence in the focal plane, leading to incorrect quantification and analysis. We overcome this with a single-molecule filtering technique that imposes constraints on the diffraction-limited spot size of single molecules in the image plane. Calibration with sub-diffraction-size beads puts a natural cutoff on the actual diffraction-limited size of single molecules in the focal plane. This helps in distinguishing molecules present in the focal plane from those in the off-focal planes, thereby providing an estimate of the single molecules in the focal plane. We study the distribution of actin (labeled with a photoactivatable CAGE 552 dye) in NIH 3T3 mouse fibroblast cells.
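
A schematic version of the spot-size filter described above: the width of each detected spot is estimated from its intensity-weighted second moments, a cutoff is set from in-focus calibration beads, and broader (defocused, off-focal-plane) spots are rejected. The pixel grid, spot widths, and cutoff rule are illustrative assumptions, not the paper's calibration.

```python
# Reject localizations whose fitted spot width exceeds an in-focus calibration cutoff.
import numpy as np

def fitted_sigma(spot):
    """Spot width (pixels) from intensity-weighted second moments."""
    y, x = np.indices(spot.shape)
    w = spot / spot.sum()
    cx, cy = np.sum(w * x), np.sum(w * y)
    return np.sqrt(np.sum(w * ((x - cx) ** 2 + (y - cy) ** 2)) / 2.0)

def gaussian_spot(shape, sigma):
    y, x = np.indices(shape)
    c = (shape[0] - 1) / 2.0
    return np.exp(-((x - c) ** 2 + (y - c) ** 2) / (2 * sigma ** 2))

# Calibration with beads in the focal plane sets the cutoff on acceptable spot width
# (a small bead-to-bead spread is assumed here).
rng = np.random.default_rng(4)
calib = [fitted_sigma(gaussian_spot((15, 15), s)) for s in rng.uniform(1.25, 1.35, 50)]
cutoff = np.mean(calib) + 2 * np.std(calib)

in_focus = gaussian_spot((15, 15), 1.30)     # width within the calibration cutoff: accepted
defocused = gaussian_spot((15, 15), 2.60)    # off-focal-plane molecule: broader spot, rejected
print(fitted_sigma(in_focus) <= cutoff, fitted_sigma(defocused) <= cutoff)   # True False
```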

Relevance: 10.00%

Summary:

In this paper, we propose a vision-based mobile robot localization strategy. Local scale-invariant features are used as natural landmarks in an unstructured and unmodified environment. The local characteristics of the features we use prove to be robust to occlusion and outliers. In addition, the invariance of the features to viewpoint change makes them suitable landmarks for mobile robot localization. Scale-invariant features detected in the first exploration are indexed into a location database. Indexing and voting allow efficient global localization. The localization result is verified by the epipolar geometry between the representative view in the database and the view to be localized, which decreases the probability of false localization. The localization system can recover the pose of the camera mounted on the robot by essential matrix decomposition; the position of the robot can then be computed easily. Both calibrated and uncalibrated cases are discussed, and relative position estimation based on a calibrated camera turns out to be the better choice. Experimental results show that our approach is effective and reliable under illumination changes, similarity transformations and extraneous features.
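
As a sketch of the calibrated case above (assuming OpenCV is available), the snippet below recovers the relative camera pose from matched points by estimating and decomposing the essential matrix. Synthetic correspondences stand in for real scale-invariant feature matches, and the intrinsics and motion are made up for illustration.

```python
# Relative pose from matched points via essential matrix estimation and decomposition.
import cv2
import numpy as np

rng = np.random.default_rng(5)
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])  # assumed intrinsics

# Random 3D landmarks in front of the first camera.
pts3d = np.column_stack([rng.uniform(-2, 2, 100), rng.uniform(-2, 2, 100), rng.uniform(4, 8, 100)])

def project(pts, R, t):
    cam = (R @ pts.T + t.reshape(3, 1)).T
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:]

# Ground-truth relative motion between the stored view and the current view.
angle = np.deg2rad(10)
R_true = np.array([[np.cos(angle), 0, np.sin(angle)], [0, 1, 0], [-np.sin(angle), 0, np.cos(angle)]])
t_true = np.array([0.5, 0.0, 0.1])

pts1 = project(pts3d, np.eye(3), np.zeros(3)).astype(np.float64)
pts2 = project(pts3d, R_true, t_true).astype(np.float64)

E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
# Translation is recovered only up to scale, hence the comparison to the unit vector.
print(np.allclose(R, R_true, atol=1e-2), np.allclose(t.ravel(), t_true / np.linalg.norm(t_true), atol=1e-2))
```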

Relevance: 10.00%

Summary:

What images of Egypt do we find in the Hebrew Bible, apart from the exodus? The biblical texts look to their neighbor Egypt as a country of refuge (from famine or persecution). But, being large and powerful, Egypt is also a supplier of military armament. It can become a false security for a people who must trust in Yahweh alone. The wisdom literature is more benevolent: in its description of the luxurious bed to which a shameless woman lures her lover, Proverbs 7 offers an indirect homage to Egyptian wealth.

Relevance: 10.00%

Summary:

Receptor-based detection of pathogens often suffers from non-specific interactions, and as most detection techniques cannot distinguish between the affinities of interactions, false positive responses remain a plaguing reality. Here, we report an anharmonic acoustic method of detection that addresses the inherent weakness of current ligand-dependent assays. Spores of Bacillus subtilis (a Bacillus anthracis simulant) were immobilized on a thickness-shear-mode AT-cut quartz crystal functionalized with anti-spore antibody, and the sensor was driven by a pure sinusoidal oscillation at increasing amplitude. Biomolecular interaction forces between the coupled spores and the accelerating surface caused a nonlinear modulation of the acoustic response of the crystal. In particular, the deviation in the third harmonic of the transduced electrical response versus the oscillation amplitude of the sensor (the signal) was found to be significant. Signals from the specifically bound spores were clearly distinguishable in shape from those of physisorbed streptavidin-coated polystyrene microbeads. The analytical model presented here enables estimation of the biomolecular interaction forces from the measured response. Thus, probing biomolecular interaction forces using the described technique can quantitatively detect pathogens and distinguish specific from non-specific interactions, with potential applicability to rapid point-of-care detection. It also serves as a potential tool for rapid force spectroscopy, affinity-based biomolecular screening and mapping of molecular interaction networks.
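
A toy sketch of the read-out idea above: the sensor is driven by a pure sinusoid at increasing amplitude and the third-harmonic content of the transduced response is extracted by a lock-in style projection. The cubic nonlinearity standing in for the spore-surface interaction is an illustrative assumption, not the paper's analytical model.

```python
# Track the third-harmonic amplitude of a (toy) weakly nonlinear sensor response
# as the drive amplitude increases.
import numpy as np

fs, f0, n = 100_000.0, 1_000.0, 10_000        # sample rate, drive frequency, samples
t = np.arange(n) / fs

def third_harmonic_amplitude(y):
    # Lock-in style projection of the response onto sine/cosine at 3*f0.
    ref_c, ref_s = np.cos(2 * np.pi * 3 * f0 * t), np.sin(2 * np.pi * 3 * f0 * t)
    return 2 * np.hypot(np.mean(y * ref_c), np.mean(y * ref_s))

def sensor_response(amplitude, nonlinearity=0.05):
    drive = amplitude * np.sin(2 * np.pi * f0 * t)
    return drive + nonlinearity * drive ** 3   # toy anharmonic term from bound spores

for a in (0.2, 0.5, 1.0, 2.0):
    # Third-harmonic content grows as amplitude**3 for this cubic toy model.
    print(a, third_harmonic_amplitude(sensor_response(a)))
```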

Relevance: 10.00%

Summary:

Shortly after earning his doctorate in jurisprudence from the Universidad de Buenos Aires, Estanislao S. Zeballos, who despite his youth had already carried out various cultural initiatives and practiced combative journalism, joined the revolution led by General Bartolomé Mitre in 1874 to oppose the inauguration of President Nicolás Avellaneda. He was appointed captain and secretary to the commander-in-chief. During the campaign he wrote an account of his childhood days in the city of Rosario and of his entry into the Colegio Nacional, as well as a sketch of the political climate preceding Avellaneda's election as President of the Republic.