935 results for Memory-based


Relevance: 30.00%

Publisher:

Abstract:

Measurements of the entropy change at the martensitic transition of two composition-related sets of Cu-Al-Mn shape-memory alloys are reported. It is found that most of the entropy change has a vibrational origin, and depends only on the particular close-packed structure of the low-temperature phase. Using data from the literature for other Cu-based alloys, this result is shown to be general. In addition, it is shown that the martensitic structure changes from 18R to 2H when the ratio of conduction electrons per atom reaches the same value as at the eutectoid point in the equilibrium phase diagram. This finding indicates that the structure of the metastable low-temperature phase is reminiscent of the equilibrium structure.
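For orientation (these expressions are standard estimates, not taken from the abstract), the vibrational part of the transition entropy is usually assessed in the high-temperature Debye limit from the Debye temperatures of the two phases, and the conduction-electron concentration is the Hume-Rothery electron-per-atom ratio; a sketch with assumed symbols:

```latex
% Vibrational entropy change from the parent (\beta) phase to the
% martensite (M) in the high-temperature Debye approximation:
\Delta S_{\mathrm{vib}} = S_M - S_\beta \simeq 3R \,\ln\!\left(\frac{\Theta_\beta}{\Theta_M}\right)

% Conduction-electron concentration (electrons per atom) for an alloy
% with atomic fractions c_i and nominal valences z_i:
\frac{e}{a} = \sum_i c_i \, z_i
```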

Relevance: 30.00%

Publisher:

Abstract:

Cooperative caching in mobile ad hoc networks aims at improving the efficiency of information access by reducing access latency and bandwidth usage. The cache replacement policy plays a vital role in improving the performance of a cache in a mobile node, since the node has limited memory. In this paper we propose a new key-based cache replacement policy, called E-LRU, for cooperative caching in ad hoc networks. The proposed replacement scheme considers the time interval between recent references, size and consistency as the key factors for replacement. A simulation study shows that the proposed replacement policy can significantly improve cache performance in terms of cache hit ratio and query delay.
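A minimal sketch of a key-based replacement policy in the spirit described, combining inter-reference interval, size and consistency into a single eviction decision; the exact E-LRU scoring is not given in the abstract, so the field names and the lexicographic ordering below are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class CacheEntry:
    key: str
    size: int                                     # bytes held in the node's cache
    ttl: float                                    # remaining consistency lifetime (s)
    accesses: list = field(default_factory=list)  # recent access timestamps

    def inter_reference_interval(self):
        """Time between the two most recent references (large = cold item)."""
        if len(self.accesses) < 2:
            return float("inf")
        return self.accesses[-1] - self.accesses[-2]

def evict_candidate(cache):
    """Choose a victim: coldest entry first, then the largest, then the one
    closest to consistency expiry."""
    return max(cache.values(),
               key=lambda e: (e.inter_reference_interval(), e.size, -e.ttl))
```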

Relevance: 30.00%

Publisher:

Abstract:

Metglas 2826 MB, with a nominal composition of Fe40Ni38Mo4B18, is an excellent soft magnetic material and finds application in sensors and memory heads. However, thin-film forms of Fe40Ni38Mo4B18 are seldom studied, although they are important in micro-electro-mechanical-systems/nano-electro-mechanical-systems devices. The stoichiometry of the film plays a vital role in determining the structural and magnetic properties of Fe40Ni38Mo4B18 thin films, and retaining the composition in thin films is a challenge. Thin films of 52 nm thickness were fabricated by the RF sputtering technique on silicon substrates from a target of nominal composition Fe40Ni38Mo4B18. The films were annealed at temperatures of 400 °C and 600 °C. Microstructural studies of the films using a glancing x-ray diffractometer (GXRD) and a transmission electron microscope (TEM) revealed that the pristine films are crystalline, with a (FeNiMo)23B6 phase. Atomic force microscope (AFM) images were subjected to power spectral density analysis to understand the probable surface evolution mechanism during sputtering and annealing. X-ray photoelectron spectroscopy (XPS) was employed to determine the film composition. The sluggish growth of crystallites with annealing is attributed to the presence of molybdenum in the thin film. The observed changes in magnetic properties were correlated with annealing-induced structural, compositional and morphological changes.

Relevance: 30.00%

Publisher:

Abstract:

Cancer treatment is most effective when the disease is detected early, and progress in treatment will be closely related to the ability to reduce the proportion of misses in the cancer detection task. The effectiveness of algorithms for detecting cancers can be greatly increased if these algorithms work synergistically with those for characterizing normal mammograms. This research work combines computerized image analysis techniques and neural networks to separate out some fraction of the normal mammograms with extremely high reliability, based on normal tissue identification and removal. The presence of clustered microcalcifications is one of the most important, and sometimes the only, sign of cancer on a mammogram; 60% to 70% of non-palpable breast carcinomas demonstrate microcalcifications on mammograms [44], [45], [46]. Wavelet transform (WT) based techniques are applied to the remaining, possibly abnormal, mammograms to detect possible microcalcifications. The goal of this work is to improve the detection performance and throughput of screening mammography, thus providing a ‘second opinion’ to the radiologists. The state-of-the-art DWT computation algorithms are not suitable for practical applications with memory and delay constraints, since the DWT is not a block transform. Hence, in this work the development of a Block DWT (BDWT) computational structure with a low processing memory requirement has also been taken up.
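The thesis' BDWT structure itself is not reproduced here; the sketch below only illustrates the block-wise idea, tiling a mammogram and transforming each tile independently so that only one tile needs to be resident in memory at a time. PyWavelets is used as a stand-in for the thesis' transform, and the tile size, wavelet and the crude detail-energy flagging step are assumptions:

```python
import numpy as np
import pywt   # PyWavelets

def blockwise_dwt(image, block=64, wavelet="db4"):
    """Single-level 2-D DWT applied tile by tile, so only one
    `block x block` tile is processed (and held) at a time."""
    h, w = image.shape
    coeffs = {}
    for r in range(0, h, block):
        for c in range(0, w, block):
            tile = image[r:r + block, c:c + block]
            coeffs[(r, c)] = pywt.dwt2(tile, wavelet)   # (cA, (cH, cV, cD))
    return coeffs

def high_detail_tiles(coeffs, factor=3.0):
    """Flag tiles with unusually high detail energy, a crude stand-in for
    the microcalcification detection stage."""
    energy = {k: sum(np.sum(b ** 2) for b in details)
              for k, (cA, details) in coeffs.items()}
    threshold = factor * np.median(list(energy.values()))
    return [k for k, e in energy.items() if e > threshold]
```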

Relevance: 30.00%

Publisher:

Abstract:

Bank switching in embedded processors with a partitioned memory architecture results in code size as well as run-time overhead. An algorithm, and its application to assist the compiler in eliminating the redundant bank switching code introduced and in deciding the optimum data allocation to banked memory, is presented in this work. A relation matrix formed for the memory bank state transition corresponding to each bank selection instruction is used for the detection of redundant code. Data allocation to memory is done by considering all possible permutations of memory banks and combinations of data. The compiler output corresponding to each data mapping scheme is subjected to a static machine code analysis, which identifies the one with the minimum number of bank switching instructions. Even though the method is compiler independent, the algorithm utilizes certain architectural features of the target processor. A prototype based on PIC 16F87X microcontrollers is described. The method scales well to larger numbers of memory blocks and to other architectures, so that high-performance compilers can integrate this technique for efficient code generation. The technique is illustrated with an example.
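A simplified sketch of the redundant bank-select elimination idea on a straight-line code fragment: track the bank state implied by each bank selection instruction and drop instructions that re-select the already active bank. The textual instruction encoding and the `BANKSEL` mnemonic are assumptions; the paper's relation-matrix formulation and its handling of control flow are not reproduced here:

```python
def eliminate_redundant_banksel(instructions):
    """Remove bank-select instructions that re-select the active bank.

    `instructions` is a list of strings such as "BANKSEL 1" or "MOVF x".
    Only straight-line code is handled; a real pass must reset the tracked
    state at labels and branch targets (or use a relation-matrix approach).
    """
    current_bank = None
    optimized = []
    for insn in instructions:
        parts = insn.split()
        if parts[0] == "BANKSEL":
            bank = int(parts[1])
            if bank == current_bank:
                continue                 # redundant: bank already selected
            current_bank = bank
        optimized.append(insn)
    return optimized

code = ["BANKSEL 1", "MOVF x", "BANKSEL 1", "MOVF y", "BANKSEL 0", "MOVF z"]
print(eliminate_redundant_banksel(code))
# ['BANKSEL 1', 'MOVF x', 'MOVF y', 'BANKSEL 0', 'MOVF z']
```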

Relevance: 30.00%

Publisher:

Abstract:

The thesis explores the area of still image compression. Image compression techniques can be broadly classified into lossless and lossy compression. The most common lossy compression techniques are based on transform coding, vector quantization and fractals. Transform coding is the simplest of these and generally employs reversible transforms such as the DCT and DWT. The Mapped Real Transform (MRT) is an evolving integer transform based on real additions alone. The present research work aims at developing new image compression techniques based on MRT. Most transform coding techniques employ fixed block size image segmentation, usually 8×8. Hence, a fixed block size transform coder is implemented using MRT, and its merits and demerits are analysed for both 8×8 and 4×4 blocks. The N² unique MRT coefficients for each block are computed using templates. Considering the merits and demerits of fixed block size transform coding, a hybrid form of these techniques is implemented to improve compression performance. The performance of the hybrid coder is found to be better than that of the fixed block size coders. Thus, if the block size is made adaptive, the performance can be further improved. In adaptive block size coding, the block size may vary from the size of the image down to 2×2; hence, the computation of MRT using templates is impractical due to memory requirements. So, an adaptive transform coder based on the Unique MRT (UMRT), a compact form of MRT, is implemented to obtain better performance in terms of PSNR and HVS. The suitability of MRT for vector quantization of images is then investigated, and a UMRT-based Classified Vector Quantization (CVQ) scheme is implemented. The edges in the images are identified and classified by employing a UMRT-based criterion. Based on the above experiments, a new technique named “MRT based Adaptive Transform Coder with Classified Vector Quantization (MATC-CVQ)” is developed. Its performance is evaluated and compared against existing techniques. A comparison with standard JPEG and the well-known Shapiro's Embedded Zerotree Wavelet (EZW) coder shows that the proposed technique gives better performance for the majority of images.
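MRT/UMRT implementations are specific to the thesis and are not reproduced here; the sketch below only shows the generic fixed 8×8 block transform-coding skeleton the abstract starts from, with an orthonormal DCT standing in for the transform and a hard coefficient threshold standing in for quantisation:

```python
import numpy as np
from scipy.fft import dctn, idctn

def block_transform_code(image, block=8, keep_ratio=0.1):
    """Fixed-block-size transform coding skeleton: transform each block,
    keep only the largest-magnitude coefficients, inverse transform."""
    h, w = image.shape
    out = np.zeros_like(image, dtype=float)
    for r in range(0, h, block):
        for c in range(0, w, block):
            tile = image[r:r + block, c:c + block].astype(float)
            coeff = dctn(tile, norm="ortho")
            k = max(1, int(keep_ratio * coeff.size))
            thresh = np.sort(np.abs(coeff), axis=None)[-k]
            coeff[np.abs(coeff) < thresh] = 0.0
            out[r:r + block, c:c + block] = idctn(coeff, norm="ortho")
    return out

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio used to compare coder outputs."""
    mse = np.mean((original.astype(float) - reconstructed) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)
```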

Relevance: 30.00%

Publisher:

Abstract:

We present a type-based approach to statically derive symbolic closed-form formulae that characterize the bounds on the heap memory usage of programs written in object-oriented languages. Given a program with size and alias annotations, our inference system computes the amount of memory required by each method to execute successfully, as well as the amount of memory released when the method returns. The obtained analysis results are useful for networked devices with limited computational resources as well as for embedded software.
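As a toy illustration of the kind of closed-form bound such an analysis derives (the method name and node size below are hypothetical, not taken from the paper):

```latex
% For a hypothetical method copyList(xs) that builds a fresh singly
% linked list of the same length as its argument:
\mathit{mem\_required}(\mathit{copyList}, n) = n \cdot \mathit{size}(\mathit{Node}),
\qquad
\mathit{mem\_released}(\mathit{copyList}, n) = 0
% where n = |xs|; the net heap effect of a call is the difference
% between the required and the released amounts.
```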

Relevance: 30.00%

Publisher:

Abstract:

This paper addresses the application of PCA to categorical data prior to diagnosing a patient data set using a Case-Based Reasoning (CBR) system. The particularity is that standard PCA techniques are designed to deal with numerical attributes, but our medical data set contains many categorical attributes, and alternative methods such as RS-PCA are required. Thus, we propose to hybridize RS-PCA (Regular Simplex PCA) and a simple CBR system. Results show that the hybrid system produces results similar to those obtained using the original attributes when diagnosing a medical data set. These results are quite promising, since they allow diagnosis with less computational effort and memory storage.
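RS-PCA itself is not a standard library routine; as a stand-in, the sketch below one-hot encodes the categorical attributes, projects them with ordinary PCA, and retrieves the nearest stored case, which is the simple CBR retrieval step the paper hybridises. The library choices and parameters are assumptions:

```python
from sklearn.decomposition import PCA
from sklearn.preprocessing import OneHotEncoder
from sklearn.neighbors import NearestNeighbors

def build_cbr(case_attributes, case_diagnoses, n_components=5):
    """Encode categorical cases, project them with PCA and index them
    for nearest-neighbour (1-NN) retrieval."""
    enc = OneHotEncoder(handle_unknown="ignore")
    X = enc.fit_transform(case_attributes).toarray()
    pca = PCA(n_components=min(n_components, min(X.shape))).fit(X)
    index = NearestNeighbors(n_neighbors=1).fit(pca.transform(X))
    return enc, pca, index, list(case_diagnoses)

def diagnose(new_case, enc, pca, index, diagnoses):
    """Return the diagnosis of the closest stored case in the PCA space."""
    z = pca.transform(enc.transform([new_case]).toarray())
    _, nearest = index.kneighbors(z)
    return diagnoses[nearest[0][0]]
```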

Relevance: 30.00%

Publisher:

Abstract:

Circuit testing is a phase of the production process that becomes increasingly important when a new product is developed. Test and diagnosis techniques for digital circuits have been successfully developed and automated, whereas this is not yet the case for analogue circuits. Among all the methods proposed for diagnosing analogue circuits, the most widely used are fault dictionaries. This thesis describes several of them, analysing their advantages and drawbacks. In recent years, Artificial Intelligence techniques have become one of the most important research fields for fault diagnosis. This thesis develops two such techniques in order to cover some of the shortcomings of fault dictionaries. The first proposal is based on building a fuzzy system as an identification tool. The results obtained are quite good, since the fault is located in a high percentage of cases. On the other hand, the success rate is not good enough when, in addition, the deviation has to be determined. Since fault dictionaries can be seen as a simplified approximation to Case-Based Reasoning (CBR), the second proposal extends fault dictionaries towards a CBR system. The purpose is not to give a general solution to the problem but to contribute a new methodology, which consists of improving the diagnosis of fault dictionaries by adding and adapting new cases so that they become a Case-Based Reasoning system. The structure of the case base is described, together with the retrieval, reuse, revision and retention tasks, with emphasis on the learning process. Throughout the text several circuits are used to illustrate the test methods described, but in particular the biquadratic filter is used to validate the proposed methodologies, since it is one of the benchmarks proposed in the analogue-circuit context. The faults considered are parametric, permanent, independent and simple, although the methodology can easily be extrapolated to the diagnosis of multiple and catastrophic faults. The method focuses on testing the passive components, although it could also be extended to faults in the active ones.
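For orientation (this is not the thesis' fuzzy or CBR system), a fault dictionary in its simplest form is just a table of pre-simulated measurement signatures matched by nearest distance; the component names and signature values below are made up:

```python
import numpy as np

# Pre-simulated signatures (e.g. gains at a few test frequencies) for the
# fault-free circuit and for each single parametric fault considered.
fault_dictionary = {
    "fault-free": np.array([1.00, 0.71, 0.10]),
    "R1 +50%":    np.array([0.80, 0.65, 0.12]),
    "C2 -50%":    np.array([1.05, 0.40, 0.25]),
}

def diagnose(measured_signature):
    """Return the dictionary entry closest (Euclidean) to the measurement."""
    return min(fault_dictionary,
               key=lambda fault: np.linalg.norm(fault_dictionary[fault]
                                                - measured_signature))

print(diagnose(np.array([0.82, 0.66, 0.11])))   # -> "R1 +50%"
```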

Relevance: 30.00%

Publisher:

Abstract:

A study of the formation and propagation of volume anomalies in North Atlantic Mode Waters is presented, based on 100 yr of monthly mean fields taken from the control run of the Third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3). Analysis of the temporal and spatial variability in the thickness between pairs of isothermal surfaces bounding the central temperature of the three main North Atlantic subtropical mode waters shows that large-scale variability in formation occurs over time scales ranging from 5 to 20 yr. The largest formation anomalies are associated with a southward shift in the mixed layer isothermal distribution, possibly due to changes in the gyre dynamics and/or changes in the overlying wind field and air-sea heat fluxes. The persistence of these anomalies is shown to result from their subduction beneath the winter mixed layer base, where they recirculate around the subtropical gyre in the background geostrophic flow. Anomalies in the warmest mode (18 degrees C) formed on the western side of the basin persist for up to 5 yr. They are removed by mixing transformation to warmer classes and are returned to the seasonal mixed layer near the Gulf Stream, where the stored heat may be released to the atmosphere. Anomalies in the cooler modes (16 degrees and 14 degrees C) formed on the eastern side of the basin persist for up to 10 yr. There is no clear evidence of significant transformation of these cooler mode anomalies to adjacent classes. It has been proposed that the eastern anomalies are removed through a tropical-subtropical water mass exchange mechanism beneath the trade wind belt (south of 20 degrees N). The analysis shows that anomalous mode water formation plays a key role in the long-term storage of heat in the model, and that the release of heat associated with these anomalies suggests a predictable climate feedback mechanism.
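As a hedged illustration of the basic diagnostic described (not HadCM3-specific code), the thickness between two isothermal surfaces can be estimated on a model column by summing the layer thicknesses whose temperature falls inside the bounding isotherms; the variable names and array layout are assumptions:

```python
import numpy as np

def mode_water_thickness(temperature, layer_dz, t_low, t_high):
    """Thickness of the water column whose temperature lies between the
    two bounding isotherms (e.g. 17-19 degrees C for the 18-degree mode).

    temperature : array (nz, ny, nx) of potential temperature
    layer_dz    : array (nz,) of model layer thicknesses in metres
    """
    inside = (temperature >= t_low) & (temperature < t_high)
    return np.sum(inside * layer_dz[:, None, None], axis=0)

# Volume/thickness anomalies are then departures of this field from its
# long-term monthly mean over the 100-yr time series.
```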

Relevance: 30.00%

Publisher:

Abstract:

This article reviews current technological developments, particularly Peer-to-Peer technologies and Distributed Data Systems, and their value to community memory projects, particularly those concerned with the preservation of the cultural, literary and administrative data of cultures which have suffered genocide or are at risk of genocide. It draws attention to the comparatively good representation online of genocide denial groups and changes in the technological strategies of holocaust denial and other far-right groups. It draws on the author's work in providing IT support for a UK-based Non-Governmental Organization providing support for survivors of genocide in Rwanda.

Relevance: 30.00%

Publisher:

Abstract:

Investigations of memory deficits in older individuals have concentrated on their increased likelihood of forgetting events or details of events that were actually encountered (errors of omission). However, mounting evidence demonstrates that normal cognitive aging is also associated with an increased propensity for errors of commission, shown in false alarms or false recognition. The present study examined the origins of this age difference. Older and younger adults each performed three types of memory tasks in which details of encountered items might influence performance. Although older adults showed greater false recognition of related lures on a standard (identical) old/new episodic recognition task, older and younger adults showed parallel effects of detail on repetition priming and meaning-based episodic recognition (decreased priming and decreased meaning-based recognition for different relative to same exemplars). The results suggest that the older adults encoded details but used them less effectively than the younger adults in the recognition context requiring their deliberate, controlled use.

Relevance: 30.00%

Publisher:

Abstract:

The modelling of nonlinear stochastic dynamical processes from data involves solving the problems of data gathering, preprocessing, model architecture selection, learning or adaptation, parametric evaluation and model validation. For a given model architecture such as associative memory networks, a common problem in nonlinear modelling is "the curse of dimensionality". A series of complementary data-based constructive identification schemes, mainly based on, but not limited to, operating-point-dependent fuzzy models, are introduced in this paper with the aim of overcoming the curse of dimensionality. These include (i) a mixture-of-experts algorithm based on a forward constrained regression algorithm; (ii) an inherently parsimonious Delaunay input-space partition based piecewise local linear modelling concept; (iii) a neurofuzzy model constructive approach based on forward orthogonal least squares and optimal experimental design; and finally (iv) a neurofuzzy model construction algorithm based on basis functions that are Bézier-Bernstein polynomial functions and on the additive decomposition. Illustrative examples demonstrate their applicability, showing that the final major hurdle in data-based modelling has almost been removed.
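Of the four schemes listed, only the basic ingredient of the last one is sketched here: Bézier-Bernstein polynomial basis functions of a single input, which form a partition of unity and can therefore act as fuzzy membership functions in an additive model. The degree and the univariate setting are assumptions made for illustration:

```python
import numpy as np
from math import comb

def bernstein_basis(x, degree=3):
    """Evaluate the Bernstein basis B_{i,n}(x), i = 0..n, for inputs x
    normalised to [0, 1]. The basis sums to one at every x, so the
    functions can serve as fuzzy membership functions."""
    x = np.asarray(x, dtype=float)
    return np.stack([comb(degree, i) * x**i * (1 - x)**(degree - i)
                     for i in range(degree + 1)], axis=-1)

def additive_model(x, weights):
    """One-input neurofuzzy-style model: a weighted sum of the basis."""
    return bernstein_basis(x, degree=len(weights) - 1) @ np.asarray(weights)
```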

Relevance: 30.00%

Publisher:

Abstract:

The current study investigated the influence of encoding modality and cue-action relatedness on prospective memory (PM) performance in young and older adults using a modified version of the Virtual Week task. Participants encoded regular and irregular intentions either verbally or by physically performing the action during encoding. For half of the intentions there was a close semantic relation between the retrieval cue and the intended action, while for the remaining intentions the cue and action were semantically unrelated. For irregular tasks, both age groups showed superior PM for related intentions compared to unrelated intentions in both encoding conditions. While older adults retrieved fewer irregular intentions than young adults after verbal encoding, there was no age difference following enactment. Possible mechanisms of enactment and relatedness effects are discussed in the context of current theories of event-based PM.

Relevance: 30.00%

Publisher:

Abstract:

The plethora, and mass take-up, of digital communication technologies has resulted in a wealth of interest in social network data collection and analysis in recent years. Within many such networks the interactions are transient: thus those networks evolve over time. In this paper we introduce a class of models for such networks using evolving graphs with memory-dependent edges, which may appear and disappear according to their recent history. We consider time-discrete and time-continuous variants of the model, and we consider the long-term asymptotic behaviour as a function of parameters controlling the memory dependence. In particular we show that such networks may continue evolving forever, or else may quench and become static (containing immortal and/or extinct edges). This depends on the existence or otherwise of certain infinite products and series involving age-dependent model parameters. To test these ideas we show how model parameters may be calibrated based on limited samples of time-dependent data, and we apply these concepts to three real networks: summary data on mobile phone use from a developing region; online social-business network data from China; and disaggregated mobile phone communications data from a reality-mining experiment in the US. In each case we show that there is evidence for memory-dependent dynamics, such as that embodied within the class of models proposed here.
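A minimal discrete-time simulation in the spirit of the model class described: each existing edge survives to the next step with an age-dependent probability, and each absent edge appears with a fixed birth probability. The particular functional form of the age dependence (and all parameter values) is an assumption used only for illustration:

```python
import random

def simulate(n_nodes=20, steps=200, p_birth=0.01, alpha=0.9, seed=0):
    """Evolving graph with memory-dependent edges: an edge of age k
    survives one more step with probability alpha**k, so under this
    illustrative choice older edges are more fragile."""
    rng = random.Random(seed)
    age = {}                                   # (i, j) -> current edge age
    pairs = [(i, j) for i in range(n_nodes) for j in range(i + 1, n_nodes)]
    for _ in range(steps):
        nxt = {}
        for pair in pairs:
            if pair in age:                    # existing edge: survive or die
                if rng.random() < alpha ** age[pair]:
                    nxt[pair] = age[pair] + 1
            elif rng.random() < p_birth:       # absent edge: may appear
                nxt[pair] = 1
        age = nxt
    return age

print(len(simulate()), "edges alive after 200 steps")
```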