832 results for accuracy analysis


Relevância:

30.00%

Resumo:

Many existing encrypted Internet protocols leak information through packet sizes and timing. Though seemingly innocuous, prior work has shown that such leakage can be used to recover part or all of the plaintext being encrypted. The prevalence of encrypted protocols as the underpinning of such critical services as e-commerce, remote login, and anonymity networks, and the increasing feasibility of attacks on these services, represent a considerable risk to communications security. Existing mechanisms for preventing traffic analysis focus on re-routing and padding. These prevention techniques have considerable resource and overhead requirements. Furthermore, padding is easily detectable and, in some cases, can introduce its own vulnerabilities. To address these shortcomings, we propose embedding real traffic in synthetically generated encrypted cover traffic. Novel to our approach is the use of realistic network protocol behavior models to generate cover traffic. The observable traffic we generate also has the benefit of being indistinguishable from other real encrypted traffic, further thwarting an adversary's ability to target attacks. In this dissertation, we introduce the design of a proxy system called TrafficMimic that implements realistic cover traffic tunneling and can be used alone or integrated with the Tor anonymity system. We describe the cover traffic generation process, including the subtleties of implementing a secure traffic generator. We show that TrafficMimic cover traffic can fool a complex protocol classification attack with 91% of the accuracy of real traffic. TrafficMimic cover traffic is also not detected by a binary classification attack specifically designed to detect TrafficMimic. We evaluate the performance of tunneling with independent cover traffic models and find that they are comparable to, and in some cases more efficient than, generic constant-rate defenses.
We then use simulation and analytic modeling to understand the performance of cover traffic tunneling more deeply. We find that we can take measurements from real or simulated traffic with no tunneling and use them to estimate parameters for an accurate analytic model of the performance impact of cover traffic tunneling. Once validated, we use this model to better understand how delay, bandwidth, tunnel slowdown, and stability affect cover traffic tunneling. Finally, we take the insights from our simulation study and develop several biasing techniques that we can use to match the cover traffic to the real traffic while simultaneously bounding external information leakage. We study these bias methods using simulation and evaluate their security using a Bayesian inference attack. We find that we can safely improve performance with biasing while preventing both traffic analysis and defense detection attacks. We then apply these biasing methods to the real TrafficMimic implementation and evaluate it on the Internet. We find that biasing can provide 3-5x improvement in bandwidth for bulk transfers and 2.5-9.5x speedup for Web browsing over tunneling without biasing.
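As a minimal illustration of the idea behind model-driven cover traffic (not TrafficMimic's actual implementation), synthetic packets can be drawn from an empirical size distribution together with an exponential inter-arrival model; every distribution and constant below is invented for the example:

```python
import random

# Hypothetical empirical model of a target protocol: packet sizes (bytes)
# with observed probabilities, plus exponential inter-arrival times.
SIZE_DIST = [(52, 0.40), (576, 0.35), (1500, 0.25)]
MEAN_GAP_S = 0.02  # mean inter-arrival time, seconds

def next_cover_packet(rng: random.Random):
    """Draw one (size, delay) pair for the synthetic cover stream."""
    r, acc = rng.random(), 0.0
    for size, p in SIZE_DIST:
        acc += p
        if r <= acc:
            break
    delay = rng.expovariate(1.0 / MEAN_GAP_S)
    return size, delay

rng = random.Random(42)
stream = [next_cover_packet(rng) for _ in range(1000)]
avg_size = sum(s for s, _ in stream) / len(stream)
print(f"average cover packet size: {avg_size:.0f} bytes")
```

A real generator must additionally model higher-order structure (bursts, request/response alternation), which is the hard part the dissertation addresses.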

Relevância:

30.00%

Resumo:

It has been proposed that long-term consumption of diets rich in non-digestible carbohydrates (NDCs), such as cereals, fruit, and vegetables, might protect against several chronic diseases; however, it has been difficult to fully establish their impact on health in epidemiological studies. The wide range of properties of the different NDCs may dilute their impact when they are combined into one category for statistical comparisons in correlation or multivariate analyses. Several mechanisms have been suggested to explain the protective effects of NDCs, including increased stool bulk, dilution of carcinogens in the colonic lumen, reduced transit time, lowered pH, and bacterial fermentation to short-chain fatty acids (SCFA) in the colon. However, it is very difficult to measure SCFA in humans in vivo with any accuracy, so epidemiological studies on the impact of SCFA are not feasible. Most studies use dietary fibre (DF) or non-starch polysaccharide (NSP) intake to estimate the levels, but not all fibres or NSP are equally fermentable.
The first aim of this thesis was the development of equations to estimate the amount of fermentable carbohydrate (FC) that reaches the human colon and is fermented fully to SCFA by the colonic bacteria. Several studies were therefore examined for evidence to determine the percentage of each type of NDC that should be included in the final model, based on how much of that NDC enters the colon intact and to what extent it is fermented to SCFA in vivo. Our model equations are:

FC-DF* or NSP$ 1: 100% soluble + 10% insoluble + 100% NDOs¥ + 5% TS**
FC-DF or NSP 2: 100% soluble + 50% insoluble + 100% NDOs + 5% TS
FC-DF or NSP 3: 100% soluble + 10% insoluble + 100% NDOs + 10% TS
FC-DF or NSP 4: 100% soluble + 50% insoluble + 100% NDOs + 10% TS

(*DF: dietary fibre; **TS: total starch; $NSP: non-starch polysaccharide; ¥NDOs: non-digestible oligosaccharides)

The second study of this thesis aimed to test all four predicted FC-DF and FC-NSP equations, which estimate FC from dietary records, against urinary biomarkers of colonic NDC fermentation. The main finding of a cross-sectional comparison of habitual diet with urinary excretion of SCFA products was a weak but significant correlation of 24 h urinary excretion of SCFA and acetate with the estimated FC-DF 4 and FC-NSP 4 when considering all of the study participants (n = 122). Similar correlations were observed in the data for valid participants (n = 78). It was also observed that FC-DF and FC-NSP correlated more positively with 24 h urinary acetate and SCFA than did DF and NSP alone. Hence, it can be hypothesised that using the developed index to estimate FC in the diet from dietary records might predict SCFA production in the colon in vivo in humans.
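The four model equations translate directly into code; a minimal sketch (intakes in g/day, function and argument names ours):

```python
def fermentable_carbohydrate(soluble, insoluble, ndo, total_starch, model=4):
    """Estimate fermentable carbohydrate (FC, g/day) reaching the colon.

    Implements the four thesis equations: each model combines 100% of
    soluble fibre, 10% or 50% of insoluble fibre, 100% of non-digestible
    oligosaccharides (NDOs), and 5% or 10% of total starch (TS).
    """
    coeffs = {
        1: (1.00, 0.10, 1.00, 0.05),
        2: (1.00, 0.50, 1.00, 0.05),
        3: (1.00, 0.10, 1.00, 0.10),
        4: (1.00, 0.50, 1.00, 0.10),
    }
    a, b, c, d = coeffs[model]
    return a * soluble + b * insoluble + c * ndo + d * total_starch

# Example intake: 5 g soluble fibre, 10 g insoluble fibre, 2 g NDOs, 150 g TS
print(fermentable_carbohydrate(5, 10, 2, 150, model=4))  # = 5 + 5 + 2 + 15
```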
The next study in this thesis aimed to validate the FC equations using in vitro models of small intestinal digestion and human colonic fermentation. The main finding of these in vitro studies was strong agreement between the amounts of SCFA produced after actual in vitro fermentation of single fibres and of different mixtures of NDCs, and those predicted by the estimated FC from our developed equation FC-DF 4. These results, demonstrating a strong relationship between SCFA production in vitro from a range of fermentations of single fibres and NDC mixtures and that from the predicted FC equation, support the use of the FC equation for the estimation of FC from dietary records. We therefore conclude that the newly developed prediction equations are a valid and practical tool for assessing SCFA production in in vitro fermentation.

Relevância:

30.00%

Resumo:

In this work, the split-field finite-difference time-domain method (SF-FDTD) has been extended for the analysis of two-dimensionally periodic structures with third-order nonlinear media. The accuracy of the method is verified by comparison with the nonlinear Fourier Modal Method (FMM). Once the formalism has been validated, examples of one- and two-dimensional nonlinear gratings are analysed. For the 2D case, the resonance shift in resonant waveguides is corroborated. Not only the scalar Kerr effect is considered; the tensorial nature of the third-order nonlinear susceptibility is also included. The use of nonlinear materials in this kind of device makes it possible to design tunable devices such as variable band filters. However, the third-order nonlinear susceptibility is usually small, and high intensities are needed to trigger the nonlinear effect. Here, a one-dimensional CBG is analysed in both the linear and nonlinear regimes, and the shift of the resonance peaks for both TE and TM polarizations is obtained numerically. Applying a numerical method based on the finite-difference time-domain method makes it possible to analyse this issue in the time domain, so bistability curves are also computed with the numerical method. These curves show how the nonlinear effect modifies the properties of the structure as a function of a variable input pump field. When the nonlinear behaviour is taken into account, the estimation of the electric field components becomes more challenging. In this paper, we present a set of acceleration strategies based on parallel software and hardware solutions.
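As a back-of-the-envelope illustration of the resonance shift exploited here (not the SF-FDTD method itself), the scalar Kerr effect raises the refractive index linearly with intensity, n(I) = n0 + n2·I, which to first order shifts a resonance wavelength in proportion to the index change; all numbers below are hypothetical:

```python
def kerr_index(n0, n2, intensity):
    """Scalar Kerr effect: refractive index rises linearly with intensity."""
    return n0 + n2 * intensity

def shifted_resonance(lam0, n0, n2, intensity):
    """First-order estimate: resonance wavelength scales with the index."""
    return lam0 * kerr_index(n0, n2, intensity) / n0

# Hypothetical values: n0 = 1.50, n2 = 2e-16 cm^2/W, I = 1e12 W/cm^2
lam = shifted_resonance(1550e-9, 1.50, 2e-16, 1e12)
print(f"shifted resonance: {lam * 1e9:.4f} nm")
```

The tiny n2 times a very large intensity is what makes the shift observable, which is why high pump fields are needed.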

Relevância:

30.00%

Resumo:

This paper proposes a new hybrid system for multi-class sentiment analysis based on the use of the General Inquirer (GI) dictionary and a hierarchical approach to the Logistic Model Tree (LMT) classifier. The new system is composed of three layers: the bipolar layer (BL), consisting of one LMT (LMT-1) for sentiment polarity classification, and the intensity layer (IL), comprising two LMTs (LMT-2 and LMT-3) that separately detect three intensities of positive sentiment and three intensities of negative sentiment. Only during the construction phase, the grouping layer (GL) is used to cluster the positive and negative instances, employing two k-means clusterings, respectively. In the pre-processing phase, the texts are segmented into words, which are tagged, stemmed, and finally matched against the GI dictionary so that only verbs, nouns, adjectives, and adverbs are counted and labelled with 24 markers, which are then used to compute the feature vectors. In the sentiment classification phase, the feature vectors are first fed to LMT-1; they are then clustered in GL according to their class label, these clusters are labelled manually, and finally the positive instances are fed to LMT-2 and the negative instances to LMT-3. The three trees are trained and evaluated on the Movie Review and SenTube datasets using stratified 10-fold cross-validation. LMT-1 produces a tree with 48 leaves and size 95, with 90.88% accuracy, while LMT-2 and LMT-3 each yield a tree with one leaf and size one, with 99.28% and 99.37% accuracy, respectively. The experiments show that the proposed hierarchical classification methodology outperforms other prevailing approaches.
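A minimal sketch of the two-stage routing (polarity first, then intensity), with trivial rule-based stand-ins in place of the Weka LMT classifiers; the wiring, not the learners, is what is illustrated, and the labels and thresholds are invented:

```python
def polarity(v):
    """Stands in for LMT-1 (bipolar layer): sign of the feature sum."""
    return "positive" if sum(v) >= 0 else "negative"

def intensity(v, labels):
    """Stands in for LMT-2 / LMT-3 (intensity layer): magnitude buckets."""
    bucket = min(int(abs(sum(v))), 2)
    return labels[bucket]

def classify(v):
    """Route each feature vector through the hierarchy, as in the paper."""
    if polarity(v) == "positive":
        return intensity(v, ["pos-weak", "pos-moderate", "pos-strong"])
    return intensity(v, ["neg-weak", "neg-moderate", "neg-strong"])

print(classify([0.4, 0.3]))    # sum 0.7  -> pos-weak
print(classify([-1.2, -1.1]))  # sum -2.3 -> neg-strong
```

In the actual system each `polarity`/`intensity` stand-in would be a trained LMT operating on the 24-marker GI feature vectors.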

Relevância:

30.00%

Resumo:

In this study, a thermal and exergetic analysis and performance evaluation of seawater and fresh-water wet cooling towers is carried out, and the effect of operating parameters on tower performance is investigated. Using energy and mass balance equations together with experimental results, a mathematical model was developed and implemented as an EES code. Given the scarcity of fresh water, seawater cooling is an attractive option for the future of cooling, so the effect of seawater salinity in the range of 1 g/kg to 60 g/kg on performance characteristics such as air efficiency, water efficiency, outlet water temperature, exergy flow, and exergy efficiency was examined in comparison with fresh water. A decrease in air efficiency of about 3% and an increase in water efficiency of about 1.5% are among these effects. Moreover, fouling reduces cooling tower performance by about 15%; this phenomenon and its consequences, such as an increase in outlet water temperature and excess tower volume, are demonstrated and agree with other published work. Optimization was also performed to minimize cost, maximize air efficiency, and minimize exergy destruction; the results showed that minimizing exergy destruction also satisfies both cost minimization and air-efficiency maximization, although this does not necessarily hold for all inputs and optimizations. The model was validated by comparing computational results with experimental data, which showed good accuracy.
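Assuming the abstract's "air efficiency" corresponds to the standard cooling-tower effectiveness, it is the cooling range divided by range plus approach; a minimal sketch with hypothetical temperatures:

```python
def air_efficiency(t_water_in, t_water_out, t_wet_bulb):
    """Cooling-tower effectiveness: range / (range + approach), i.e.
    actual cooling over the maximum cooling down to the air wet-bulb."""
    return (t_water_in - t_water_out) / (t_water_in - t_wet_bulb)

# Hypothetical case: water cooled from 40 C to 31 C, ambient wet-bulb 25 C
eta = air_efficiency(40.0, 31.0, 25.0)
print(f"air efficiency: {eta:.2%}")
```

Raising salinity or fouling the fill lowers the achievable range, which is how the reported ~3% efficiency drop manifests in this formula.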

Relevância:

30.00%

Resumo:

Aiming to introduce a multiresidue analysis for the trace detection of pesticide residues belonging to the organophosphorus and triazine classes in olive oil samples, a new sample preparation methodology has been attempted, comprising a dual layer of "tailor-made" molecularly imprinted polymers (MIPs) for SPE, enabling the simultaneous extraction of both pesticide classes in a single procedure. This work has focused on the implementation of a dual MIP-layer SPE procedure (DL-MISPE) encompassing the use of two MIP layers as specific sorbents. In order to achieve higher recovery rates, the amount of MIP in the layers has been optimized, as well as the influence of the MIP packing order. The optimized DL-MISPE approach has been used for the preconcentration of organic olive oil samples spiked with concentrations of dimethoate and terbuthylazine similar to the maximum residue limits, followed by quantification by HPLC. High recovery rates for dimethoate (95%) and terbuthylazine (94%) have been achieved, with good accuracy and precision. Overall, this work constitutes the first attempt at developing a dual pesticide residue methodology for the trace analysis of pesticide residues based on molecular imprinting technology. Thus, DL-MISPE constitutes a reliable, robust, and sensitive sample preparation methodology that enables preconcentration of the target pesticides in complex olive oil samples, even at levels similar to the maximum residue limits enforced by legislation.

Relevância:

30.00%

Resumo:

Ultrasound is the first-line examination for the identification and characterization of adnexal tumours. Several methods of differential diagnosis have been described, including the observer's subjective assessment, simple descriptive indices, and mathematically derived indices such as logistic regression models; subjective assessment by an experienced examiner remains the best method for discriminating between malignant and benign tumours. However, because of the subjectivity inherent in this assessment, it became necessary to establish a standardized nomenclature and a classification that would facilitate the communication of results and the corresponding surveillance recommendations. The aim of this article is to summarize and compare different methods for the assessment and classification of adnexal tumours, namely the models of the International Ovary Tumor Analysis (IOTA) group and the Gynecologic Imaging Report and Data System (GI-RADS) classification, in terms of diagnostic performance and usefulness in clinical practice.

Relevância:

30.00%

Resumo:

Wood is considered an ideal solution for the construction of floors and roofs, owing to its mechanical and thermal properties together with its acoustic behaviour. Such constructions offer good sound absorption, heat insulation, and relevant architectural characteristics. They are used in many civil applications: concert and conference halls, auditoriums, ceilings, and walls. However, the high vulnerability of wooden elements under fire conditions requires an accurate evaluation of their structural behaviour. The main objective of this work is to present a numerical model to assess the fire resistance of wooden cellular slabs with different perforations. The thermal behaviour of the wooden slabs is also compared for different insulation materials of different sizes placed inside the cavities. A transient thermal analysis with nonlinear material behaviour is solved using the ANSYS© program. This study makes it possible to verify the fire resistance, the temperature evolution, and the char layer throughout a wooden cellular slab with perforations, considering the effect of the insulation inside the cavities.
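The transient analysis here is performed in ANSYS; as a toy illustration of what a transient thermal solve does, a one-dimensional explicit finite-difference (FTCS) conduction step can be sketched as follows. The grid, Fourier number, and temperatures are hypothetical, and a real fire analysis would include temperature-dependent properties and charring:

```python
def step_ftcs(temps, fo):
    """One explicit (FTCS) step of 1-D transient heat conduction.
    fo = alpha * dt / dx**2 must be <= 0.5 for stability."""
    assert fo <= 0.5, "unstable time step"
    new = temps[:]
    for i in range(1, len(temps) - 1):
        new[i] = temps[i] + fo * (temps[i+1] - 2 * temps[i] + temps[i-1])
    return new

# Slab initially at 20 C, exposed face held at 800 C (fire side),
# far face held at 20 C; march 200 steps toward steady state.
temps = [800.0] + [20.0] * 9
for _ in range(200):
    temps = step_ftcs(temps, fo=0.4)
print([round(t) for t in temps[:4]])
```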

Relevância:

30.00%

Resumo:

Mass spectrometry measures the mass of ions according to their mass-to-charge ratio. The technique is used in many fields and can analyse complex mixtures. Imaging mass spectrometry (IMS), a branch of mass spectrometry, enables the analysis of ions on a surface while preserving the spatial organization of the detected ions. To date, the most widely studied IMS samples are plant or animal tissue sections. Among the molecules commonly analysed by IMS, lipids have attracted considerable interest. Lipids are involved in disease and in the normal functioning of cells; they form the cell membrane and play several roles, such as regulating cellular events. Given the involvement of lipids in biology and the ability of MALDI IMS to analyse them, we developed analytical strategies for sample handling and for the analysis of large lipid datasets. Lipid degradation is a major concern in the food industry, and the lipids in tissue sections are likewise at risk of degrading. Their degradation products can introduce artefacts into the IMS analysis, and the loss of lipid species can compromise the accuracy of abundance measurements. Since oxidized lipids are also important mediators in the development of several diseases, preserving them faithfully becomes critical. In multi-institutional studies, where samples are often shipped from one site to another, adapted and validated protocols and measurements of degradation are needed.
Our main findings are as follows: an increase over time of oxidized phospholipids and lysophospholipids under ambient conditions, a decrease in lipids with unsaturated fatty acids, and an inhibitory effect on these phenomena when sections are stored cold under N2. At ambient temperature and atmosphere, phospholipids are oxidized on the timescale of a typical IMS preparation (~30 minutes). Phospholipids are also broken down into lysophospholipids over a timescale of several days. Validating a sample-handling method becomes all the more important when a larger number of samples is analysed. Atherosclerosis is a cardiovascular disease caused by the accumulation of cellular material on the arterial wall. Because atherosclerosis is a three-dimensional (3D) phenomenon, serial 3D IMS becomes useful: on the one hand, it can localize molecules along the full length of an atheromatous plaque; on the other, it can identify molecular mechanisms of plaque development or rupture. Serial 3D IMS faces specific challenges, many of which relate simply to 3D reconstruction and to the real-time interpretation of the molecular reconstruction. With these goals in mind, and using lipid IMS to study atherosclerotic plaques from a human carotid artery and from a mouse model of atherosclerosis, we developed open-source methods for reconstructing IMS data in 3D. Our methodology provides a means of obtaining high-quality visualizations and demonstrates a strategy for the rapid interpretation of 3D IMS data through multivariate segmentation. The analysis of aortas from a mouse model was the starting point for method development, as these are better-controlled samples.
By correlating data acquired in positive and negative ionization modes, 3D IMS demonstrated an accumulation of phospholipids in the aortic sinuses. In addition, AgLDI IMS revealed a differential localization of free fatty acids, cholesterol, cholesteryl esters, and triglycerides. Multivariate segmentation of the lipid signals from the IMS analysis of a human carotid artery revealed a molecular histology correlated with the degree of arterial stenosis. This research helps to elucidate the biological complexity of atherosclerosis and may make it possible to predict the course of certain clinical cases. Colorectal cancer liver metastasis (CRCLM) is the metastatic disease of primary colorectal cancer, one of the most common cancers worldwide. The evaluation and prognosis of CRCLM tumours are performed by histopathology, with a margin of error. We used lipid IMS to identify the histological compartments of CRCLM and to extract their lipid signatures. By exploiting these molecular signatures, we were able to derive a quantitative, objective histopathological score that correlates with prognosis. Furthermore, by dissecting the lipid signatures, we identified individual lipid species that discriminate the different CRCLM histologies and that could potentially be used as biomarkers for determining the response to therapy. More specifically, we found a series of plasmalogens and sphingolipids that distinguish two different types of necrosis (infarct-like necrosis, ILN, and usual necrosis, UN). ILN is associated with response to chemotherapeutic treatments, whereas UN is associated with the normal functioning of the tumour.

Relevância:

30.00%

Resumo:

Chronic Chagas disease diagnosis relies on laboratory tests due to its clinical characteristics. The aim of this research was to review the performance of commercial enzyme-linked immunosorbent assay (ELISA) and polymerase chain reaction (PCR) diagnostic tests. Studies of the performance of commercial ELISA or PCR for the diagnosis of chronic Chagas disease were systematically searched for in PubMed, Scopus, Embase, ISI Web, and LILACS, through the bibliography from 1980-2014 and by contact with the manufacturers. The risk of bias was assessed with QUADAS-2. Heterogeneity was estimated with the I2 statistic. Accuracies provided by the manufacturers usually overestimate the accuracy reported by academia. The risk of bias is high in most tests and in most QUADAS dimensions. Heterogeneity is high in sensitivity, specificity, or both. The evidence regarding commercial ELISA and ELISA-rec sensitivity and specificity indicates that there is overestimation. The current recommendation to use two simultaneous serological tests can be supported by the risk-of-bias analysis and the amount of heterogeneity, but not by the observed accuracies. The usefulness of PCR tests is debatable, and health care providers should not order them on a routine basis. PCR may be used in selected cases due to its potential to detect seronegative subjects.
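The I2 statistic used here is computed from Cochran's Q over k studies as I2 = max(0, (Q - df)/Q) x 100, with df = k - 1; a minimal sketch with made-up Q values:

```python
def i_squared(q, k):
    """Higgins' I^2 (percent of variability due to heterogeneity)
    from Cochran's Q over k studies (df = k - 1)."""
    df = k - 1
    return max(0.0, (q - df) / q) * 100.0

print(i_squared(45.0, 10))  # Q well above df: substantial heterogeneity
print(i_squared(5.0, 10))   # Q below df: I^2 floors at 0
```

Values above roughly 75% are conventionally read as high heterogeneity, which is the situation the review reports for these assays.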

Relevância:

30.00%

Resumo:

In this paper, we investigate output accuracy for a Discrete Event Simulation (DES) model and Agent Based Simulation (ABS) model. The purpose of this investigation is to find out which of these simulation techniques is the best one for modelling human reactive behaviour in the retail sector. In order to study the output accuracy in both models, we have carried out a validation experiment in which we compared the results from our simulation models to the performance of a real system. Our experiment was carried out using a large UK department store as a case study. We had to determine an efficient implementation of management policy in the store’s fitting room using DES and ABS. Overall, we have found that both simulation models were a good representation of the real system when modelling human reactive behaviour.
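A minimal flavour of the DES side (not the authors' model; the arrival and service distributions are invented) is a multi-server fitting-room queue advanced event by event:

```python
import random

def simulate_fitting_room(n_customers, n_rooms, rng):
    """Tiny discrete-event simulation: customers arrive at random,
    take the earliest-free fitting room, and we track their waits."""
    t, free_at, waits = 0.0, [0.0] * n_rooms, []
    for _ in range(n_customers):
        t += rng.expovariate(1 / 2.0)            # next arrival (~2 min apart)
        i = min(range(n_rooms), key=free_at.__getitem__)
        start = max(t, free_at[i])               # wait if all rooms are busy
        waits.append(start - t)
        free_at[i] = start + rng.uniform(3, 8)   # try-on takes 3-8 min
    return sum(waits) / len(waits)

rng = random.Random(1)
print(f"mean wait: {simulate_fitting_room(500, 3, rng):.2f} min")
```

An ABS version of the same store would instead give each customer an individual decision rule (join, balk, renege), which is precisely the behavioural richness the paper compares against DES.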

Relevância:

30.00%

Resumo:

Background: Vesicoureteral reflux (VUR) is a common abnormality of the urinary tract in childhood. Objectives: Since urine re-enters the ureters and renal pelvis during voiding in vesicoureteral reflux (VUR), we hypothesized that the change in body water composition between pre- and post-voiding measurements may be smaller in children with VUR. Patients and Methods: Patients were grouped as those with VUR (Group 1) and without VUR (Group 2). Bioelectric impedance analysis (BIA) was performed before and after voiding, and third-space fluid (TSF) (L), percentage of total body fluid (TBF%), extracellular fluid (ECF%), and intracellular fluid (ICF%) were recorded. The changes in TSF, TBF, ECF, and ICF after voiding (ΔTSF, ΔTBF%, ΔECF%, ΔICF%), urine volume (mL), and urine volume/body weight (mL/kg) were calculated. Groups 1 and 2 were compared on these parameters. In addition, pre- and post-voiding body fluid values were compared within each group. Results: TBF%, ECF%, ICF%, and TSF in both the pre- and post-voiding states, and ΔTBF%, ΔECF%, ΔICF%, and ΔTSF after voiding, did not differ between groups. However, while post-voiding TBF% and ECF% decreased significantly in Group 1 (64.5 ± 8.1 vs 63.7 ± 7.2, P = 0.013 for TBF%), there was no post-voiding change in TSF in this group. In contrast, there was a significant decrease in TSF in Group 2. Conclusions: The bladder and ureters can be considered part of the third space. We therefore think that BIA was useful in discriminating children with VUR, as TSF did not decrease in patients with VUR, whereas it did decrease in patients without VUR. However, further studies are needed to test this hypothesis.
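The Δ values in the study are simple pre-minus-post differences per compartment; a minimal sketch (the TBF% figures echo the abstract, the other numbers are hypothetical):

```python
def voiding_deltas(pre, post):
    """Change in body-fluid compartments across voiding.
    pre/post: dicts with TBF, ECF, ICF (percent) and TSF (litres)."""
    return {k: pre[k] - post[k] for k in ("TBF", "ECF", "ICF", "TSF")}

pre  = {"TBF": 64.5, "ECF": 28.0, "ICF": 36.5, "TSF": 1.20}
post = {"TBF": 63.7, "ECF": 27.6, "ICF": 36.1, "TSF": 1.20}
d = voiding_deltas(pre, post)
print(d)  # an unchanged TSF is the pattern the study associates with VUR
```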

Relevância:

30.00%

Resumo:

A rapid and efficient dispersive liquid-liquid microextraction (DLLME) followed by laser-induced breakdown spectroscopy (LIBS) detection was evaluated for the simultaneous determination of Cr, Cu, Mn, Ni, and Zn in water samples. Metals in the samples were extracted into tetrachloromethane as ammonium pyrrolidinedithiocarbamate (APDC) complexes, using vortex agitation to disperse the extractant solvent. Several DLLME experimental factors affecting extraction efficiency were optimized with a multivariate approach. Under optimum DLLME conditions, the DLLME-LIBS method was found to be about 4.0–5.5 times more sensitive than LIBS alone, achieving limits of detection about 3.7–5.6 times lower. To assess the accuracy of the proposed DLLME-LIBS procedure, a certified reference material of estuarine water was analyzed.
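The reported detection-limit gains follow from the usual 3-sigma criterion, LOD = 3·s(blank)/slope: preconcentration steepens the calibration slope and lowers the LOD roughly in proportion. A minimal sketch with hypothetical calibration numbers:

```python
def limit_of_detection(blank_sd, slope):
    """IUPAC 3-sigma criterion: LOD = 3 * s_blank / calibration slope."""
    return 3.0 * blank_sd / slope

# Hypothetical calibrations: DLLME preconcentration multiplies the slope.
lod_libs  = limit_of_detection(blank_sd=12.0, slope=300.0)
lod_dllme = limit_of_detection(blank_sd=12.0, slope=1500.0)
print(lod_libs / lod_dllme)  # sensitivity gain maps directly to LOD gain
```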

Relevância:

30.00%

Resumo:

The aim of the present study was to propose and evaluate the use of factor analysis (FA) to obtain latent variables (factors) that represent a set of pig traits simultaneously, for use in genome-wide selection (GWS) studies. We used an outbred F2 population derived from crosses between Brazilian Piau and commercial pigs. Data were obtained on 345 F2 pigs, genotyped for 237 SNPs and measured for 41 traits. FA allowed us to obtain four biologically interpretable factors: "weight", "fat", "loin", and "performance". These factors were used as dependent variables in multiple-regression models of genomic selection (Bayes A, Bayes B, RR-BLUP, and Bayesian LASSO). FA is presented as an interesting alternative for selecting individuals on multiple variables simultaneously in GWS studies; accuracy measurements for the factors were similar to those obtained when the original traits were considered individually. The overlap between the top 10% of individuals selected by each factor and those selected by the individual traits was also satisfactory. Moreover, the estimated marker effects for the traits were similar to those found for the relevant factor.
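Once factor scores replace the raw traits, any of the listed genomic-prediction models can regress them on the SNP genotypes. As a minimal illustration of the RR-BLUP idea (equal ridge shrinkage of all marker effects), here is the closed-form ridge solution for just two markers; the genotypes, factor scores, and penalty are all toy values:

```python
def ridge_2marker(x1, x2, y, lam=1.0):
    """RR-BLUP-style ridge estimates for two SNP markers, solving the
    2x2 system (X'X + lam*I) b = X'y in closed form."""
    a = sum(v * v for v in x1) + lam
    b = sum(u * v for u, v in zip(x1, x2))
    d = sum(v * v for v in x2) + lam
    g1 = sum(u * v for u, v in zip(x1, y))
    g2 = sum(u * v for u, v in zip(x2, y))
    det = a * d - b * b
    return (d * g1 - b * g2) / det, (a * g2 - b * g1) / det

# Toy "weight"-factor scores for 5 pigs, genotypes coded 0/1/2
x1, x2 = [0, 1, 2, 1, 0], [2, 1, 0, 1, 2]
y = [-1.0, 0.1, 1.2, 0.0, -1.1]
b1, b2 = ridge_2marker(x1, x2, y)
print(round(b1, 3), round(b2, 3))
```

With 237 SNPs the same system is solved in matrix form; the factor scores simply take the place of `y`.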

Relevância:

30.00%

Resumo:

There is increasing concern to reduce the cost and overheads of developing reliable systems. Selective protection of the most critical parts of a system represents a viable solution to obtain a high level of reliability at a fraction of the cost. In particular, to design a selective fault mitigation strategy for processor-based systems, it is mandatory to identify and prioritize the most vulnerable registers in the register file as the best candidates to be protected (hardened). This paper presents an application-based metric to estimate the criticality of each register in the register file of microprocessor-based systems. The proposed metric relies on the combination of three criteria based on common features of the executed applications. The applicability and accuracy of our proposal have been evaluated on a set of applications running on different microprocessors. Results show a significant improvement in accuracy compared with previous approaches, regardless of the underlying architecture.
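The abstract does not name its three criteria, so the sketch below invents plausible ones (register lifetime, read frequency, code coverage) purely to show the shape of such a combined metric: normalize each criterion per register, weight, sum, and harden from the top of the ranking:

```python
def criticality(lifetime, reads, coverage, weights=(0.4, 0.4, 0.2)):
    """Combine three normalized per-register criteria (each in [0, 1])
    into one criticality score. Criteria and weights are illustrative,
    not the paper's actual formulation."""
    w1, w2, w3 = weights
    return w1 * lifetime + w2 * reads + w3 * coverage

# Hypothetical profiles of three registers from an application run
regs = {"r1": (0.9, 0.8, 0.7), "r2": (0.2, 0.1, 0.3), "r3": (0.6, 0.9, 0.5)}
ranking = sorted(regs, key=lambda r: criticality(*regs[r]), reverse=True)
print(ranking)  # harden registers from the top of this list first
```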