925 results for Signature Verification, Forgery Detection, Fuzzy Modeling


Relevance:

30.00%

Publisher:

Abstract:

Background & aims: The boundaries between the categories of body composition provided by vectorial analysis of bioimpedance are not well defined. In this paper, fuzzy set theory was used to model this uncertainty. Methods: An Italian database with 179 cases aged 18-70 years was divided randomly into a development sample (n = 20) and a testing sample (n = 159). Of the 159 registries in the testing sample, 99 had an unequivocal diagnosis. Resistance/height and reactance/height were the input variables of the model. The output variables were the seven categories of body composition of vectorial analysis. For each case, the linguistic model estimated the membership degree of each impedance category. Kappa statistics were used to compare these results with the previously established diagnoses. This demanded singling out one category from the output set of seven membership degrees. This procedure (the defuzzification rule) established that the category with the highest membership degree should be taken as the most likely category for the case. Results: The fuzzy model showed a good fit to the development sample. Excellent agreement was achieved between the defuzzified impedance diagnoses and the clinical diagnoses in the testing sample (Kappa = 0.85, p < 0.001). Conclusions: The fuzzy linguistic model was found to be in good agreement with the clinical diagnoses. If the whole model output is considered, information on the extent to which each BIVA category is present better informs clinical practice, with an enlarged nosological framework and diverse therapeutic strategies. (C) 2012 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
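
The defuzzification rule described above, picking the category with the highest membership degree, can be sketched as follows; the category names and membership degrees are invented for illustration and are not taken from the paper's data:

```python
def defuzzify_max(memberships):
    """Defuzzification rule: return the category with the highest
    membership degree as the most likely category for the case."""
    return max(memberships, key=memberships.get)

# Hypothetical membership degrees for one case (illustrative only)
case = {"normal": 0.15, "dehydration": 0.70, "lean": 0.05}
most_likely = defuzzify_max(case)
```

Note that the full model output (all seven degrees) carries more clinical information than the single defuzzified category, as the conclusions emphasize.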

Relevance:

30.00%

Publisher:

Abstract:

The flow around a smooth fixed circular cylinder over a large range of Reynolds numbers is considered in this paper. In order to investigate this canonical case, we perform CFD calculations and apply verification and validation (V&V) procedures to draw conclusions regarding the numerical error and, afterwards, assess the modeling errors and the capability of this (U)RANS method to solve the problem. Eight Reynolds numbers between Re = 10 and Re = 5 x 10^5 are presented with at least four geometrically similar grids and, for each unsteady case, five discretizations in time, together with strict control of iterative and round-off errors, allowing a consistent verification analysis with uncertainty estimation. Two-dimensional RANS calculations, steady or unsteady, laminar or turbulent, are performed. The original 1994 k-omega SST turbulence model by Menter is used to model turbulence. The validation procedure is performed by comparing the numerical results with an extensive set of experimental results compiled from the literature. [DOI: 10.1115/1.4007571]
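
The verification step estimates discretization error from solutions on geometrically similar grids. The specific uncertainty estimators used in formal V&V procedures are more elaborate, but the underlying idea can be sketched with plain Richardson extrapolation, assuming a known refinement ratio r and observed order of accuracy p:

```python
def richardson_extrapolate(phi_fine, phi_coarse, r, p):
    """Estimate the grid-converged solution and the fine-grid
    discretization error from two grid solutions.

    phi_fine / phi_coarse: solution values on the fine and coarse grids
    r: grid refinement ratio (e.g. 2 when the coarse spacing is doubled)
    p: observed order of accuracy
    """
    error = (phi_fine - phi_coarse) / (r ** p - 1)
    return phi_fine + error, error
```

For a second-order quantity behaving like phi(h) = 1 + h^2, grids with h = 0.1 and h = 0.2 give phi = 1.01 and 1.04, and the extrapolated value recovers 1.0.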

Relevance:

30.00%

Publisher:

Abstract:

Fraud is a global problem that demands increasing attention due to the rapid expansion of modern technology and communication. When statistical techniques are used to detect fraud, a critical factor is whether the fraud detection model is accurate enough to classify a case correctly as fraudulent or legitimate. In this context, the concept of bootstrap aggregating (bagging) arises. The basic idea is to generate multiple classifiers by fitting models to several replicated datasets and then combine their predictions into a single classification in order to improve accuracy. In this paper we present a pioneering study of the performance of discrete and continuous k-dependence probabilistic networks within the context of bagging predictors for classification. Via a large simulation study and various real datasets, we found that probabilistic networks are a strong modeling option with high predictive capacity, which increases substantially under the bagging procedure compared with traditional techniques. (C) 2012 Elsevier Ltd. All rights reserved.
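
The bagging idea described above, fitting the same classifier to bootstrap replicates and combining by majority vote, can be sketched as follows. The 1-nearest-neighbour base classifier and the toy data are purely illustrative, not the k-dependence probabilistic networks studied in the paper:

```python
import random
from collections import Counter

def fit_1nn(sample):
    """Toy base classifier: 1-nearest neighbour on (value, label) pairs."""
    def predict(x):
        return min(sample, key=lambda pair: abs(pair[0] - x))[1]
    return predict

def bagging_predict(train, x, fit=fit_1nn, n_bags=25, seed=0):
    """Fit one model per bootstrap replicate, then majority-vote."""
    rng = random.Random(seed)
    votes = []
    for _ in range(n_bags):
        replicate = [rng.choice(train) for _ in train]  # sample with replacement
        votes.append(fit(replicate)(x))
    return Counter(votes).most_common(1)[0][0]
```

The vote aggregation smooths out the instability of any single bootstrap fit, which is exactly where bagging improves classification accuracy.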

Relevance:

30.00%

Publisher:

Abstract:

Machines with moving parts give rise to vibrations and, consequently, noise. The configuration and condition of each machine yield a characteristic vibration signature. Therefore, a change in the vibration signature, due to a change in the machine state, can be used to detect incipient defects before they become critical. This is the goal of condition monitoring, in which the information obtained from a machine's signature is used to detect faults at an early stage. A large number of signal processing techniques can be used to extract useful information from a measured vibration signal. This study seeks to detect rotating machine defects using a range of techniques including synchronous time averaging, Hilbert transform-based demodulation, the continuous wavelet transform, the Wigner-Ville distribution and the spectral correlation density function. The detection and diagnostic capabilities of these techniques are discussed and compared on the basis of experimental results concerning gear tooth faults, i.e. a fatigue crack at the tooth root and tooth spalls of different sizes, as well as assembly faults in a diesel engine. Moreover, the sensitivity to fault severity is assessed by applying these signal processing techniques to gear tooth faults of different sizes.
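
Of the listed techniques, synchronous time averaging is the simplest to sketch: averaging the signal over whole shaft revolutions keeps components locked to the rotation and suppresses asynchronous noise. A minimal version, assuming the rotation period is known exactly in samples:

```python
def synchronous_time_average(signal, period):
    """Average the signal over complete periods (in samples).
    Components synchronous with the period survive; asynchronous
    content tends toward zero as more revolutions are averaged."""
    n_revs = len(signal) // period
    return [
        sum(signal[rev * period + i] for rev in range(n_revs)) / n_revs
        for i in range(period)
    ]
```

In practice the averaging is triggered by a tachometer signal so that each block starts at the same shaft angle; a perfectly periodic signal passes through unchanged.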

Relevance:

30.00%

Publisher:

Abstract:

The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate and autonomous entities collaborate and interact with the aim of achieving complex strategic goals that they could not accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, and Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed to target the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor. 
We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
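
To give a flavour of what a declarative ConDec-style constraint looks like when checked over an execution trace, here is a minimal sketch of the standard response constraint (every occurrence of activity a must eventually be followed by b); the list-of-events trace encoding is invented for illustration:

```python
def satisfies_response(trace, a, b):
    """ConDec-style response(a, b): every occurrence of a is
    eventually followed by at least one occurrence of b."""
    return all(b in trace[i + 1:] for i, event in enumerate(trace) if event == a)
```

The declarative style shows here: the constraint says only what must hold over a trace, leaving every execution that satisfies it open, instead of prescribing a procedural flow.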

Relevance:

30.00%

Publisher:

Abstract:

In the present thesis, a new diagnosis methodology based on advanced time-frequency analysis is presented. More precisely, a new fault index is defined that allows tracking individual fault components in a single frequency band. A frequency sliding is applied to the signals being analyzed (currents, voltages, vibration signals), so that each single fault frequency component is shifted into a prefixed single frequency band. Then, the discrete wavelet transform is applied to the resulting signal to extract the fault signature in the chosen frequency band. Once the state of the machine has been qualitatively diagnosed, a quantitative evaluation of the fault degree is necessary. For this purpose, a fault index based on the energy of the approximation and/or detail signals resulting from the wavelet decomposition has been introduced to quantify the fault extent. The main advantages of the new method over existing diagnosis techniques are the following:
- capability of monitoring the fault evolution continuously over time under any transient operating condition;
- no requirement for speed/slip measurement or estimation;
- higher accuracy in filtering frequency components around the fundamental in the case of rotor faults;
- reduced likelihood of false indications by avoiding confusion with other fault harmonics (the contributions of the most relevant fault frequency components under speed-varying conditions are confined to a single frequency band);
- low memory requirement due to the low sampling frequency;
- reduced processing latency (no repeated sampling operation is required).
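
The energy-based fault index can be sketched on a single level of the discrete wavelet transform; the Haar wavelet is used here only for brevity and is not necessarily the wavelet adopted in the thesis:

```python
def haar_dwt(signal):
    """One level of the orthonormal Haar discrete wavelet transform."""
    approx = [(signal[i] + signal[i + 1]) / 2 ** 0.5
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 ** 0.5
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def energy_index(coefficients):
    """Fault index: energy (sum of squares) of a wavelet sub-band."""
    return sum(c * c for c in coefficients)
```

Because the Haar transform is orthonormal, the approximation and detail energies sum to the energy of the original signal; a growing energy in the prefixed fault band then indicates an increasing fault severity.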

Relevance:

30.00%

Publisher:

Abstract:

Coagulation factor XIII (FXIII) stabilizes fibrin fibers and is therefore a major player in the maintenance of hemostasis. FXIII is activated by thrombin, resulting in cleavage and release of the FXIII activation peptide (AP-FXIII). The objective of this study was to characterize the released AP-FXIII and determine specific features that may be used for its specific detection. We analyzed the structure of bound AP-FXIII within the FXIII A-subunit and the hydrogen-bond interactions of AP-FXIII with both FXIII A-subunit monomers. We optimized our previously developed AP-FXIII ELISA by using 2 monoclonal antibodies. We determined high binding affinities between the antibodies and free AP-FXIII and demonstrated specific binding by epitope mapping analyses with surface plasmon resonance and enzyme-linked immunosorbent assay. Because the structure of free AP-FXIII had so far been characterized only by molecular modeling, we performed structural analysis by nuclear magnetic resonance. Recombinant AP-FXIII was largely flexible both in plasma and in water, differing significantly from the rigid structure of the bound state. We suggest that the recognized epitope is either occluded in the non-cleaved form or possesses a structure that does not allow binding to the antibodies. On the basis of our findings, we propose AP-FXIII as a possible new marker for acute thrombotic events.

Relevance:

30.00%

Publisher:

Abstract:

Smoke spikes occurring during transient engine operation have detrimental health effects and increase fuel consumption by requiring more frequent regeneration of the diesel particulate filter. This paper proposes a decision tree approach to real-time detection of smoke spikes for control and on-board diagnostics purposes. A contemporary, electronically controlled heavy-duty diesel engine was used to investigate the deficiencies of smoke control based on the fuel-to-oxygen-ratio limit. With the aid of transient and steady state data analysis and empirical as well as dimensional modeling, it was shown that the fuel-to-oxygen ratio was not estimated correctly during the turbocharger lag period. This inaccuracy was attributed to the large manifold pressure ratios and low exhaust gas recirculation flows recorded during the turbocharger lag period, which meant that engine control module correlations for the exhaust gas recirculation flow and the volumetric efficiency had to be extrapolated. The engine control module correlations were based on steady state data and it was shown that, unless the turbocharger efficiency is artificially reduced, the large manifold pressure ratios observed during the turbocharger lag period cannot be achieved at steady state. Additionally, the cylinder-to-cylinder variations during this period were shown to be sufficiently significant to make the average fuel-to-oxygen ratio a poor predictor of the transient smoke emissions. The steady state data also showed higher smoke emissions with higher exhaust gas recirculation fractions at constant fuel-to-oxygen-ratio levels. This suggests that, even if the fuel-to-oxygen ratios were to be estimated accurately for each cylinder, they would still be ineffective as smoke limiters. 
A decision tree trained on snap throttle data and pruned with engineering knowledge was able to use the inaccurate engine control module estimates of the fuel-to-oxygen ratio together with information on the engine control module estimate of the exhaust gas recirculation fraction, the engine speed, and the manifold pressure ratio to predict 94% of all spikes occurring over the Federal Test Procedure cycle. The advantages of this non-parametric approach over other commonly used parametric empirical methods such as regression were described. An application of accurate smoke spike detection, in which the injection pressure is increased at points with high opacity to reduce the cumulative particulate matter emissions substantially with a minimum increase in the cumulative nitrogen oxide emissions, was illustrated with dimensional and empirical modeling.
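
A pruned decision tree of the kind described reduces at run time to a few nested threshold tests over the available engine control module estimates. The sketch below is purely illustrative: both the structure and all thresholds are invented placeholders, not the tree trained in the paper:

```python
def smoke_spike_predicted(egr_fraction, manifold_pressure_ratio,
                          fuel_oxygen_ratio):
    """Illustrative hand-pruned decision tree for smoke-spike detection.
    All thresholds are hypothetical placeholders, not fitted values."""
    if manifold_pressure_ratio > 1.2:        # turbocharger-lag-like region
        if fuel_oxygen_ratio > 0.9:
            return True                      # rich mixture: likely spike
        return egr_fraction > 0.15           # high EGR can still smoke
    return False                             # steady-state-like operation
```

A real-time implementation of such a tree costs only a handful of comparisons per engine cycle, which is why the non-parametric approach is attractive for on-board diagnostics.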

Relevance:

30.00%

Publisher:

Abstract:

With recent advances in mass spectrometry techniques, it is now possible to investigate proteins over a wide range of molecular weights in small biological specimens. This advance has generated data-analytic challenges in proteomics, similar to those created by microarray technologies in genetics, namely, the discovery of "signature" protein profiles specific to each pathologic state (e.g., normal vs. cancer) or differential profiles between experimental conditions (e.g., treated by a drug of interest vs. untreated) from high-dimensional data. We propose a data-analytic strategy for discovering protein biomarkers based on such high-dimensional mass-spectrometry data. A real biomarker-discovery project on prostate cancer is taken as a concrete example throughout the paper: the project aims to identify proteins in serum that distinguish cancer, benign hyperplasia, and normal states of the prostate using the Surface Enhanced Laser Desorption/Ionization (SELDI) technology, a recently developed mass spectrometry technique. Our data-analytic strategy takes properties of the SELDI mass spectrometer into account: the SELDI output of a specimen contains about 48,000 (x, y) points, where x is the protein mass divided by the number of charges introduced by ionization and y is the protein intensity at the corresponding mass-per-charge value, x, in that specimen. Given high coefficients of variation and other characteristics of protein intensity measures (y values), we reduce the measures of protein intensities to a set of binary variables that indicate peaks in the y-axis direction in the nearest neighborhoods of each mass-per-charge point in the x-axis direction. We then account for a shifting (measurement error) problem of the x-axis in SELDI output. After this pre-analysis processing of the data, we combine the binary predictors to generate classification rules for cancer, benign hyperplasia, and normal states of the prostate. 
Our approach is to apply the boosting algorithm to select binary predictors and construct a summary classifier. We empirically evaluate sensitivity and specificity of the resulting summary classifiers with a test dataset that is independent from the training dataset used to construct the summary classifiers. The proposed method performed nearly perfectly in distinguishing cancer and benign hyperplasia from normal. In the classification of cancer vs. benign hyperplasia, however, an appreciable proportion of the benign specimens were classified incorrectly as cancer. We discuss practical issues associated with our proposed approach to the analysis of SELDI output and its application in cancer biomarker discovery.
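
The reduction of intensity measures to binary peak indicators can be sketched as a local-maximum test in a neighbourhood along the mass-per-charge axis; the window size below is an illustrative parameter, not the value used in the study:

```python
def peak_indicators(intensities, half_window=2):
    """Binary predictors: 1 where the intensity is the maximum within
    +/- half_window neighbouring points, 0 elsewhere."""
    flags = []
    for i, y in enumerate(intensities):
        lo = max(0, i - half_window)
        hi = min(len(intensities), i + half_window + 1)
        flags.append(1 if y == max(intensities[lo:hi]) else 0)
    return flags
```

Converting noisy intensities to presence/absence of local peaks makes the predictors robust to the high coefficients of variation of the raw y values before boosting is applied.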

Relevance:

30.00%

Publisher:

Abstract:

Our dynamic capillary electrophoresis model, which uses material-specific input data for the estimation of electroosmosis, was applied to investigate fundamental aspects of isoelectric focusing (IEF) in capillaries or microchannels made from bare fused silica (FS), FS coated with a sulfonated polymer, polymethylmethacrylate (PMMA) and poly(dimethylsiloxane) (PDMS). Input data were generated via determination of the electroosmotic flow (EOF) using buffers of varying pH and ionic strength. Two models are distinguished: one that neglects changes of ionic strength and one that includes the dependence between electroosmotic mobility and ionic strength. For each configuration, the models provide insight into the magnitude and dynamics of electroosmosis. The contribution of each electrophoretic zone to the net EOF is thereby visualized, and the amount of EOF required for the detection of the zone structures at a particular location along the capillary, including at its end for MS detection, is predicted. For bare FS, PDMS and PMMA, simulations reveal that EOF decreases with time and that the entire IEF process is characterized by the asymptotic formation of a stationary steady-state zone configuration in which electrophoretic transport and electroosmotic zone displacement are opposite and of equal magnitude. The location of immobilization of the boundary between the anolyte and the most acidic carrier ampholyte depends on the EOF, i.e. on the capillary material and the anolyte. Overall time intervals for reaching this state in microchannels made from PDMS and PMMA are predicted to be similar and about twice as long as for uncoated FS. Additional mobilization is required for the detection of the entire pH gradient at the capillary end. Using concomitant electrophoretic mobilization with an acid as coanion in the catholyte is shown to provide sufficient additional cathodic transport for that purpose. 
FS capillaries dynamically double coated with polybrene and poly(vinylsulfonate) are predicted to provide sufficient electroosmotic pumping for detection of the entire IEF gradient at the cathodic column end.

Relevance:

30.00%

Publisher:

Abstract:

Serum-based diagnosis offers the prospect of early lung carcinoma detection and of differentiation between benign and malignant nodules identified by CT. One major challenge toward a future blood-based diagnostic consists in showing that seroreactivity patterns allow for discriminating lung cancer patients not only from normal controls but also from patients with non-tumor lung pathologies. We addressed this question for squamous cell lung cancer, one of the most common lung tumor types. Using a panel of 82 phage-peptide clones, which express potential autoantigens, we performed serological spot assays. We screened 108 sera, including 39 sera from squamous cell lung cancer patients, 29 sera from patients with other non-tumor lung pathologies, and 40 sera from volunteers without known disease. To classify the serum groups, we employed the standard Naïve Bayesian method combined with a subset selection approach. We were able to separate squamous cell lung carcinoma and normal sera with an accuracy of 93%. Low-grade squamous cell lung carcinomas were separated from normal sera with an accuracy of 92.9%. We were able to distinguish squamous cell lung carcinoma from non-tumor lung pathologies with an accuracy of 83%. Three phage-peptide clones with sequence homology to ROCK1, PRKCB1 and KIAA0376 reacted with more than 15% of the cancer sera, but with neither normal nor non-tumor lung pathology sera. Our study demonstrates that seroreactivity profiles combined with statistical classification methods have great potential for discriminating patients with squamous cell lung carcinoma not only from normal controls but also from patients with non-tumor lung pathologies.
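
The standard Naïve Bayesian method applied to binary seroreactivity profiles can be sketched as a Bernoulli classifier with Laplace smoothing; the two-feature toy data in the usage example are invented, not the 82-clone panel of the study:

```python
import math

def train_bernoulli_nb(samples, labels, alpha=1.0):
    """samples: binary reactivity vectors; labels: class per sample.
    Returns per-class log-priors and smoothed feature probabilities."""
    model = {}
    for c in sorted(set(labels)):
        rows = [s for s, l in zip(samples, labels) if l == c]
        prior = math.log(len(rows) / len(samples))
        p_one = [(sum(r[j] for r in rows) + alpha) / (len(rows) + 2 * alpha)
                 for j in range(len(samples[0]))]
        model[c] = (prior, p_one)
    return model

def predict_nb(model, x):
    """Pick the class with the highest posterior log-score,
    treating features as conditionally independent (the naive assumption)."""
    def score(c):
        prior, p_one = model[c]
        return prior + sum(math.log(p if xi else 1.0 - p)
                           for xi, p in zip(x, p_one))
    return max(model, key=score)
```

The subset selection step of the paper would correspond to restricting the feature vectors to the most discriminative clones before training.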

Relevance:

30.00%

Publisher:

Abstract:

Free radicals are present in cigarette smoke and can have a negative effect on human health by attacking lipids, nucleic acids, proteins and other biologically important species. However, because of the complexity of the tobacco smoke system and the dynamic nature of radicals, little is known about the identity of the radicals, and debate continues on the mechanisms by which those radicals are produced. In this study, acetyl radicals were trapped from the gas phase using 3-amino-2,2,5,5-tetramethyl-proxyl (3AP) on solid support to form stable 3AP adducts for later analysis by high performance liquid chromatography (HPLC), mass spectrometry/tandem mass spectrometry (MS-MS/MS) and liquid chromatography-mass spectrometry (LC-MS). Simulations of acetyl radical generation were performed using Matlab and the Master Chemical Mechanism (MCM) programs. A range of 10-150 nmol/cigarette of acetyl radical was measured from gas phase tobacco smoke of both commercial and research cigarettes under several different smoking conditions. More radicals were detected with the puff smoking method than with continuous flow sampling. Approximately twice as many acetyl radicals were trapped when a GF/F particle filter was placed before the trapping zone. Computational simulations show that NO/NO2 reacts with isoprene, initiating chain reactions that produce a hydroxyl radical, which abstracts hydrogen from acetaldehyde to generate the acetyl radical. With initial concentrations of NO, acetaldehyde, and isoprene in a real-world cigarette smoke scenario, these mechanisms can account for the full amount of acetyl radical detected experimentally. This study contributes to the overall understanding of free radical generation in gas phase cigarette smoke.

Relevance:

30.00%

Publisher:

Abstract:

Trichinellosis is a zoonotic disease that is caused by the nematode Trichinella spp. Both European Union regulations and guidelines from the World Organization for Animal Health foresee the possibility of conducting serological surveillance for Trichinella spp. A newly developed commercial enzyme-linked immunosorbent assay (ELISA) was evaluated against 2 existing diagnostic techniques: an in-house ELISA and an in-house Western blot. A total of 875 Trichinella larva-negative samples of pigs and 93 Trichinella larva-positive samples of both naturally and experimentally infected pigs were included in the study. Bayesian modeling techniques were used to correct for the absence of a perfect reference test. The sensitivity and specificity of the commercial ELISA were 97.1-97.8% and 99.5-99.8%, respectively. Sensitivity analysis demonstrated high stability in the models. In a serological surveillance system, ELISA-positive samples should be tested by a confirmatory test. The Western blot is a suitable test for this purpose. Using the results of the models, the sensitivity and specificity of a test protocol combining ELISA and Western blot were 95.9% and 99.9%, respectively. The high sensitivity and specificity were achieved with a lower limit of detection than that of the routine artificial digestion test, suggesting that serological surveillance is a valuable alternative for Trichinella spp. surveillance in pig production.
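
The full Bayesian model in the study corrects for the absence of a perfect reference test; the basic Bayesian update behind a sensitivity or specificity estimate can nonetheless be sketched with a conjugate Beta-binomial computation. The prior parameters and the counts below are illustrative, not the study's data:

```python
def beta_posterior_mean(successes, trials, a=1.0, b=1.0):
    """Posterior mean of a proportion under a Beta(a, b) prior:
    the posterior is Beta(a + successes, b + trials - successes)."""
    return (a + successes) / (a + b + trials)

# e.g. a hypothetical 90 of 93 known-positive samples detected
sensitivity = beta_posterior_mean(90, 93)
```

The sensitivity analysis reported in the paper corresponds to varying the prior parameters a and b and checking that the posterior estimates remain stable.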

Relevance:

30.00%

Publisher:

Abstract:

Trichinellosis is a zoonotic disease in humans caused by Trichinella spp. According to international regulations and guidelines, serological surveillance can be used to demonstrate the absence of Trichinella spp. in a defined domestic pig population. Most enzyme-linked immunosorbent assay (ELISA) tests presently available do not yield 100% specificity, and therefore a complementary test is needed to confirm the diagnosis of any initial ELISA seropositivity. The goal of the present study was to evaluate the sensitivity and specificity of a Western blot assay based on somatic Trichinella spiralis muscle stage (L1) antigen using Bayesian modeling techniques. A total of 295 meat juice and serum samples from pigs negative for Trichinella larvae by artificial digestion, including 74 potentially cross-reactive sera of pigs with other nematode infections, and 93 meat juice samples from pigs infected with Trichinella larvae were included in the study. The diagnostic sensitivity and specificity of the Western blot ranged from 95.8% to 96.0% and from 99.5% to 99.6%, respectively. A sensitivity analysis showed that the model outcomes were hardly influenced by changes in the prior distributions, providing high confidence in the outcomes of the models. This validation study demonstrated that the Western blot is a suitable method to confirm samples that reacted positively in an initial ELISA.

Relevance:

30.00%

Publisher:

Abstract:

Mixed Reality (MR) aims to link virtual entities with the real world and has applications in many domains, such as the military and medicine [JBL+00, NFB07]. In many MR systems, and more precisely in augmented scenes, the application must render the virtual part accurately at the right time. To achieve this, such systems acquire data related to the real world from a set of sensors before rendering the virtual entities. A suitable system architecture should minimize delays to keep the overall system delay (also called end-to-end latency) within the requirements for real-time performance. In this context, we propose a compositional modeling framework for MR software architectures in order to specify, simulate and formally validate the time constraints of such systems. Our approach is first based on a functional decomposition of such systems into generic components. The resulting elements, as well as their typical interactions, give rise to generic representations in terms of timed automata. A whole system is then obtained as a composition of the defined components. To write specifications, a textual language named MIRELA (MIxed REality LAnguage) is proposed along with the corresponding compilation tools. The generated output contains timed automata in UPPAAL format for simulation and verification of the time constraints. These automata may also be used to generate source code skeletons for an implementation on an MR platform. The approach is illustrated first on a small example. A realistic case study is also developed. It is modeled by several timed automata synchronizing through channels and including a large number of time constraints. Both systems have been simulated in UPPAAL and checked against the required behavioral properties.