832 results for accuracy analysis


Relevance:

30.00%

Publisher:

Abstract:

Background: During the last decade, the use of ECG recordings in biometric recognition studies has increased. ECG characteristics make it suitable for subject identification: it is unique, present in all living individuals, and hard to forge. However, despite the great number of approaches found in the literature, no agreement exists on the most appropriate methodology. This study aimed to provide a survey of the techniques used so far in ECG-based human identification. Specifically, a pattern recognition perspective is proposed here, providing a unifying framework for appreciating previous studies and, hopefully, guiding future research. Methods: We searched for papers on the subject from the earliest available date using relevant electronic databases (Medline, IEEEXplore, Scopus, and Web of Knowledge). The following terms were used in different combinations: electrocardiogram, ECG, human identification, biometric, authentication, and individual variability. The electronic sources were last searched on 1 March 2015. Our selection included published research in peer-reviewed journals, book chapters, and conference proceedings. The search was restricted to English-language documents. Results: 100 pertinent papers were found. The number of subjects involved in the journal studies ranges from 10 to 502 and ages range from 16 to 86; male and female subjects are generally both present. The number of analysed leads varies, as do the recording conditions. Identification performance differs widely, as does the verification rate. Many studies refer to publicly available databases (the Physionet ECG database repository) while others rely on proprietary recordings, making them difficult to compare. As a measure of overall accuracy we computed a weighted average of the identification rate, and of the equal error rate in authentication scenarios. The identification rate was 94.95%, and the equal error rate 0.92%. Conclusions: Biometric recognition is a mature field of research.
Nevertheless, the use of physiological signal features, such as ECG traits, needs further improvement. ECG features have the potential to be used in daily activities such as access control and patient handling, as well as in wearable electronics applications. However, some barriers still limit their growth. Further analysis should address the use of single-lead recordings and the study of features that do not depend on the recording sites (e.g. fingers, hand palms). Moreover, it is expected that new techniques will be developed combining fiducial and non-fiducial features in order to catch the best of both approaches. ECG recognition in pathological subjects is also worthy of additional investigation.
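The overall accuracy figure above is a weighted average across studies. A minimal sketch of a sample-size-weighted average, using hypothetical study sizes and rates (the 100 surveyed papers are not reproduced here):

```python
# Sketch: sample-size-weighted average of per-study identification rates.
# Study sizes and rates below are hypothetical, for illustration only.
studies = [
    {"n": 50,  "identification_rate": 0.96},
    {"n": 120, "identification_rate": 0.94},
    {"n": 30,  "identification_rate": 0.99},
]
total_n = sum(s["n"] for s in studies)
weighted_rate = sum(s["n"] * s["identification_rate"] for s in studies) / total_n
print(f"Weighted identification rate: {weighted_rate:.2%}")
```

The same weighting applied to per-study equal error rates yields the aggregate authentication figure.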

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the results of our data mining study of Pb-Zn (lead-zinc) ore assay records from a mining enterprise in Bulgaria. We examined the dataset, removed outliers, visualized the data, and computed dataset statistics. A Pb-Zn cluster data mining model was created for segmentation and prediction of Pb-Zn ore assay data. The model consists of five clusters and a set of DMX queries. We analyzed the cluster content, size, structure, and characteristics. The DMX queries allow browsing and managing the clusters, as well as predicting ore assay records. Testing and validation of the Pb-Zn cluster data mining model were carried out to show its reasonable accuracy before being used in a production environment. The model can be used to adjust the mine's grinding and flotation processing parameters in near real time, which is important for the efficiency of the Pb-Zn ore beneficiation process. ACM Computing Classification System (1998): H.2.8, H.3.3.
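The paper's model runs inside a database engine as DMX queries; as a rough stdlib analogue, the prediction step (a DMX PredictCluster-style call) amounts to assigning an assay record to the nearest cluster centroid. The centroid coordinates here are hypothetical, not taken from the study:

```python
import math

# Sketch: assign a Pb-Zn assay record (Pb %, Zn %) to the nearest of five
# cluster centroids, analogous to a DMX PredictCluster query.
# Centroid coordinates are hypothetical, for illustration only.
centroids = {
    "C1": (0.5, 1.0),
    "C2": (1.5, 2.5),
    "C3": (2.8, 4.0),
    "C4": (4.2, 6.1),
    "C5": (6.0, 8.5),
}

def predict_cluster(record):
    """Return the label of the centroid closest to the record (Euclidean)."""
    return min(centroids, key=lambda c: math.dist(centroids[c], record))

print(predict_cluster((2.6, 3.8)))  # falls nearest to C3
```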

Relevance:

30.00%

Publisher:

Abstract:

Diabetes patients may suffer from an unhealthy lifestyle, long-term treatment, and chronic complications. Decreasing the hospitalization rate is a crucial problem for health care centers. This study combines the bagging method, with decision trees as base classifiers, and cost-sensitive analysis to classify diabetes patients. Real patient data collected from a regional hospital in Thailand were analyzed. Relevant factors were selected and used to construct base-classifier decision tree models to separate diabetes from non-diabetes patients. The bagging method was then applied to improve accuracy. Finally, asymmetric classification cost matrices were used to provide further alternative models for diabetes data analysis.
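The cost-sensitive step can be sketched as choosing the class that minimizes expected cost under an asymmetric cost matrix, given class probabilities from the bagged trees. The matrix values and probabilities below are hypothetical, not taken from the study:

```python
# Sketch: cost-sensitive classification with an asymmetric cost matrix.
# COST[true_class][predicted_class]; a missed diabetes case (false
# negative) is assumed costlier than a false alarm. Values hypothetical.
COST = {
    "diabetes":     {"diabetes": 0, "non-diabetes": 5},
    "non-diabetes": {"diabetes": 1, "non-diabetes": 0},
}

def min_cost_class(probs):
    """probs: class -> probability, e.g. the vote share of bagged trees."""
    expected = {
        pred: sum(probs[true] * COST[true][pred] for true in probs)
        for pred in COST
    }
    return min(expected, key=expected.get)

# A 40% diabetes vote would lose a plain majority vote; the asymmetric
# costs flip the decision toward the costlier-to-miss class.
print(min_cost_class({"diabetes": 0.4, "non-diabetes": 0.6}))
```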

Relevance:

30.00%

Publisher:

Abstract:

Color information is widely used in non-destructive quality assessment of perishable horticultural produce. The present work investigated color changes of pepper (Capsicum annuum L.) samples obtained from the retail system. The effect of storage temperature (10±2°C and 24±4°C) on surface color and firmness was analyzed. Hue spectra were calculated using the sum of saturations. A ColorLite sph850 (400-700nm) spectrophotometer was used as the reference instrument. Dynamic firmness was measured at three locations on the surface: tip cap, middle, and shoulder. Significant effects of storage conditions and surface location on both color and firmness were observed. The hue spectra responded sensitively to the color development of the pepper. A partial least squares (PLS) prediction model was used to estimate dynamic firmness from the hue spectra. Accuracy varied considerably with location: firmness of the tip cap was predicted with the highest accuracy (RMSEP=0.0335), whereas the middle region could not be used for this purpose. Owing to its simplicity and rapid processing, analysis of hue spectra is a promising tool for color evaluation in the postharvest and food industries.
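The accuracy measure quoted (RMSEP) is the root mean square error of prediction over a validation set. A minimal sketch with hypothetical firmness values:

```python
import math

# Sketch: RMSEP, the prediction-accuracy measure quoted for the PLS model.
# Measured and predicted firmness values are hypothetical.
measured  = [0.52, 0.48, 0.61, 0.55]
predicted = [0.50, 0.51, 0.58, 0.56]

rmsep = math.sqrt(
    sum((m - p) ** 2 for m, p in zip(measured, predicted)) / len(measured)
)
print(f"RMSEP = {rmsep:.4f}")
```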

Relevance:

30.00%

Publisher:

Abstract:

This study is an exploratory analysis of an operational measure for resource development strategies, and of the internal organizational contingencies influencing choices of these strategies in charitable nonprofit organizations. The study provides conceptual guidance for advancing understanding of resource development in the nonprofit sector; the statistical findings are, however, inconclusive without further rigorous examination. A three-category typology based on organization technology is initially presented to define the strategies. The three dimensions of internal organizational contingencies explored represent organization identity, professional staff, and boards of directors. Based on relevant literature and key informant interviews, an original survey was administered by mail to a national sample of nonprofit organizations. The survey collected data on indicators of the proposed strategy types and selected contingencies. Factor analysis extracted two of the initial categories in the typology. The first, the Building Resource Development Infrastructure Strategy, encompasses information technology, personnel, legal structures, and policies facilitating fund development. The second encompasses the mission, service niche, and type of service delivery forming the basis for seeking financial support. Linear regressions with each strategy type as the dependent variable identified distinct and common contingencies which may partly explain choices of strategies. Discriminant analysis suggested the potential predictive accuracy of the contingencies. Follow-up case studies with survey respondents provide additional criteria for operationalizing future measures of resource development strategies, and support and expand the analysis of contingencies. The typology offers a beginning framework for defining alternative approaches to resource development, and for exploring organization capacity specific to each approach.
Contingencies that may be integral components of organization capacity are funding, leadership frame, background and experience, staff and volunteer effort, board member support, and relationships in the external environment. Based on these findings, management questions are offered for nonprofit organization stakeholders to consider in planning for resource development. Lessons learned in designing and conducting this study are also provided to guide future related research.

Relevance:

30.00%

Publisher:

Abstract:

Correct specification of the simple location quotients used in regionalizing the national direct requirements table is essential to the accuracy of regional input-output multipliers. The purpose of this research is to examine the relative accuracy of these multipliers when earnings, employment, number of establishments, and payroll data specify the simple location quotients. For each specification type, I derive a column of total output multipliers and a column of total income multipliers. These multipliers are based on the 1987 benchmark input-output accounts of the U.S. economy and 1988-1992 state of Florida data. Error sign tests and Standardized Mean Absolute Deviation (SMAD) statistics indicate that the output multiplier estimates overestimate the output multipliers published by the Department of Commerce, Bureau of Economic Analysis (BEA) for the state of Florida. In contrast, the income multiplier estimates underestimate the BEA's income multipliers. For a given multiplier type, Spearman rank correlation analysis shows that the multiplier estimates and the BEA multipliers have statistically different rank orderings of row elements. The above tests also find no significant differences, in either size or ranking distributions, among the vectors of multiplier estimates.
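A simple location quotient compares a sector's share of regional activity with its national share, and a common regionalization rule scales the national coefficient down only where the region appears to be a net importer (LQ < 1). The employment figures below are hypothetical:

```python
# Sketch: simple location quotient (SLQ) and coefficient regionalization.
# Employment figures are hypothetical, for illustration only.
def slq(region_sector, region_total, nation_sector, nation_total):
    """SLQ_i = (e_i / e) / (E_i / E)."""
    return (region_sector / region_total) / (nation_sector / nation_total)

def regionalize(a_ij, lq_i):
    """r_ij = a_ij * min(LQ_i, 1): scale down only when SLQ_i < 1."""
    return a_ij * min(lq_i, 1.0)

lq = slq(region_sector=8_000, region_total=400_000,
         nation_sector=3_000_000, nation_total=120_000_000)
print(f"SLQ = {lq:.2f}, regionalized coefficient = {regionalize(0.25, lq):.3f}")
```

The same computation can be repeated with earnings, establishment counts, or payroll in place of employment, which is exactly the specification choice the study compares.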

Relevance:

30.00%

Publisher:

Abstract:

The Intoxilyzer 5000 was tested for calibration curve linearity for ethanol vapor concentrations between 0.020 and 0.400 g/210L and showed excellent linearity. Also evaluated were the calibration error using reference solutions outside the allowed concentration range, the response to the same ethanol reference solution at different temperatures between 34 and 38°C, and the response to eleven chemicals potentially found in human breath, ten mixtures of two at a time, and one mixture of four. Potential interferents were chosen on the basis of their infrared signatures and of solution concentration ranges corresponding to the non-lethal blood concentration ranges of various volatile organic compounds reported in the literature. The results of this study indicate that the instrument calibrates with solutions outside the allowed range by up to ±10% of the target value. Headspace dual-column GC analysis with FID detection was used to confirm the concentrations of the solutions. Increasing the temperature of the reference solution from 34 to 38°C resulted in linear increases in the instrument's recorded ethanol readings, with an average increase of 6.25%/°C. Of the eleven chemicals studied, six (isopropanol, toluene, methyl ethyl ketone, trichloroethylene, acetaldehyde, and methanol) could reasonably interfere with the test at non-lethal reported blood concentration ranges; mixtures of these six chemicals showed linear additive results, with a combined effect of as much as a 0.080 g/210L reading (Florida's legal limit) without any ethanol present.
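The 6.25%/°C figure is the slope of a linear fit of readings against solution temperature, expressed relative to the baseline reading. A sketch with hypothetical readings constructed around that reported effect:

```python
# Sketch: least-squares slope of ethanol readings vs. solution temperature,
# expressed as a percentage of the 34 °C baseline reading per degree.
# Readings are hypothetical, built to match the reported ~6.25%/°C.
temps    = [34.0, 35.0, 36.0, 37.0, 38.0]
readings = [0.0800, 0.0850, 0.0900, 0.0950, 0.1000]  # g/210L

n = len(temps)
mean_t = sum(temps) / n
mean_r = sum(readings) / n
slope = (sum((t - mean_t) * (r - mean_r) for t, r in zip(temps, readings))
         / sum((t - mean_t) ** 2 for t in temps))
pct_per_degree = 100 * slope / readings[0]
print(f"{pct_per_degree:.2f}%/°C")
```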

Relevance:

30.00%

Publisher:

Abstract:

The need for elemental analysis techniques to solve forensic problems continues to expand as the samples collected from crime scenes grow in complexity. Laser ablation ICP-MS (LA-ICP-MS) has been shown to provide a high degree of discrimination between samples that originate from different sources. In the first part of this research, two laser ablation ICP-MS systems were compared for the forensic analysis of glass, one using a nanosecond and the other a femtosecond laser source. The results showed that femtosecond LA-ICP-MS did not provide significant improvements in accuracy, precision, or discrimination, although it did provide lower detection limits. In addition, it was determined that even for femtosecond LA-ICP-MS an internal standard should be used to obtain accurate analytical results for glass analyses. In the second part, a method using laser-induced breakdown spectroscopy (LIBS) for the forensic analysis of glass was shown to provide excellent discrimination for a glass set consisting of 41 automotive fragments. The discrimination power was compared with that of two leading elemental analysis techniques, μXRF and LA-ICP-MS, and the results were similar: all methods achieved >99% discrimination, and the pairs found indistinguishable were similar. An extensive data analysis approach for LIBS glass analyses was developed to minimize Type I and II errors, leading to a recommendation of 10 ratios to be used for glass comparisons. Finally, a LA-ICP-MS method for the qualitative analysis and discrimination of gel ink sources was developed and tested on a set of ink samples. In the first discrimination study, qualitative analysis achieved 95.6% discrimination in a blind study of 45 black gel ink samples provided by the United States Secret Service, with a 0.4% false exclusion (Type I) error rate and a 3.9% false inclusion (Type II) error rate.
In the second discrimination study, 99% discrimination power was achieved for a black gel ink pen set consisting of 24 self-collected samples. The two pairs found indistinguishable came from the same source of origin (the same manufacturer and type of pen, purchased in different locations). It was also found that gel ink from the same pen, regardless of age, was indistinguishable, as were gel ink pens (four pens) originating from the same pack.
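Discrimination power in such studies is typically the share of distinguishable pairs among all pairwise comparisons. As a sketch, the 44-pair figure below is back-calculated from the reported 95.6% and is not stated in the abstract:

```python
from math import comb

# Sketch: discrimination power as the fraction of distinguishable pairs
# out of all C(n, 2) pairwise comparisons.
def discrimination(n_samples, indistinguishable_pairs):
    total_pairs = comb(n_samples, 2)
    return (total_pairs - indistinguishable_pairs) / total_pairs

# 45 samples give C(45, 2) = 990 pairs; ~44 indistinguishable pairs
# reproduces the reported 95.6% (back-calculated, hypothetical).
print(f"{discrimination(45, 44):.1%}")
```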

Relevance:

30.00%

Publisher:

Abstract:

This study was a qualitative investigation to ascertain and describe two current issues at the International Community School of Abidjan (ICSA), examine their historical bases, and analyze their impact on the school environment. Two issues emerged during the inquiry phase: (1) the relationship between local-hired and overseas-hired teachers in light of the January 1994 currency devaluation, which polarized the staff by negating a four-year salary scale that had established equity; and (2) the school community's wide variance in the perceived power that the U.S. Embassy holds over school operations, based on its role as ICSA's founding sponsor. A multi-method approach was used in gathering data. An extensive examination of the school's archives was used to reconstruct a historical overview of ICSA. An initial questionnaire was distributed to teachers and administrators at an educational conference to determine the scope of the 1994 devaluation of the West and Central African CFA and its impact on school personnel in West African American-sponsored overseas schools (ASOS). Personal interviews were conducted with the school staff, administration, school board members, and relevant historical participants to determine the principal issues at ICSA at that time. The researcher, an overseas-hired teacher, also used participant observation to collect data. Findings based on these sources were used to analyze the two issues from a historical perspective and to form conclusions. Findings pertaining to the events induced by the French and African governments' decision to implement a currency devaluation in January 1994 are presented in ex post facto chronological narrative form to describe the events that transpired, describe the perceptions of the school personnel involved, examine the final resolution, and interpret these events within a historical framework for analysis. The topic of the U.S.
Embassy and its role at ICSA emerged inductively from open-ended personal interviews conducted over the course of a year. Contradictory perspectives were examined and researched for accuracy and cause. The results of this inquiry present the U.S. Embassy's role at ICSA from a two-sided perspective, examine the historical role of the Embassy, and suggest means by which the role and responsibility of the U.S. Embassy could best be communicated to the school community. The final chapter provides specific actions for mediating problems stemming from these issues, implications for administrators and teachers currently involved in overseas schools or considering the possibility, and suggestions for future inquiries. Examination of a two-tier salary scale for local-hired and overseas-hired teachers generated the following recommendations: movement towards a single salary scale when feasible, clearly stated personnel policies and full disclosure of benefits, a uniform certification standard, professional development programs, and awareness of the impact of this issue on staff morale. Divergent perceptions of and attitudes toward the role of the U.S. Embassy produced these recommendations: limiting the number of Americans on ASOS school boards, open school board meetings, selection of Embassy Administrative Officers who can educate school communities on the exact role of the Embassy, educating parents through outreach activities that communicate American educational philosophy and involve all segments of the international community, and a firm effort on the part of the ASOS to establish the school's autonomy from special interests.

Relevance:

30.00%

Publisher:

Abstract:

Hearing the news of the death of Diana, Princess of Wales, in a traffic accident is taken as an analogue for being a percipient but uninvolved witness to a crime, or a witness to another person's sudden confession to an illegal act. Such an event (known in the literature as a "reception event") has previously been hypothesized to cause one to form a special type of memory commonly known as a "flashbulb memory" (FB) (Brown and Kulik, 1977). FBs are hypothesized to be especially resilient against forgetting, highly detailed including peripheral details, clear, and inspiring of great confidence in the individual as to their accuracy. FBs depend for their formation upon surprise, emotional valence, and the impact, or consequentiality, of the initiating event for the witness, and are thought to be enhanced by frequent rehearsal. FBs are very important in the context of criminal investigation and litigation, in that investigators and jurors usually place great store in witnesses who claim a clear and complete recollection of an event and who express this confidently, regardless of their actual accuracy. The lives, or at least the freedom, of criminal defendants, and the fortunes of civil litigants, may therefore hang on the testimony of witnesses professing to have FBs. In this study, with a large and diverse sample (N = 305), participants were surveyed within 2-4 days of hearing of the fatal accident, and again at intervals of 2 and 4 weeks and 6, 12, and 18 months. Contrary to the FB hypothesis, I found that participants' FBs degraded over time, beginning at least as early as two weeks post-event. At about 12 months the memory trace stabilized, resisting further degradation. Repeated interviewing did not have any negative effect upon accuracy, contrary to concerns in the literature.
Analysis by correlation and regression indicated no effect or predictive power of participant age, emotionality, confidence, or student status on accuracy of recall; nor was participant confidence in accuracy predicted by emotional impact, as hypothesized. The results also indicate that, contrary to the notions of investigators and jurors, witnesses become more inaccurate over time regardless of their confidence in their memories, even for highly emotional events.

Relevance:

30.00%

Publisher:

Abstract:

The need for elemental analysis of biological matrices such as bone, teeth, and plant matter for sourcing purposes has emerged within forensic and geochemical laboratories. Trace elemental analysis for the comparison of materials such as glass by inductively coupled plasma mass spectrometry (ICP-MS) and laser ablation ICP-MS has been shown to offer a high degree of discrimination between different manufacturing sources. Unit-resolution ICP-MS instruments may suffer from polyatomic interferences, including 40Ar16O+, 40Ar16O1H+, and 40Ca16O+, that affect iron measurement at trace levels. Iron is an important element in the analysis of glass and is also of interest in the analysis of several biological matrices. A comparison of the analytical performance of two different ICP-MS systems for iron analysis in glass is presented, determining the method detection limits (MDLs), accuracy, and precision of the measurement. Acid digestion and laser ablation methods are also compared. Iron polyatomic interferences were reduced or resolved by using a dynamic reaction cell and high-resolution ICP-MS. MDLs as low as 0.03 μg g-1 and 0.14 μg g-1 were achieved for laser ablation and solution-based analyses, respectively. The use of helium as a carrier gas improved the detection limits for both iron isotopes (56Fe and 57Fe) in medium resolution on the HR-ICP-MS and with a dynamic reaction cell (DRC) coupled to a quadrupole ICP-MS system. The development and application of robust analytical methods for the quantification of trace elements in biological matrices has led to a better understanding of the potential utility of these measurements in forensic chemical analyses. Standard reference materials (SRMs) were used in the development of an analytical method using HR-ICP-MS and LA-HR-ICP-MS that was subsequently applied to the analysis of real samples. Bone, teeth, and ashed marijuana samples were analyzed with the developed method.
Elemental analysis of bone samples from 12 different individuals provided discrimination between individuals when femur and humerus bones were considered separately. Discrimination of 14 teeth samples based on elemental composition was achieved, with the exception of one case in which samples from the same individual were not associated with each other. Discrimination of 49 different ashed plant (cannabis) samples was also achieved using the developed method.
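One common way to estimate an MDL (not necessarily the exact procedure used in this work) is three times the standard deviation of replicate blank measurements divided by the calibration slope. The blank signals and slope below are hypothetical:

```python
import statistics

# Sketch: MDL as 3 * s(blank) / calibration slope, a common estimator
# (the paper's exact procedure may differ). All numbers are hypothetical.
blank_signals = [1.02, 0.98, 1.05, 0.97, 1.01, 1.00, 0.99]  # counts
calibration_slope = 120.0  # counts per (ug/g)

mdl = 3 * statistics.stdev(blank_signals) / calibration_slope
print(f"MDL ~ {mdl:.5f} ug/g")
```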

Relevance:

30.00%

Publisher:

Abstract:

New designer drugs constantly emerge onto the illicit drug market, and it is often difficult to validate and maintain comprehensive analytical methods for their accurate detection. Generally, toxicology laboratories use a screening method, such as an immunoassay, for the presumptive identification of drugs of abuse. When a positive result occurs, confirmatory methods, such as gas chromatography (GC) or liquid chromatography (LC) coupled with mass spectrometry (MS), are required for more sensitive and specific analyses. In recent years, the need to study the activities of these compounds in screening assays, as well as to develop confirmatory techniques to detect them in biological specimens, has been recognized. Severe intoxications and fatalities have been encountered with emerging designer drugs, presenting analytical challenges for the detection and identification of such novel compounds. The first major task of this research was to evaluate the performance of commercially available immunoassays to determine whether designer drugs were cross-reactive. The second major task was to develop and validate a confirmatory LC-MS method to identify and quantify these designer drugs in biological specimens. Cross-reactivity towards the cathinone derivatives was found to be minimal. Several other phenethylamines demonstrated cross-reactivity at low concentrations, but the results were consistent with those published by the assay manufacturer or reported in the literature. Current immunoassay-based screening methods may not be ideal for presumptively identifying most designer drugs, including the "bath salts." For this reason, an LC-MS based confirmatory method was developed for 32 compounds, including eight cathinone derivatives, with limits of quantification in the range of 1-10 ng/mL. The method was fully validated for selectivity, matrix effects, stability, recovery, precision, and accuracy.
To compare the screening and confirmatory techniques, several human specimens were analyzed, demonstrating the importance of using a specific analytical method, such as LC-MS, to detect designer drugs in serum, since immunoassays lack cross-reactivity with the novel compounds. Overall, minimal cross-reactivity was observed, highlighting the conclusion that these presumptive screens cannot detect many of the designer drugs and that a confirmatory technique, such as LC-MS, is required for comprehensive forensic toxicological analysis of designer drugs.

Relevance:

30.00%

Publisher:

Abstract:

Protecting confidential information from improper disclosure is a fundamental security goal. While encryption and access control are important tools for ensuring confidentiality, they cannot prevent an authorized system from leaking confidential information to its publicly observable outputs, whether inadvertently or maliciously. Hence, secure information flow aims to provide end-to-end control of information flow. Unfortunately, the traditionally adopted policy of noninterference, which forbids all improper leakage, is often too restrictive. Theories of quantitative information flow address this issue by quantifying the amount of confidential information leaked by a system, with the goal of showing that it is intuitively "small" enough to be tolerated. Given such a theory, it is crucial to develop automated techniques for calculating the leakage in a system. This dissertation is concerned with program analysis for calculating the maximum leakage, or capacity, of confidential information in the context of deterministic systems and under three proposed entropy measures of information leakage: Shannon entropy leakage, min-entropy leakage, and g-leakage. In this context, it turns out that calculating the maximum leakage of a program reduces to counting the number of possible outputs that it can produce. The new approach introduced in this dissertation is to determine two-bit patterns, the relationships among pairs of bits in the output; for instance, we might determine that two bits must be unequal. By counting the number of solutions to the two-bit patterns, we obtain an upper bound on the number of possible outputs, and hence a bound on the maximum leakage. We first describe a straightforward computation of the two-bit patterns using an automated prover. We then show a more efficient implementation that uses an implication graph to represent the two-bit patterns.
It efficiently constructs the graph through the use of an automated prover, random executions, STP counterexamples, and deductive closure. The effectiveness of our techniques, in terms of both efficiency and accuracy, is shown through a number of case studies found in recent literature.
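The key reduction, maximum leakage of a deterministic program bounded by log2 of its number of feasible outputs, can be illustrated by brute force on a toy program (the dissertation's contribution is bounding this count via two-bit patterns rather than enumerating):

```python
import math

# Sketch: for a deterministic program, channel capacity is at most
# log2(#feasible outputs). We count the outputs of a toy 8-bit program
# by enumeration; the two-bit-pattern analysis bounds this count without
# enumerating. The toy program is hypothetical.
def program(secret):
    return secret & 0b11100000  # output reveals only the top 3 bits

outputs = {program(s) for s in range(256)}
max_leakage_bits = math.log2(len(outputs))
print(f"{len(outputs)} feasible outputs -> at most "
      f"{max_leakage_bits:.0f} bits leaked")
```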

Relevance:

30.00%

Publisher:

Abstract:

Whereas previous research has demonstrated that trait ratings of faces at encoding lead to enhanced recognition accuracy compared with feature ratings, this set of experiments examines whether ratings given after encoding and just prior to recognition influence face recognition accuracy. In Experiment 1, subjects who made feature ratings just prior to recognition were significantly less accurate than subjects who made no ratings or trait ratings. In Experiment 2, ratings were manipulated at both encoding and retrieval. The retrieval effect was smaller and nonsignificant, but a combined probability analysis showed that it was significant when the results of both experiments were considered jointly. In a third experiment, exposure duration at retrieval, a potentially confounding factor in Experiments 1 and 2, had a nonsignificant effect on recognition accuracy, suggesting that it probably does not explain the earlier results. These experiments demonstrate that face recognition accuracy can be influenced by processing instructions at retrieval.
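A combined probability analysis of this kind can be sketched with a Stouffer-style combination of one-tailed p-values across the two experiments. The abstract does not name the exact method used, and the p-values below are hypothetical:

```python
from math import sqrt
from statistics import NormalDist

# Sketch: Stouffer's method for combining one-tailed p-values across
# experiments. The method choice and the p-values are assumptions; the
# abstract only reports that the combined analysis was significant.
def stouffer(p_values):
    nd = NormalDist()
    z = sum(nd.inv_cdf(1 - p) for p in p_values) / sqrt(len(p_values))
    return 1 - nd.cdf(z)  # combined one-tailed p-value

# One clearly significant and one nonsignificant result can still be
# jointly significant when combined.
combined = stouffer([0.03, 0.20])
print(f"combined p = {combined:.3f}")
```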

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to determine which of two methods is more appropriate for teaching pitch discrimination to Grade 6 choral students in order to improve sight-singing note accuracy. The study consisted of three phases: pre-testing, instruction, and post-testing. During the four-week study, the experimental group received training using the Kodály method while the control group received training using the traditional method. The pre- and post-tests were evaluated by three trained musicians. The analysis of the data used an independent t-test and a paired t-test, with teaching method (experimental versus control) as the factor. The quantitative results suggest that the experimental subjects, who received Kodály instruction, showed significantly greater improvement in pitch accuracy at post-treatment than the control group, indicating that the Kodály method was more effective in producing accurate pitch in sight-singing.
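The independent t-test comparison can be sketched with a pooled-variance t statistic over per-group scores. The scores below are hypothetical; the abstract does not report the raw data:

```python
import statistics

# Sketch: pooled-variance independent-samples t statistic comparing the
# experimental (Kodaly) and control (traditional) groups.
# Scores are hypothetical; a full analysis would also report a p-value.
def t_statistic(a, b):
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    se = (pooled * (1 / na + 1 / nb)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / se

kodaly      = [8, 9, 7, 9, 8, 10]  # hypothetical post-test accuracy scores
traditional = [6, 7, 5, 6, 7, 6]
print(f"t = {t_statistic(kodaly, traditional):.2f}")
```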