832 results for accuracy analysis


Relevance: 30.00%

Publisher:

Abstract:

Television (TV) reaches more people than any other medium, which makes it an important source of health information. Since TV ads often present information obliquely, this study investigated the implied health messages found in food and nutrition TV ads. The goals were to determine the proportion of food and nutrition ads among all TV advertising and to use content analysis to identify their implied messages and health claims. A randomly selected sample of TV ads was collected over a 28-day period beginning May 8, 1987. The sample contained 3547 ads; 725 (20%) were food-related. All were analyzed. About 10% of food-related TV ads contained a health claim. Twenty-five representative ads from the 725 food ads were also reviewed by 10 dietitians to test the reliability of the instrument. Although the dietitians agreed on whether a health claim existed in a televised food ad, their agreement was poor when evaluating the accuracy of the claim. The number of food-related ads dropped significantly on Saturday, but the number of alcohol ads rose sharply on Saturday and Sunday. Snack ads were shown more often on Thursday, but snack commercials were also numerous on Saturday morning and afternoon, as were cereal ads. Ads for snack foods accounted for the greatest proportion of ads (20%), while fast food accounted for only 7%. Alcohol constituted about 9% of all food and nutrition ads.
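The reliability check described above hinges on inter-rater agreement. A minimal sketch of how such agreement can be quantified for pairs of raters, using hypothetical yes/no judgements rather than the study's data, with simple percent agreement and Cohen's kappa:

```python
from itertools import combinations

# Hypothetical yes/no judgements ("does this ad contain a health claim?")
# from several raters over the same set of ads; not the study's actual data.
ratings = {
    "rater_1": [1, 1, 0, 1, 0, 0, 1, 1],
    "rater_2": [1, 1, 0, 1, 0, 1, 1, 1],
    "rater_3": [1, 0, 0, 1, 0, 0, 1, 1],
}

def percent_agreement(a, b):
    """Fraction of items on which two raters give the same judgement."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement for two raters on a binary judgement."""
    po = percent_agreement(a, b)
    p_yes = (sum(a) / len(a)) * (sum(b) / len(b))
    p_no = (1 - sum(a) / len(a)) * (1 - sum(b) / len(b))
    pe = p_yes + p_no
    return (po - pe) / (1 - pe)

for (name_a, a), (name_b, b) in combinations(ratings.items(), 2):
    print(name_a, name_b,
          f"agreement={percent_agreement(a, b):.2f}",
          f"kappa={cohens_kappa(a, b):.2f}")
```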

Relevance: 30.00%

Publisher:

Abstract:

Engineering analysis of geometric models has been the main, if not the only, credible tool used by engineers and scientists to solve physical boundary problems. New high-speed computers have facilitated the accuracy and validation of the expected results. In practice, an engineering analysis consists of two parts: the design of the model, and the analysis of the geometry with the boundary conditions and constraints imposed on it. Numerical methods are used to solve a large number of physical boundary problems independent of the model geometry. The time spent in the computational process is related to the imposed boundary conditions and to how well conformed the geometry is. Any geometric model that contains gaps or open lines is considered an imperfect geometric model, and major commercial solver packages are incapable of handling such inputs. Other packages apply various methods, such as patching or zippering, to resolve these problems; but the final resolved geometry may differ from the original, and the changes may be unacceptable. The study proposed in this dissertation is based on a new technique for processing models with geometric imperfections without the need to repair or change the original geometry. An algorithm is presented that is able to analyze the imperfect geometric model with the imposed boundary conditions, using a meshfree method and a distance-field approximation to the boundaries. Experiments are proposed to analyze the convergence of the algorithm on imperfect model geometries, and the results will be compared with those for the same models with perfect geometries. Plots of the results will be presented for further analysis and conclusions about the convergence of the algorithm.
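As a rough illustration of the distance-field idea, not the dissertation's actual algorithm, the sketch below approximates the distance from an evaluation point to a boundary that is available only as sampled points (possibly with gaps) and uses it in a meshfree-style solution structure; all names and values are hypothetical:

```python
import numpy as np

# Hypothetical boundary sampled as points, possibly with gaps; an exact
# parametric description of the geometry is not required.
boundary_pts = np.array([[np.cos(t), np.sin(t)]
                         for t in np.linspace(0.0, 1.8 * np.pi, 60)])

def distance_field(x, pts):
    """Approximate unsigned distance from point x to the sampled boundary."""
    return np.min(np.linalg.norm(pts - x, axis=1))

def trial_solution(x, u_boundary, u_free):
    """Solution structure u = u_b + d(x) * u_f: the prescribed boundary value
    u_boundary is recovered where the distance field vanishes, while the
    meshfree part u_free acts in the interior."""
    d = distance_field(x, boundary_pts)
    return u_boundary + d * u_free(x)

# Example: evaluate the structure at an interior point with a dummy free field.
u = trial_solution(np.array([0.2, 0.1]), u_boundary=1.0,
                   u_free=lambda x: x[0] ** 2 + x[1] ** 2)
print(u)
```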

Relevance: 30.00%

Publisher:

Abstract:

One of the major problems in the analysis of beams with a Moment of Inertia varying along their length is to find the Fixed End Moments, Stiffness, and Carry-Over Factors. In order to determine the Fixed End Moments, it is necessary to treat the non-prismatic member as composed of a large number of small sections with constant Moment of Inertia, and to find the M/EI values for each individual section. This process takes a great deal of time from Designers and Structural Engineers. The object of this thesis is to design a computer program that simplifies this repetitive process, obtaining the Final Moments and Shears in continuous non-prismatic Beams rapidly and effectively. For this purpose, the Column Analogy and Moment Distribution Methods of Professor Hardy Cross have been utilized as the principles behind the methodical computer solutions. The program has been specifically designed to analyze continuous beams of up to four spans of any length, composed of symmetrical members with rectangular cross sections and rectilinear variation of the Moment of Inertia. Any combination of uniform and concentrated loads can be considered. Finally, sample problems are solved with the new Computer Program and with traditional methods to determine the accuracy and applicability of the Program.
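The column-analogy computation that such a program automates can be sketched numerically: divide the member into many short sections of constant EI, accumulate the elastic weights dx/EI, and derive the stiffness and carry-over factor from the analogous column's area, centroid, and moment of inertia. A minimal Python sketch with made-up numbers (the thesis's program is not reproduced here):

```python
import numpy as np

def stiffness_and_carry_over(length, EI_of_x, n=1000):
    """Column-analogy estimate of the rotational stiffness at end A and the
    carry-over factor A->B for a member whose EI varies along its length.
    The member is split into n short sections of (locally) constant EI."""
    x = np.linspace(0.0, length, n + 1)
    xm = 0.5 * (x[:-1] + x[1:])          # section midpoints
    dx = np.diff(x)
    w = dx / EI_of_x(xm)                 # elastic weight 1/EI of each section

    area = w.sum()                       # analogous-column area
    x_bar = (w * xm).sum() / area        # centroid of the analogous column
    inertia = (w * (xm - x_bar) ** 2).sum()

    c_a, c_b = x_bar, length - x_bar     # centroid distances to ends A and B
    k_a = 1.0 / area + c_a ** 2 / inertia                 # stiffness at A
    carry_ab = (c_a * c_b / inertia - 1.0 / area) / k_a   # carry-over A -> B
    return k_a, carry_ab

# Prismatic check (EI constant): expect K = 4EI/L and a carry-over of 0.5.
print(stiffness_and_carry_over(6.0, lambda x: np.full_like(x, 200.0)))

# Linearly varying EI, e.g. a rectangular section whose depth tapers.
print(stiffness_and_carry_over(6.0, lambda x: 200.0 + 40.0 * x))
```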

Relevance: 30.00%

Publisher:

Abstract:

Correct specification of the simple location quotients used to regionalize the national direct-requirements table is essential to the accuracy of regional input-output multipliers. The purpose of this research is to examine the relative accuracy of these multipliers when earnings, employment, number of establishments, or payroll data are used to specify the simple location quotients. For each specification type, I derive a column of total output multipliers and a column of total income multipliers. These multipliers are based on the 1987 benchmark input-output accounts of the U.S. economy and on 1988-1992 data for the state of Florida. Error sign tests and Standardized Mean Absolute Deviation (SMAD) statistics indicate that the output multiplier estimates overestimate the output multipliers published by the Department of Commerce, Bureau of Economic Analysis (BEA) for the state of Florida. In contrast, the income multiplier estimates underestimate the BEA's income multipliers. For a given multiplier type, Spearman rank-correlation analysis shows that the multiplier estimates and the BEA multipliers have statistically different rank orderings of row elements. The above tests also find no significant differences, in either size or ranking distributions, among the vectors of multiplier estimates.
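A minimal sketch of the regionalization step, assuming hypothetical industry data rather than the Florida figures: the simple location quotient for each supplying industry scales the national direct-requirements coefficients, and the output multipliers are then the column sums of the Leontief inverse.

```python
import numpy as np

# Hypothetical data: national direct-requirements (technical coefficients)
# matrix and regional vs. national activity measures (e.g. earnings) by industry.
A_national = np.array([[0.10, 0.05, 0.02],
                       [0.20, 0.15, 0.10],
                       [0.05, 0.10, 0.08]])
regional_activity = np.array([30.0, 10.0, 5.0])       # e.g. regional earnings
national_activity = np.array([300.0, 200.0, 100.0])   # national earnings

# Simple location quotient for each supplying industry i.
slq = (regional_activity / regional_activity.sum()) / \
      (national_activity / national_activity.sum())

# Regionalize: scale each supplying row down where the region is
# under-represented (SLQ < 1); never scale up.
A_regional = A_national * np.minimum(slq, 1.0)[:, None]

# Type I total output multipliers: column sums of the Leontief inverse.
leontief_inverse = np.linalg.inv(np.eye(3) - A_regional)
output_multipliers = leontief_inverse.sum(axis=0)
print(output_multipliers)
```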

Relevance: 30.00%

Publisher:

Abstract:

The purpose of this study was to gather normative data on the phonological system of bilingual Creole-English children aged three and five and to compare their performance to norms for English-speaking children. The forty participants lived in Miami and represented low socio-economic groups. Participants were assessed using the Goldman-Fristoe Test of Articulation-2 and a Haitian Creole Picture Naming Assessment. The results indicated that the percentage of correct phonemes in Creole (M=91.6) was not significantly different from the correct production of the same phonemes in English (M=92.8). Further analysis revealed that in Creole the accuracy of all phonemes was higher for the five-year-olds (M=90.8) than for the three-year-olds (M=85). In English, the five-year-olds also performed better than the three-year-old participants. These findings reveal patterns of phonological development in bilingual Creole/English children similar to patterns reported for other bilingual children. This information is essential for the evaluation and treatment of this population.

Relevance: 30.00%

Publisher:

Abstract:

Classification procedures, including atmospheric correction of satellite images as well as classification performance using calibration and validation at different levels, have been investigated in the context of a coarse land-cover classification scheme for the Pachitea Basin. Two different correction methods were tested against no correction in terms of bringing reflectance towards a common response for pseudo-invariant features (PIFs). The accuracy of classifications derived from each of the three methods was then assessed in a discriminant analysis using cross-validation at the pixel, polygon, region, and image levels. Results indicate that only regression-adjusted images using PIFs show no significant difference between images in any of the bands. A comparison of classifications at different levels suggests, though, that at the pixel, polygon, and region levels the accuracy of the classifications does not differ significantly between corrected and uncorrected images. Spatial patterns of land cover were analyzed in terms of colonization history, infrastructure, suitability of the land, and land ownership. The actual use of the land is driven mainly by the ability to access the land and markets, as is evident in the distribution of land cover as a function of distance to rivers and roads. When all rivers and roads are considered, the threshold distance at which agro-pastoral land cover switches from over-represented to under-represented is about 1 km. Recommendations for best land use seem not to affect the choice of land use. Differences in the abundance of land cover between watersheds are more pronounced than differences between colonist and indigenous groups.
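The regression adjustment using PIFs can be sketched as a per-band linear fit between the subject image and a reference image over the pseudo-invariant pixels; the numbers below are hypothetical and the code only illustrates the general technique:

```python
import numpy as np

# Hypothetical per-band reflectance of pseudo-invariant features (PIFs)
# extracted from a reference image and from a subject image to be adjusted.
pif_reference = np.array([0.12, 0.18, 0.25, 0.33, 0.41])
pif_subject   = np.array([0.10, 0.15, 0.22, 0.30, 0.37])

# Fit a per-band linear relation subject -> reference over the PIF pixels ...
gain, offset = np.polyfit(pif_subject, pif_reference, deg=1)

# ... and apply it to the whole subject band so that invariant features
# show a common response across images.
subject_band = np.random.default_rng(0).uniform(0.05, 0.45, size=(4, 4))
adjusted_band = gain * subject_band + offset
print(gain, offset)
```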

Relevance: 30.00%

Publisher:

Abstract:

Until recently, the use of biometrics was restricted to high-security environments and criminal identification applications, for economic and technological reasons. In recent years, however, biometric authentication has become part of people's daily lives. The large-scale use of biometrics has shown that users within a system may achieve different degrees of accuracy: some people may have trouble authenticating, while others may be particularly vulnerable to imitation. Recent studies have investigated and identified these types of users, naming them after animals: Sheep, Goats, Lambs, Wolves, Doves, Chameleons, Worms, and Phantoms. The aim of this study is to evaluate the existence of these user types in a fingerprint database and to propose a new way of investigating them, based on the performance of verification between subjects' samples. After introducing some basic concepts of biometrics and fingerprints, we present the biometric menagerie and how to evaluate it.
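One common way to look for menagerie members is to compute per-user statistics from genuine and impostor verification scores and flag users in the tails of the population. The sketch below uses simulated scores and simple quantile thresholds; it illustrates the general idea, not the evaluation protocol proposed in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical verification scores for a small fingerprint database:
# genuine_scores[u]       -> scores of user u matched against their own samples;
# impostor_as_target[u]   -> scores of other users' samples claimed as u;
# impostor_as_attacker[u] -> scores of u's samples claimed as other users.
users = [f"user_{i}" for i in range(5)]
genuine_scores = {u: rng.normal(0.80, 0.05, 20) for u in users}
impostor_as_target = {u: rng.normal(0.30, 0.05, 20) for u in users}
impostor_as_attacker = {u: rng.normal(0.30, 0.05, 20) for u in users}

def menagerie(user):
    """Flag classic menagerie types from per-user mean scores, using simple
    quantile thresholds over the whole population (one common convention)."""
    g = np.mean(genuine_scores[user])
    lt = np.mean(impostor_as_target[user])
    la = np.mean(impostor_as_attacker[user])
    g_low = np.quantile([np.mean(s) for s in genuine_scores.values()], 0.25)
    i_high = np.quantile([np.mean(s) for s in impostor_as_target.values()], 0.75)
    labels = []
    if g < g_low:
        labels.append("goat")      # poor genuine match scores
    if lt > i_high:
        labels.append("lamb")      # easy to imitate
    if la > i_high:
        labels.append("wolf")      # good at imitating others
    return labels or ["sheep"]     # otherwise, a well-behaved user

for u in users:
    print(u, menagerie(u))
```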

Relevance: 30.00%

Publisher:

Abstract:

This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing the PEWMA model, a fault tree is developed based on the Texas City Refinery incident, which occurred in 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is carried out by evaluating the top-event probability at each Bayesian updating step by Monte Carlo sampling from the posterior failure-rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. In order to test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC-sampling-based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions. Increasing the likelihood variance mitigates random measurement errors, but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques, as it allows for the incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimating parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
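For reference, the conventional conjugate Poisson-Gamma updating that PEWMA is compared against can be written in a few lines; the prior parameters and failure counts below are illustrative, not taken from the Texas City fault tree:

```python
# Conjugate Poisson-Gamma update: the failure rate lambda has a Gamma prior,
# and each observation period contributes a count of failures and an exposure
# time. Numbers are made up for illustration.
alpha, beta = 1.0, 100.0          # Gamma(alpha, beta) prior: mean alpha/beta

observations = [                  # (failures observed, exposure time in hours)
    (0, 500.0),
    (1, 750.0),
    (0, 600.0),
    (2, 900.0),
]

for failures, exposure in observations:
    alpha += failures             # Poisson likelihood: shape grows with counts
    beta += exposure              # rate grows with accumulated exposure
    print(f"posterior mean failure rate = {alpha / beta:.2e} per hour")
```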

Relevance: 30.00%

Publisher:

Abstract:

Coral reef maps at various spatial scales and extents are needed for mapping, monitoring, modelling, and management of these environments. High spatial resolution satellite imagery (pixel <10 m), integrated with field survey data and processed with various mapping approaches, can provide these maps. These approaches have been applied accurately to single reefs (10-100 km²), covering one high spatial resolution scene from which a single thematic layer (e.g. benthic community) is mapped. This article demonstrates how a hierarchical mapping approach can be applied to coral reefs from individual-reef to reef-system scales (10-1000 km²) using object-based image classification of high spatial resolution images guided by ecological and geomorphological principles. The approach is demonstrated for three individual reefs (10-35 km²) in Australia, Fiji, and Palau, and for three complex reef systems (300-600 km²), one in the Solomon Islands and two in Fiji. Archived high spatial resolution images were pre-processed, and mosaics were created for the reef systems. Georeferenced benthic photo-transect surveys were used to acquire cover information. Field and image data were integrated using an object-based image analysis approach that resulted in a hierarchically structured classification. Objects were assigned class labels based on the dominant benthic cover type, on location-relevant ecological and geomorphological principles, or on a combination thereof. This generated a hierarchical sequence of reef maps with increasing complexity of benthic thematic information: 'reef', 'reef type', 'geomorphic zone', and 'benthic community'. The overall accuracy of the 'geomorphic zone' classification for each of the six study sites was 76-82% using 6-10 mapping categories. For the 'benthic community' classification, the overall accuracy was 52-75%, with individual reefs having 14-17 categories and reef systems 20-30 categories. We show that an object-based classification of high spatial resolution imagery, guided by field data and ecological and geomorphological principles, can produce consistent, accurate benthic maps at four hierarchical spatial scales for coral reefs of various sizes and complexities.
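The overall accuracy figures quoted above come from comparing mapped classes against field observations at validation points. A minimal sketch with invented labels (not the study's data):

```python
import numpy as np

# Hypothetical check of a 'geomorphic zone' map against georeferenced field
# points: each entry pairs the mapped class with the class observed in the
# benthic photo-transect surveys.
mapped    = np.array(["reef flat", "reef crest", "lagoon", "reef flat",
                      "reef slope", "lagoon", "reef crest", "reef slope"])
reference = np.array(["reef flat", "reef crest", "lagoon", "reef crest",
                      "reef slope", "lagoon", "lagoon", "reef slope"])

# Overall accuracy: fraction of validation points whose mapped class matches
# the field observation (the statistic behind the 76-82% figures above).
overall_accuracy = np.mean(mapped == reference)
print(f"overall accuracy = {overall_accuracy:.0%}")

# Per-class producer's accuracy from a simple confusion tally.
for cls in np.unique(reference):
    mask = reference == cls
    print(cls, f"producer's accuracy = {np.mean(mapped[mask] == cls):.0%}")
```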

Relevance: 30.00%

Publisher:

Abstract:

The auditory evoked N1m-P2m response complex presents a challenging case for MEG source modelling, because symmetrical, phase-locked activity occurs in the hemispheres both contralateral and ipsilateral to stimulation. Beamformer methods, in particular, can be susceptible to localisation bias and spurious sources under these conditions. This study explored the accuracy and efficiency of event-related beamformer source models for auditory MEG data under typical experimental conditions: monaural and diotic stimulation, and whole-head beamformer analysis compared to a half-head analysis using only sensors from the hemisphere contralateral to stimulation. Event-related beamformer localisations were also compared with more traditional single-dipole models. At the group level, the event-related beamformer performed as well as the single-dipole models in terms of accuracy for both the N1m and the P2m, and in terms of efficiency (number of successful source models) for the N1m. The results yielded by the half-head analysis did not differ significantly from those produced by the traditional whole-head analysis. Any localisation bias caused by the presence of correlated sources is minimal in the context of the inter-individual variability in source localisations. In conclusion, event-related beamformers provide a useful alternative to equivalent-current dipole models for the localisation of auditory evoked responses.
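For orientation, most beamformer source models rest on linearly constrained minimum-variance (LCMV) weights of the form w = C^-1 L (L^T C^-1 L)^-1. The sketch below uses random stand-in data and is not the event-related beamformer pipeline evaluated in the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_samples = 100, 2000

# Stand-in MEG data (sensors x samples) and a stand-in lead field (forward
# solution) for one source location with a fixed orientation.
data = rng.normal(size=(n_sensors, n_samples))
lead_field = rng.normal(size=(n_sensors, 1))

# Sensor covariance, lightly regularized to keep the inverse well conditioned.
cov = np.cov(data)
cov_inv = np.linalg.inv(cov + 0.05 * np.trace(cov) / n_sensors * np.eye(n_sensors))

# LCMV weights: unit gain at the source location, minimum output variance.
weights = cov_inv @ lead_field / (lead_field.T @ cov_inv @ lead_field)

# Source time course ("virtual sensor") at this location.
source_timecourse = weights.T @ data
print(source_timecourse.shape)
```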

Relevance: 30.00%

Publisher:

Abstract:

Many studies have shown the considerable potential of remote-sensing-based methods for deriving estimates of lake water quality. However, the reliable application of these methods across time and space is complicated by the diversity of lake types, differences in sensor configuration, and the multitude of different algorithms proposed. This study tested one operational and 46 empirical algorithms, sourced from the peer-reviewed literature, that have individually shown potential for estimating lake water quality properties in the form of chlorophyll-a (algal biomass) and Secchi disc depth (SDD) (water transparency) in independent studies. Nearly half (19) of the algorithms were unsuitable for use with the remote-sensing data available for this study. The remaining 28 were assessed using the Terra/Aqua satellite archive to identify the best-performing algorithms in terms of accuracy and transferability within the period 2001–2004 in four test lakes, namely Vänern, Vättern, Geneva, and Balaton. These lakes represent the broad continuum of large European lake types, varying in terms of eco-region (latitude/longitude and altitude), morphology, mixing regime, and trophic status. All algorithms were tested for each lake separately and for the lakes combined, to assess the degree of their applicability in ecologically different sites. None of the algorithms assessed in this study showed promise when all four lakes were combined into a single data set, and most algorithms performed poorly even for specific lake types. A chlorophyll-a retrieval algorithm originally developed for eutrophic lakes showed the most promising results (R² = 0.59) in oligotrophic lakes. Two SDD retrieval algorithms, one originally developed for turbid lakes and the other for lakes with various characteristics, showed promising results in relatively less turbid lakes (R² = 0.62 and 0.76, respectively). The results presented here highlight the complexity associated with remotely sensed lake water quality estimates and the high degree of uncertainty due to various limitations, including lake water optical properties and the choice of methods.
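Many of the empirical algorithms tested are band-ratio regressions of the following general form; the sketch uses invented reflectance and chlorophyll values and simply illustrates how a retrieval and its R² might be computed:

```python
import numpy as np

# Hypothetical matched pairs of satellite band ratios and in-situ chlorophyll-a
# for one lake; a typical empirical algorithm relates a band ratio to
# chlorophyll through a fitted regression.
band_ratio = np.array([0.55, 0.63, 0.71, 0.80, 0.92, 1.05, 1.20])  # e.g. B1/B2
chl_insitu = np.array([2.1, 2.9, 3.8, 5.2, 7.5, 10.4, 14.8])       # ug/L

# Fit a log-linear model log10(chl) = a * ratio + b (one common empirical form).
a, b = np.polyfit(band_ratio, np.log10(chl_insitu), deg=1)
chl_pred = 10 ** (a * band_ratio + b)

# R^2 of the retrieval, the statistic used above to compare algorithms.
ss_res = np.sum((chl_insitu - chl_pred) ** 2)
ss_tot = np.sum((chl_insitu - np.mean(chl_insitu)) ** 2)
print("R^2 =", 1 - ss_res / ss_tot)
```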

Relevance: 30.00%

Publisher:

Abstract:

The need for elemental analysis of biological matrices such as bone, teeth, and plant matter for sourcing purposes has emerged within forensic and geochemical laboratories. Trace elemental analysis for the comparison of materials such as glass by inductively coupled plasma mass spectrometry (ICP-MS) and laser ablation ICP-MS has been shown to offer a high degree of discrimination between different manufacturing sources. Unit-resolution ICP-MS instruments may suffer from polyatomic interferences, including 40Ar16O+, 40Ar16O1H+, and 40Ca16O+, that affect iron measurement at trace levels. Iron is an important element in the analysis of glass and is also of interest for the analysis of several biological matrices. A comparison of the analytical performance of two different ICP-MS systems for iron analysis in glass is presented, determining the method detection limits (MDLs), accuracy, and precision of the measurement. Acid digestion and laser ablation methods are also compared. Iron polyatomic interferences were reduced or resolved by using a dynamic reaction cell and high-resolution ICP-MS. MDLs as low as 0.03 µg g-1 and 0.14 µg g-1 were achieved for laser ablation and solution-based analyses, respectively. The use of helium as a carrier gas improved the detection limits for both iron isotopes (56Fe and 57Fe) in medium resolution for the HR-ICP-MS and with a dynamic reaction cell (DRC) coupled to a quadrupole ICP-MS system. The development and application of robust analytical methods for the quantification of trace elements in biological matrices has led to a better understanding of the potential utility of these measurements in forensic chemical analyses. Standard reference materials (SRMs) were used in the development of an analytical method using HR-ICP-MS and LA-HR-ICP-MS that was subsequently applied to the analysis of real samples. Bone, teeth, and ashed marijuana samples were analyzed with the developed method. Elemental analysis of bone samples from 12 different individuals provided discrimination between individuals when femur and humerus bones were considered separately. Discrimination of 14 teeth samples based on elemental composition was achieved, with the exception of one case in which samples from the same individual were not associated with each other. Discrimination of 49 different ashed plant (cannabis) samples was achieved using the developed method.
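Method detection limits of the kind reported above are commonly estimated from replicate measurements of a low-concentration standard; a minimal sketch with hypothetical replicates, using one common convention (not necessarily the exact procedure used in this work):

```python
import numpy as np
from scipy import stats

# Hypothetical replicate measurements of a low-level iron standard (ug/g).
# One common MDL convention: MDL = t(n-1, 99%) * standard deviation of the
# replicate measurements of a spiked low-level sample.
replicates = np.array([0.110, 0.132, 0.125, 0.118, 0.141, 0.122, 0.129])
t_99 = stats.t.ppf(0.99, df=len(replicates) - 1)
mdl = t_99 * np.std(replicates, ddof=1)
print(f"MDL ~ {mdl:.3f} ug/g")
```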

Relevance: 30.00%

Publisher:

Abstract:

The necessity of elemental analysis techniques for solving forensic problems continues to expand as the samples collected from crime scenes grow in complexity. Laser ablation ICP-MS (LA-ICP-MS) has been shown to provide a high degree of discrimination between samples that originate from different sources. In the first part of this research, two laser ablation ICP-MS systems were compared, one using a nanosecond laser and the other a femtosecond laser source, for the forensic analysis of glass. The results showed that femtosecond LA-ICP-MS did not provide significant improvements in terms of accuracy, precision, and discrimination; however, femtosecond LA-ICP-MS did provide lower detection limits. In addition, it was determined that even for femtosecond LA-ICP-MS an internal standard should be utilized to obtain accurate analytical results for glass analyses. In the second part, a method using laser-induced breakdown spectroscopy (LIBS) for the forensic analysis of glass was shown to provide excellent discrimination for a glass set consisting of 41 automotive fragments. The discrimination power was compared to that of two of the leading elemental analysis techniques, µXRF and LA-ICP-MS, and the results were similar; all methods generated >99% discrimination, and the pairs found indistinguishable were similar. An extensive data analysis approach for LIBS glass analyses was developed to minimize Type I and Type II errors, en route to a recommendation of 10 ratios to be used for glass comparisons. Finally, an LA-ICP-MS method for the qualitative analysis and discrimination of gel ink sources was developed and tested on a set of ink samples. In the first discrimination study, qualitative analysis was used to obtain 95.6% discrimination in a blind study consisting of 45 black gel ink samples provided by the United States Secret Service. A 0.4% false exclusion (Type I) error rate and a 3.9% false inclusion (Type II) error rate were obtained for this discrimination study. In the second discrimination study, 99% discrimination power was achieved for a black gel ink pen set consisting of 24 self-collected samples. The two pairs found to be indistinguishable came from the same source of origin (the same manufacturer and type of pen, purchased in different locations). It was also found that gel ink from the same pen, regardless of age, was indistinguishable, as were gel ink pens (four pens) originating from the same pack.
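Discrimination power is typically reported as the fraction of sample pairs that a comparison criterion can tell apart. A toy sketch using simulated element-ratio replicates and a simple +/-3 standard deviation interval-overlap criterion (one criterion used in forensic glass comparisons, not necessarily the exact one adopted here):

```python
import numpy as np
from itertools import combinations

# Hypothetical element-ratio measurements (replicates) for a few glass
# fragments; discrimination power is the fraction of fragment pairs that can
# be distinguished.
rng = np.random.default_rng(2)
fragments = {f"frag_{i}": rng.normal(loc=1.0 + 0.02 * i, scale=0.005, size=5)
             for i in range(6)}

def distinguishable(a, b, k=3.0):
    """True if the +/-k*SD intervals around the two means do not overlap."""
    lo_a, hi_a = a.mean() - k * a.std(ddof=1), a.mean() + k * a.std(ddof=1)
    lo_b, hi_b = b.mean() - k * b.std(ddof=1), b.mean() + k * b.std(ddof=1)
    return hi_a < lo_b or hi_b < lo_a

pairs = list(combinations(fragments, 2))
n_distinguished = sum(distinguishable(fragments[a], fragments[b])
                      for a, b in pairs)
print(f"discrimination power = {n_distinguished / len(pairs):.1%}")
```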

Relevance: 30.00%

Publisher:

Abstract:

Scientists planning to use underwater stereoscopic image technologies are often faced with numerous problems during the methodological implementation: commercial equipment is too expensive; the setup or calibration is too complex; or the image processing (i.e. measuring objects in the stereo-images) is too complicated to be performed without a time-consuming phase of training and evaluation. The present paper addresses some of these problems and describes a workflow for stereoscopic measurements for marine biologists. It also provides instructions on how to assemble an underwater stereo-photographic system with two digital consumer cameras and gives step-by-step guidelines for setting up the hardware. The second part details a software procedure to correct stereo-image pairs for lens distortions, which is especially important when using cameras with non-calibrated optical units. The final part presents a guide to the process of measuring the lengths (or distances) of objects in stereoscopic image pairs. To reveal the applicability and the limitations of the described systems, and to test the effects of different types of camera (a compact camera and a single-lens reflex (SLR) type), experiments were performed to determine the precision and accuracy of two generic stereo-imaging units: a diver-operated system based on two Olympus Mju 1030SW compact cameras and a cable-connected observatory system based on two Canon 1100D SLR cameras. In the simplest setup, without any correction for lens distortion, the low-budget Olympus Mju 1030SW system achieved mean accuracy errors (percentage deviation of a measurement from the object's real size) between 10.2% and -7.6% (overall mean value: -0.6%), depending on the size, orientation, and distance of the measured object from the camera. With the SLR system, very similar values between 10.1% and -3.4% (overall mean value: -1.2%) were observed. Correction of the lens distortion significantly improved the mean accuracy errors of either system. Moreover, system precision (the spread of the accuracy) improved significantly in both systems. Neither the use of a wide-angle converter nor multiple reassembly of the system had a significant negative effect on the results. The study shows that underwater stereo-photography, independent of the system, has a high potential for robust and non-destructive in situ sampling and can be used without prior specialist training.
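The measurement step for a calibrated, rectified stereo pair reduces to triangulating the two endpoints of an object and taking the distance between the resulting 3-D points. A toy sketch with invented calibration values and pixel coordinates (not the Olympus or Canon rigs described above):

```python
import numpy as np

# Toy calibration values for a rectified stereo pair; real values come from
# the calibration procedure described in the workflow.
focal_px = 1400.0      # focal length in pixels
baseline_m = 0.30      # distance between the two cameras in metres
cx, cy = 960.0, 540.0  # principal point in pixels

def triangulate(xl, yl, xr):
    """Rectified-pair triangulation: depth Z = f * B / disparity."""
    disparity = xl - xr
    z = focal_px * baseline_m / disparity
    x = (xl - cx) * z / focal_px
    y = (yl - cy) * z / focal_px
    return np.array([x, y, z])

# Endpoints of an object (e.g. a fish), in left- and right-image pixel coordinates.
head = triangulate(xl=1012.0, yl=560.0, xr=880.0)
tail = triangulate(xl=1190.0, yl=572.0, xr=1061.0)
print(f"estimated length: {np.linalg.norm(head - tail):.3f} m")
```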