832 results for accuracy analysis


Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a modification to the ACI 318-02 equivalent frame method of analysis of reinforced concrete flat plate exterior panels. Two existing code methods were examined: ACI 318 and BS 8110. The derivation of the torsional stiffness of the edge strip as proposed by ACI 318 is examined, and a more accurate estimate of this value is proposed, based on both theoretical analysis and experimental results. A series of 1/3-scale models of flat plate exterior panels was tested. Unique experimental results were obtained by measuring strains in reinforcing bars at approximately 200 selected locations in the plate panel throughout the entire loading history. The measured strains were used to calculate curvatures and, hence, bending moments; these were used along with moments in the columns to assess the accuracy of the equivalent frame methods. The proposed method leads to a more accurate prediction of the moments in the plate at the column front face, at the panel midspan, and in the edge column.
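
The strain-to-moment step above can be sketched as follows, assuming a linear strain distribution and elastic (uncracked) section behaviour; all numeric values are illustrative, not taken from the paper.

```python
# Minimal sketch: converting measured bar strains to curvature and moment.
# Assumes plane sections remain plane and elastic section behaviour;
# the numbers below are illustrative, not the paper's data.

def curvature_from_strains(eps_top: float, eps_bot: float, d: float) -> float:
    """Curvature (1/m) from strains in top and bottom bar layers
    separated by a distance d (m)."""
    return (eps_top - eps_bot) / d

def moment_from_curvature(kappa: float, E: float, I: float) -> float:
    """Bending moment (N*m) for an elastic section: M = E * I * kappa."""
    return E * I * kappa

# Example: strains of +300 and -150 microstrain, bar layers 0.12 m apart,
# E = 30 GPa, I = 2.0e-4 m^4 per metre width of slab.
kappa = curvature_from_strains(300e-6, -150e-6, 0.12)
M = moment_from_curvature(kappa, 30e9, 2.0e-4)
print(f"curvature = {kappa:.3e} 1/m, moment = {M / 1e3:.1f} kN*m")
```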

Relevance:

30.00%

Publisher:

Abstract:

We present a novel approach to goal recognition based on a two-stage paradigm of graph construction and analysis. First, a graph structure called a Goal Graph is constructed to represent the observed actions, the state of the world, and the achieved goals, as well as the various connections between these nodes at consecutive time steps. Then, the Goal Graph is analysed at each time step to recognise those partially or fully achieved goals that are consistent with the actions observed so far. The Goal Graph analysis also reveals valid plans for the recognised goals or parts of these goals. Our approach to goal recognition does not need a plan library. It does not suffer from the problems of acquiring and hand-coding large plan libraries, nor from the problem of searching a plan space of exponential size. We describe two algorithms for Goal Graph construction and analysis in this paradigm. Both algorithms are provably sound, polynomial-time, and polynomial-space. The number of goals recognised by our algorithms is usually very small after a sequence of observed actions has been processed. Thus the sequence of observed actions is well explained by the recognised goals with little ambiguity. We have evaluated these algorithms in the UNIX domain, in which excellent performance has been achieved in terms of accuracy, efficiency, and scalability.
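
As a rough illustration of the two-stage paradigm, the sketch below builds a layered graph from observed actions and then keeps only the goals consistent with the final proposition layer. The data structures and the consistency test are assumptions for illustration, not the paper's algorithms.

```python
# Illustrative two-stage sketch: construct a layered Goal Graph from observed
# actions, then recognise goals consistent with the observations.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Action:
    name: str
    adds: frozenset
    deletes: frozenset

@dataclass
class GoalGraph:
    layers: list = field(default_factory=list)   # proposition layers over time
    edges: dict = field(default_factory=dict)    # step -> (action, added facts)

    def construct(self, initial_state, observed_actions):
        """Stage 1: grow one proposition layer per observed action."""
        state = set(initial_state)
        self.layers = [frozenset(state)]
        for t, act in enumerate(observed_actions):
            state = (state - act.deletes) | act.adds
            self.edges[t] = (act.name, act.adds)
            self.layers.append(frozenset(state))

    def recognised_goals(self, candidate_goals):
        """Stage 2: keep goals fully supported by the final layer,
        i.e. consistent with everything observed so far."""
        final = self.layers[-1]
        return [g for g in candidate_goals if set(g) <= final]

gg = GoalGraph()
gg.construct({"at-home"},
             [Action("cd-tmp", frozenset({"in-tmp"}), frozenset({"at-home"}))])
print(gg.recognised_goals([{"in-tmp"}, {"file-printed"}]))  # -> [{'in-tmp'}]
```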

Relevance:

30.00%

Publisher:

Abstract:

Background and purpose: The optimal use of virtual simulation across treatment sites is not yet entirely clear. This study presents data to identify specific patient groups for whom conventional simulation may be eliminated entirely and replaced by virtual simulation. Sampling and methods: Two hundred and sixty patients were recruited from four treatment sites (head and neck, breast, pelvis, and thorax). Patients were randomly assigned either to the usual treatment process involving conventional simulation, or to a process differing only in the replacement of conventional plan verification with virtual verification. Data were collected on set-up accuracy at verification and on the number of unsatisfactory verifications requiring a return to the conventional simulator. A micro-economic costing analysis was also undertaken, in which data were collected for each treatment process episode: the number and grade of staff present, and the duration of each treatment episode. Results: The study shows no statistically significant difference in the number of returns to the conventional simulator for each site and study arm. Image registration data show similar quality of verification for each study arm. The micro-costing data show no statistical difference between the virtual and conventional simulation processes. Conclusions: At our institution, virtual simulation including virtual verification presents no disadvantage compared to conventional simulation for the sites investigated.
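
As a small illustration of the per-site comparison of returns to the conventional simulator, the sketch below runs a chi-squared test on a 2x2 table; the counts are invented placeholders, not the study's data.

```python
# Testing whether returns to the conventional simulator differ between arms.
# The table entries are placeholders for illustration only.
from scipy.stats import chi2_contingency

# rows: conventional arm, virtual arm; cols: returned, not returned
table = [[4, 28],
         [3, 30]]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.2f}")  # p > 0.05 -> no significant difference
```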

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE. To examine internal consistency, refine the response scale, and obtain a linear scoring system for the visual function instrument, the Daily Living Tasks Dependent on Vision (DLTV). METHODS. Data were available from 186 participants with a clinical diagnosis of age-related macular degeneration (AMD) who completed the 22-item DLTV (DLTV-22) according to a four-point ordinal response scale. An independent group of 386 participants with AMD was administered a reduced version of the DLTV with 11 items (DLTV-11), according to a five-point response scale. Rasch analysis was performed on both datasets and used to generate item statistics for measure order, response odds ratios per item and per person, and infit and outfit mean-square statistics. The Rasch output from the DLTV-22 was examined to identify redundant items and to assess factorial validity and person and item measure separation reliabilities. RESULTS. The average rating for the DLTV-22 changed monotonically with the magnitude of the latent person trait. The expected and observed average measures were extremely close, with step calibrations evenly separated for the four-point ordinal scale. For the DLTV-11, step calibrations were not as evenly separated, suggesting that the five-point scale should be reduced to either a four- or three-point scale. Five items in the DLTV-22 were removed, and all 17 remaining items had good infit and outfit mean squares. Principal component analysis of the residuals from the Rasch analysis identified two domains containing 7 and 10 items each. The domains had high person separation reliabilities (0.86 and 0.77 for domains 1 and 2, respectively) and item measure reliabilities (0.99 and 0.98 for domains 1 and 2, respectively). CONCLUSIONS. With improved internal consistency, an accurate and precise rating scale, and a valid domain structure, the DLTV constitutes a useful instrument for assessing visual function in older adults with AMD.
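
The rating-scale (Andrich) form of the Rasch model underlying such an analysis can be sketched as below: category probabilities follow from a person measure, an item measure, and the step calibrations. All values are illustrative.

```python
# Andrich rating-scale (Rasch) model for ordinal responses:
#   P(x) is proportional to exp( sum_{k<=x} (theta - b - tau_k) ),
# with theta the person measure, b the item measure, tau_k the step
# calibrations. Values below are illustrative only.
import math

def category_probs(theta: float, b: float, taus: list) -> list:
    """P(response = x) for x = 0..len(taus)."""
    nums = [1.0]              # category 0: empty sum -> exp(0)
    cum = 0.0
    for tau in taus:          # categories 1..m
        cum += theta - b - tau
        nums.append(math.exp(cum))
    total = sum(nums)
    return [n / total for n in nums]

# A four-point scale has three step calibrations; evenly separated steps
# (as found for the DLTV-22) give well-ordered category probabilities.
print(category_probs(theta=0.5, b=0.0, taus=[-1.0, 0.0, 1.0]))
```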

Relevance:

30.00%

Publisher:

Abstract:

For protons with energies up to a few MeV, the temporal evolution of etched latent tracks in CR-39 nuclear track detector has been numerically modeled by assuming that the electronic energy loss of the protons governs latent track formation. The technique is applied to obtain, with high accuracy, the energy spectrum of high-intensity laser-driven proton beams. Precise measurement of the track length and areal track density was achieved by scanning short-etched, highly populated CR-39 with an atomic force microscope.
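
A toy version of such temporal track-evolution modelling is sketched below: the track tip advances at a depth-dependent track etch rate while the detector surface recedes at the bulk etch rate. The response function and all rates are assumed illustrative forms, not the paper's calibration.

```python
# Toy numerical model of etched-track growth in CR-39.
import numpy as np

def etched_track_depth(t_etch, v_bulk, v_track, dt=0.01):
    """Track-tip depth below the etched surface after t_etch hours.
    v_track(x): local track etch rate at depth x (um/h), tied to the
    ion's electronic energy loss; v_bulk: constant bulk etch rate (um/h)."""
    x_tip, x_surf = 0.0, 0.0
    for _ in np.arange(0.0, t_etch, dt):
        x_tip += v_track(x_tip) * dt   # tip advances along the ion path
        x_surf += v_bulk * dt          # surface recedes everywhere
        if x_tip <= x_surf:            # track etched away: cone becomes a pit
            x_tip = x_surf
    return x_tip - x_surf

# Illustrative response: track etch rate decaying along a ~20 um proton track.
v_track = lambda x: 2.5 * np.exp(-x / 20.0) + 1.2   # um/h, assumed form
print(f"track depth: {etched_track_depth(4.0, 1.2, v_track):.2f} um")
```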

Relevance:

30.00%

Publisher:

Abstract:

A study was performed to determine whether targeted metabolic profiling of cattle sera could be used to establish a predictive tool for identifying hormone misuse in cattle. Metabolites were assayed in heifers (n = 5) treated with nortestosterone decanoate (0.85 mg/kg body weight), untreated heifers (n = 5), steers (n = 5) treated with oestradiol benzoate (0.15 mg/kg body weight), and untreated steers (n = 5). Treatments were administered on days 0, 14, and 28 of a 42-day study period. Two support vector machines (SVMs) were trained, respectively, on heifer and steer data to identify hormone-treated animals. The performance of both SVM classifiers was evaluated by the sensitivity and specificity of treatment prediction. The SVM trained on steer data achieved 97.33% sensitivity and 93.85% specificity, while the one trained on heifer data achieved 94.67% sensitivity and 87.69% specificity. The solutions of the SVM classifiers were further exploited to determine the days on which classification was most reliable; for both heifers and steers, days 17-35 were the most selective. In summary, bioinformatics applied to targeted metabolic profiles generated from standard clinical chemistry analyses has yielded an accurate, inexpensive, high-throughput test for predicting steroid abuse in cattle.
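
A minimal sketch of the classification step, assuming scikit-learn and synthetic placeholder features in place of the serum metabolite profiles:

```python
# Treated/untreated classification with an SVM, scored by sensitivity and
# specificity. Features and labels below are synthetic placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))                   # 8 metabolite features (toy)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # 1 = treated (synthetic)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print(f"sensitivity = {tp / (tp + fn):.2%}, specificity = {tn / (tn + fp):.2%}")
```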

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces the application of linear multivariate statistical techniques, including partial least squares (PLS), canonical correlation analysis (CCA), and reduced rank regression (RRR), to the area of Systems Biology. This new approach aims to extract the important proteins embedded in complex signal transduction pathway models.

The analysis is performed on a model of intracellular signalling along the Janus-associated kinases/signal transducers and activators of transcription (JAK/STAT) and mitogen-activated protein kinase (MAPK) signal transduction pathways in interleukin-6 (IL6)-stimulated hepatocytes, which produce signal transducer and activator of transcription 3 (STAT3).

A region of redundancy within the MAPK pathway that does not affect STAT3 transcription was identified using CCA. This is the core finding of the analysis and cannot be obtained by inspecting the model by eye. In addition, RRR was found to isolate terms that do not contribute significantly to changes in protein concentrations, while PLS, by virtue of its construction, does not provide such a detailed picture.

This analysis has a similar objective to conventional model reduction techniques, with the advantage of maintaining the meaning of the states before and after the reduction process. A significant model reduction is performed with a marginal loss in accuracy, offering a more concise model while maintaining the main factors influencing STAT3 transcription.

The findings offer a deeper understanding of the reaction terms involved, confirm the relevance of several proteins to the production of acute phase proteins, and complement existing findings regarding cross-talk between the two signalling pathways.
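
As a rough sketch of how CCA can flag such redundancy, the example below fits CCA between a multivariate input X and an output Y that depends on only part of X. The matrices are synthetic stand-ins, not the pathway model.

```python
# Canonical correlation analysis exposing input variables that do not drive
# the output, mirroring how CCA isolated the redundant MAPK region.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))       # e.g. pathway species over time (toy)
Y = X[:, :2] @ rng.normal(size=(2, 3)) + 0.1 * rng.normal(size=(200, 3))
# Y depends only on the first two columns of X; the rest is "redundant".

cca = CCA(n_components=2).fit(X, Y)
# Near-zero weights on X columns 2..9 flag variables with no effect on Y.
print(np.round(cca.x_weights_, 2))
```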

Relevance:

30.00%

Publisher:

Abstract:

Query processing over the Internet involving autonomous data sources is a major task in data integration. It requires estimating the costs of possible queries in order to select the one with the minimum cost. In this context, the cost of a query is affected by three factors: network congestion, server contention state, and the complexity of the query. In this paper, we study the effects of network congestion and server contention state on the cost of a query; we refer to these two factors together as system contention states. We present a new approach to determining the system contention states by clustering the costs of a sample query. For each system contention state, we construct two cost formulas, for unary and join queries respectively, using multiple regression. When a new query is submitted, its system contention state is estimated first, using either the time-slides method or the statistical method. The cost of the query is then calculated using the corresponding cost formulas, and the estimate is further adjusted to improve its accuracy. Our experiments show that our methods produce quite accurate cost estimates for queries submitted to remote data sources over the Internet.
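
A minimal sketch of the two steps, assuming scikit-learn and invented features and costs: cluster a sample query's observed costs into contention states, then fit one regression cost formula per state.

```python
# Step 1: contention states = clusters of a sample query's observed costs.
# Step 2: one regression cost model per state. Features are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
sample_costs = np.concatenate([rng.normal(0.2, 0.02, 50),    # light load
                               rng.normal(1.5, 0.10, 50)])   # heavy load

states = KMeans(n_clusters=2, n_init=10).fit(sample_costs.reshape(-1, 1))

models = {}
for s in range(2):
    mask = states.labels_ == s
    X = rng.uniform(1, 100, size=(mask.sum(), 2))   # e.g. cardinality, #preds
    y = X @ np.array([0.01, 0.05]) + sample_costs[mask]   # synthetic costs
    models[s] = LinearRegression().fit(X, y)

# At run time: estimate the current state, then apply that state's formula.
state_now = states.predict([[0.25]])[0]
print(models[state_now].predict([[40, 3]]))
```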

Relevance:

30.00%

Publisher:

Abstract:

A method is proposed to accelerate the evaluation of the Green's function of an infinite doubly periodic array of thin wire antennas. The method is based on the expansion of the Green's function into series corresponding to the propagating and evanescent waves, and on the use of Poisson and Kummer transformations enhanced with analytic summation of the slowly convergent asymptotic terms. Unlike existing techniques, the procedure reported here provides uniform convergence regardless of the geometrical parameters of the problem or the plane wave excitation wavelength. In addition, it is numerically stable and requires no numerical integration or internal tuning parameters, since all necessary series are calculated directly in terms of analytical functions. This means that, in nonlinear problem scenarios, the algorithm can be deployed within a harmonic balance engine without run-time intervention or recursive adjustment. Numerical examples illustrate the efficiency and accuracy of the developed approach compared with the Ewald method, which for this class of problems requires run-time adaptation of the splitting parameter.
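
Kummer's transformation, the acceleration idea at the heart of the method, can be demonstrated on a toy series: subtract a comparison series with a known analytic sum so that the remainder converges much faster.

```python
# Toy demonstration of Kummer's transformation on sum 1/n^2 = pi^2/6:
#
#   sum 1/n^2 = sum [1/n^2 - 1/(n(n+1))] + sum 1/(n(n+1))
#             = sum 1/(n^2 (n+1))        + 1            (analytic sum)
#
# The remainder decays like n^-3, so the truncation error drops from
# O(1/N) to O(1/N^2) for the same number of terms.
import math

N = 100
exact = math.pi**2 / 6
direct = sum(1.0 / n**2 for n in range(1, N + 1))
kummer = 1.0 + sum(1.0 / (n**2 * (n + 1)) for n in range(1, N + 1))

print(f"direct error: {abs(direct - exact):.1e}")   # ~1e-2
print(f"kummer error: {abs(kummer - exact):.1e}")   # ~5e-5
```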

Relevance:

30.00%

Publisher:

Abstract:

The interactions of ions in the solid state for a series of representative 1,3-dialkylimidazolium hexafluorophosphate salts (either ionic liquids or closely related) have been examined by crystallographic analysis, combined with theoretical estimation of crystal-packing densities and lattice-interaction energies. Efficient close-packing of the ions in the crystalline state is observed, but there was no compelling evidence for specific directional hydrogen bonding to the hexafluorophosphate anions or for the formation of interstitial voids. The close-packing efficiency is supported by theoretical calculation of ion volumes, crystal lattice energies, and packing densities, which correlate well with experimental data. The crystal density of the salts can be predicted accurately from the summation of calculated free ion volumes and lattice energies. Of even more importance for future work on these and related salts, the solid-state density of 1,3-dialkylimidazolium hexafluorophosphate salts can be predicted with reasonable accuracy purely on the basis of ab initio free ion volumes, and this allows prediction of lattice energies without necessarily requiring the crystal structures.
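
The volume-based prediction reduces to rho = M / (N_A * V_m), with V_m the sum of the free ion volumes. The sketch below uses illustrative literature-style values for [C4mim][PF6]; the specific numbers are assumptions, not the paper's.

```python
# Density prediction from summed free ion volumes: rho = M / (N_A * V_m).
# Ion volumes below are illustrative approximations for [C4mim][PF6].
N_A = 6.02214e23          # Avogadro's number, 1/mol

M = 284.18                # molar mass of [C4mim][PF6], g/mol
V_cation = 0.229e-21      # approx. free ion volume of [C4mim]+, cm^3
V_anion = 0.109e-21       # approx. free ion volume of [PF6]-, cm^3

rho = M / (N_A * (V_cation + V_anion))
print(f"predicted density: {rho:.2f} g/cm^3")   # ~1.4, close to experiment
```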

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces a new technique for palmprint recognition based on Fisher linear discriminant analysis (FLDA) and a Gabor filter bank. The method involves convolving a palmprint image with a bank of Gabor filters at different scales and rotations for robust extraction of palmprint features. Once these features are extracted, FLDA is applied for dimensionality reduction and class separability. Since the palmprint features are derived from the principal lines, wrinkles, and texture of the palm area, the palm region used for feature extraction must be selected carefully in order to enhance recognition accuracy. To address this problem, an improved region of interest (ROI) extraction algorithm is introduced, which allows efficient extraction of the whole palm area while ignoring all undesirable parts, such as the fingers and background. Experiments show that the proposed method yields attractive performance, as evidenced by an Equal Error Rate (EER) of 0.03%.
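
A sketch of the feature pipeline, assuming OpenCV and scikit-learn; the kernel size, scales, and orientations are illustrative choices rather than the paper's parameters.

```python
# Gabor filter bank features followed by Fisher LDA for class separability.
import cv2
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def gabor_features(roi: np.ndarray) -> np.ndarray:
    """Concatenate downsampled responses of a small Gabor filter bank."""
    feats = []
    for sigma in (2.0, 4.0):                          # two scales
        for theta in np.arange(0, np.pi, np.pi / 4):  # four orientations
            kern = cv2.getGaborKernel((21, 21), sigma, theta, 10.0, 0.5)
            resp = cv2.filter2D(roi, cv2.CV_32F, kern)
            feats.append(cv2.resize(resp, (8, 8)).ravel())  # downsample
    return np.concatenate(feats)

# Toy demo: random stand-in ROIs and subject labels; FLDA projects onto
# directions maximising between-class over within-class scatter.
X = np.stack([gabor_features(np.random.rand(128, 128).astype(np.float32))
              for _ in range(6)])
y = np.array([0, 0, 0, 1, 1, 1])
Z = LinearDiscriminantAnalysis(n_components=1).fit_transform(X, y)
print(Z.shape)   # (6, 1): reduced, class-separating features
```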

Relevance:

30.00%

Publisher:

Abstract:

This study concerns the spatial allocation of material flows, with emphasis on construction material in the Irish housing sector. It addresses some of the key issues concerning anthropogenic impact on the environment through spatio-temporal visualisation of the flow of materials, wastes, and emissions at different spatial levels. This is presented in the form of a spatial model, Spatial Allocation of Material Flow Analysis (SAMFA), which enables the simulation of construction material flows and associated energy use. SAMFA parallels the Island Limits project (EPA funded under 2004-SD-MS-22-M2), which aimed to create a material flow analysis of the Irish economy classified by industrial sector. SAMFA develops this further by attempting to establish material flows at the subnational geographical scale, which could be used in the development of local authority (LA) sustainability strategies and spatial planning frameworks by highlighting the cumulative environmental impacts of the development of the built environment. By drawing on the idea of planning support systems, SAMFA also aims to provide a cross-disciplinary, integrative medium for involving stakeholders in strategies for a sustainable built environment and, as such, would help illustrate the sustainability consequences of alternative development scenarios.

The pilot run of the model in Kildare has shown that the model can be successfully calibrated and applied to develop alternative material flow and energy-use scenarios at the ED level. This has been demonstrated through the development of an integrated scenario and a business-as-usual scenario, with the former integrating a range of potential material-efficiency and energy-saving policy options and the latter replicating conditions that best describe the current trend. Their comparison shows that the integrated scenario performs better in terms of both material and energy use.

This report also identifies a number of potential areas of future research and broader application. These include improving the accuracy of the SAMFA model (e.g. by establishing the actual life expectancy of buildings in the Irish context through field surveys) and extending the model to other Irish counties. This would establish SAMFA as a valuable predictive and monitoring tool capable of integrating national and local spatial planning objectives with actual environmental impacts. Furthermore, should the model prove successful at this level, the modelling approach could be transferred to other areas of the built environment, such as commercial development and other key contributors to greenhouse gas emissions. The ultimate aim is to develop a meta-model for predicting the consequences of consumption patterns at the local scale, offering the possibility of creating critical links between socio-technical systems and the most important challenge of all: the limits of the biophysical environment.

Relevance:

30.00%

Publisher:

Abstract:

Voice over IP (VoIP) has experienced tremendous growth over the last few years and is now widely used both by the general population and for business purposes. The security of such VoIP systems is often assumed, creating a false sense of privacy. This paper investigates in detail the leakage of information from Skype, a widely used and protected VoIP application. Experiments have shown that isolated phonemes can be classified and given sentences identified. Using the dynamic time warping (DTW) algorithm, frequently used in speech processing, an accuracy of 60% can be reached. The results can be improved further, reaching an accuracy of 83% under specific conditions, by choosing specific training data. Because the initial results are speaker-dependent, an approach based on the Kalman filter is proposed to extract the kernel of all training signals.
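
A minimal pure-Python/NumPy sketch of the DTW distance used for the matching; the inputs here are toy 1-D sequences, whereas the actual experiments operated on features of encrypted Skype traffic.

```python
# Classic dynamic time warping distance with absolute-difference local cost.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """O(len(a)*len(b)) DTW distance between two sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return float(D[n, m])

# Two sequences tracing the same shape at different speeds align cheaply.
print(dtw_distance(np.array([1, 2, 3, 4]), np.array([1, 1, 2, 3, 3, 4])))
```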

Relevance:

30.00%

Publisher:

Abstract:

Ambisonics and higher order ambisonics (HOA) technologies aim at reproducing a sound field that has been either synthesised or previously recorded with dedicated microphones. Based on a spherical harmonic decomposition, the sound field is described more precisely as higher-order components are used. The present study evaluated the perceptual and objective localisation accuracy of a sound field encoded with four microphones of orders one to four and decoded over a ring of loudspeakers. A perceptual test showed an improvement in localisation with higher-order ambisonic microphones. Reproduced localisation indices were estimated for the four microphones and for the respective synthetic systems of orders one to four; the perceptual and objective analyses led to the same conclusions. The localisation accuracy depends on the ambisonic order as well as on the source incidence. Furthermore, impairments linked to the microphones were highlighted.
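
For a horizontal-only setup like the loudspeaker ring described, plane-wave encoding up to order N can be sketched as follows; the (2N+1)-channel layout and plain (non-weighted) normalisation are illustrative assumptions.

```python
# 2D (horizontal-only) ambisonic encoding of a plane wave from azimuth phi:
# channel gains 1, cos(m*phi), sin(m*phi) for m = 1..order.
import numpy as np

def encode_2d(signal: np.ndarray, phi: float, order: int) -> np.ndarray:
    """Return the (2*order+1, len(signal)) ambisonic channel signals."""
    chans = [signal.copy()]                  # W channel (order 0)
    for m in range(1, order + 1):
        chans.append(signal * np.cos(m * phi))
        chans.append(signal * np.sin(m * phi))
    return np.stack(chans)

# A 1 kHz tone encoded at 45 degrees into a fourth-order (9-channel) stream,
# matching the highest microphone order compared in the study.
t = np.linspace(0, 0.01, 480, endpoint=False)
B = encode_2d(np.sin(2 * np.pi * 1000 * t), np.deg2rad(45), order=4)
print(B.shape)   # (9, 480)
```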