973 results for "Error in substance"


Relevance:

90.00%

Publisher:

Abstract:

A free-space optical (FSO) laser communication system with perfect fast tracking experiences random power fading due to atmospheric turbulence. For an FSO communication system without fast tracking or with imperfect fast tracking, the fading probability density function (pdf) is also affected by the pointing error. In this thesis, the overall fading pdfs of FSO communication systems with pointing errors are calculated using an analytical method based on the fast-tracked on-axis and off-axis fading pdfs and the fast-tracked beam profile of a turbulence channel. The overall fading pdf is first studied for an FSO communication system with a collimated laser beam. Large-scale numerical wave-optics simulations are performed to verify the analytically calculated fading pdf with a collimated beam under various turbulence channels and pointing errors. The calculated overall fading pdfs are almost identical to the directly simulated fading pdfs. The calculated overall fading pdfs are also compared with the gamma-gamma (GG) and log-normal (LN) fading pdf models; they fit the simulated pdfs better than both the GG and LN models under different receiver aperture sizes in all the studied cases. Further, the analytical method is extended to FSO communication systems with a beam diverging angle. It is shown that the gamma pdf model remains valid for the fast-tracked on-axis and off-axis fading pdfs with a point-like receiver aperture when the laser beam propagates with a diverging angle. Large-scale numerical wave-optics simulations prove that the analytically calculated fading pdfs perfectly fit the overall fading pdfs for both focused and diverged beam cases. The influence of the fast-tracked on-axis and off-axis fading pdfs, the fast-tracked beam profile, and the pointing error on the overall fading pdf is also discussed. Finally, the analytical method is compared with the heuristic fading pdf models proposed since the 1970s. Although some previously proposed fading pdf models provide a close fit to experimental and simulation data, these close fits exist only under particular conditions. Only the analytical method fits the directly simulated fading pdfs accurately under different turbulence strengths, propagation distances, receiver aperture sizes and pointing errors.
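For reference, the gamma-gamma pdf that the abstract compares against can be evaluated directly from its closed form; the parameter values below are illustrative, not fitted to any channel studied in the thesis.

```python
import numpy as np
from scipy.special import gamma as Gamma, kv

# Gamma-gamma heuristic fading pdf for normalized irradiance I, with
# effective large-/small-scale scintillation parameters alpha, beta.
def gamma_gamma_pdf(I, alpha, beta):
    c = 2.0 * (alpha * beta) ** ((alpha + beta) / 2) / (Gamma(alpha) * Gamma(beta))
    return c * I ** ((alpha + beta) / 2 - 1) * kv(alpha - beta, 2.0 * np.sqrt(alpha * beta * I))

I = np.linspace(1e-6, 30.0, 300_001)
pdf = gamma_gamma_pdf(I, alpha=4.0, beta=2.0)          # illustrative parameters
mass = np.sum(0.5 * (pdf[1:] + pdf[:-1]) * np.diff(I))  # trapezoid rule
print(mass)   # should be close to 1, since it is a probability density
```

Comparing such a model pdf against a histogram of simulated irradiance samples is the kind of goodness-of-fit check the abstract describes.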

Relevance:

90.00%

Publisher:

Abstract:

Target localization has a wide range of military and civilian applications in wireless mobile networks. Examples include battlefield surveillance, emergency 911 (E911), traffic alert, habitat monitoring, resource allocation, routing, and disaster mitigation. Basic localization techniques include time-of-arrival (TOA), direction-of-arrival (DOA) and received-signal-strength (RSS) estimation. Techniques based on TOA and DOA are very sensitive to the availability of line-of-sight (LOS), the direct path between the transmitter and the receiver. If LOS is not available, TOA and DOA estimation errors create a large localization error. In order to reduce non-line-of-sight (NLOS) localization error, NLOS identification, mitigation, and localization techniques have been proposed. This research investigates NLOS identification for multiple-antenna radio systems. The techniques proposed in the literature mainly use one antenna element to enable NLOS identification. When a single antenna is utilized, only limited features of the wireless channel can be exploited to identify NLOS situations. In DOA-based wireless localization systems, however, multiple antenna elements are available. In addition, multiple-antenna technology has been adopted in many widely used wireless systems, such as wireless LAN 802.11n and WiMAX 802.16e, which are good candidates for localization-based services. In this work, the potential of spatial channel information for high-performance NLOS identification is investigated. Considering narrowband multiple-antenna wireless systems, two NLOS identification techniques are proposed. First, the spatial correlation of channel coefficients across antenna elements is proposed as a metric for NLOS identification. In order to obtain the spatial correlation, a new multiple-input multiple-output (MIMO) channel model based on rough surface theory is proposed. This model can be used to compute the spatial correlation between any antenna pair separated by an arbitrary distance. Second, a new NLOS identification technique that exploits the statistics of the phase difference across two antenna elements is proposed. This technique assumes that the phases received across two antenna elements are uncorrelated; this assumption is validated based on the well-known circular and elliptic scattering models. Next, it is proved that the channel Rician K-factor is a function of the phase-difference variance. Exploiting the Rician K-factor, techniques to identify NLOS scenarios are proposed. Considering wideband multiple-antenna wireless systems that use MIMO orthogonal frequency-division multiplexing (OFDM) signaling, space-time-frequency channel correlation is exploited to attain NLOS identification in time-varying, frequency-selective and space-selective radio channels. Novel NLOS identification measures based on space, time and frequency channel correlation are proposed and their performances are evaluated. These measures achieve better NLOS identification performance than those using only space, time or frequency.
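The spatial-correlation idea can be illustrated with a toy simulation. The channel model, correlation estimator, and decision threshold below are simplifications chosen for the sketch, not the rough-surface model or the decision rule proposed in the thesis: a dominant shared component stands in for LOS, independent Rayleigh fading for NLOS.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000   # channel realizations per antenna

def corr(h1, h2):
    """Magnitude of the empirical complex correlation of two antennas' coefficients."""
    h1 = h1 - h1.mean()
    h2 = h2 - h2.mean()
    return abs(np.vdot(h1, h2)) / (np.linalg.norm(h1) * np.linalg.norm(h2))

# LOS-like: a strong component common to both antennas plus weak local scattering
common = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
h1_los = common + 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
h2_los = common + 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# NLOS-like: independent Rayleigh fading at each antenna
h1_nlos = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
h2_nlos = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

threshold = 0.5   # illustrative decision threshold
print("LOS flagged:", corr(h1_los, h2_los) > threshold)
print("NLOS flagged:", corr(h1_nlos, h2_nlos) > threshold)
```

High correlation then indicates a likely LOS channel, low correlation a likely NLOS one, which is the screening logic the metric supports.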

Relevance:

90.00%

Publisher:

Abstract:

Single-screw extrusion is one of the most widely used processing methods in the plastics industry, which was the third largest manufacturing industry in the United States in 2007 [5]. In order to optimize the single-screw extrusion process, tremendous effort has been devoted to the development of accurate models over the last fifty years, especially for polymer melting in screw extruders. This has led to a good qualitative understanding of the melting process; however, quantitative predictions of melting from various models often have a large error in comparison to experimental data. Thus, even today, process parameters and the geometry of the extruder channel for single-screw extrusion are determined by trial and error. Since new polymers are developed frequently, finding the optimum parameters to extrude these polymers by trial and error is costly and time-consuming. In order to reduce the time and experimental work required to optimize the process parameters and the geometry of the extruder channel for a given polymer, the main goal of this research was to perform a coordinated experimental and numerical investigation of melting in screw extrusion. In this work, a full three-dimensional finite element simulation of the two-phase flow in the melting and metering zones of a single-screw extruder was performed by solving the conservation equations for mass, momentum, and energy. The only previous attempt at such a three-dimensional simulation of melting in a screw extruder was made more than twenty years ago, and that work had only limited success because of the computational power and mathematical algorithms available at the time. The dramatic improvement in computational power and mathematical knowledge now makes it possible to run full 3-D simulations of two-phase flow in single-screw extruders on a desktop PC. In order to verify the numerical predictions from the full 3-D simulations, a detailed experimental study was performed, comprising Maddock screw-freezing experiments, Screw Simulator experiments and material characterization experiments. Maddock screw-freezing experiments were performed to visualize the melting profile along the single-screw extruder channel for different screw geometry configurations; these melting profiles were compared with the simulation results. Screw Simulator experiments were performed to collect shear stress and melting flux data for various polymers. Cone-and-plate viscometer experiments were performed to obtain the shear viscosity data needed in the simulations. An optimization code was developed to optimize two screw geometry parameters, namely the screw lead (pitch) and the channel depth in the metering section of a single-screw extruder, such that the output rate of the extruder was maximized without exceeding the maximum temperature specified at the exit of the extruder. This optimization code used a mesh partitioning technique to obtain the flow domain, and the simulations in this flow domain were performed using the code developed to simulate the two-phase flow in single-screw extruders.
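Cone-and-plate viscometer data are commonly reduced to a power-law viscosity model before being fed to a flow simulation. The sketch below fits that form to synthetic data; the power-law representation and all numbers are illustrative, not this work's measured polymers or its actual constitutive model.

```python
import numpy as np

# Fit eta = m * gamma_dot**(n - 1) by linear regression in log-log space.
gamma_dot = np.array([1.0, 10.0, 100.0, 1000.0])   # shear rates, 1/s
eta = 8000.0 * gamma_dot ** (0.4 - 1.0)            # synthetic viscosities, Pa*s

slope, intercept = np.polyfit(np.log(gamma_dot), np.log(eta), 1)
m = np.exp(intercept)       # consistency index, Pa*s^n
n_index = slope + 1.0       # power-law (shear-thinning) index

print(m, n_index)   # recovers the synthetic m = 8000, n = 0.4
```

The fitted (m, n) pair is then all a solver needs to evaluate the shear-dependent viscosity at each quadrature point.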

Relevance:

90.00%

Publisher:

Abstract:

To estimate a parameter in an elliptic boundary value problem, the method of equation error chooses the value that minimizes the error in the PDE and boundary condition (the solution of the BVP having been replaced by a measurement). The estimated parameter converges to the exact value as the measured data converge to the exact value, provided Tikhonov regularization is used to control the instability inherent in the problem. The error in the estimated solution can be bounded in an appropriate quotient norm; estimates can be derived for both the underlying (infinite-dimensional) problem and a finite-element discretization that can be implemented in a practical algorithm. Numerical experiments demonstrate the efficacy and limitations of the method.
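A minimal one-dimensional sketch of the equation-error idea: for -(a(x)u'(x))' = f(x) with a measured u, the PDE residual is linear in a, so Tikhonov-regularized least squares applies directly. The test problem is a made-up smooth example, and the coefficient is additionally assumed known at the left endpoint to pin down the one-dimensional ambiguity of the interior equations; neither simplification comes from the paper.

```python
import numpy as np

n = 200                                   # number of cells on (0, 1)
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)          # nodes
xm = 0.5 * (x[:-1] + x[1:])               # midpoints, where a lives

u = x + x**2                              # "measured" solution (exact here)
a_true = 1.0 + xm                         # coefficient to recover
f = -(3.0 + 4.0 * x[1:-1])                # f = -((1+x)(1+2x))' at interior nodes

# Interior residual: -(a_{i+1/2} du_{i+1/2} - a_{i-1/2} du_{i-1/2})/h = f_i,
# assembled as a linear map A acting on the vector of midpoint values of a.
du = np.diff(u) / h                       # u' at midpoints
A = np.zeros((n - 1, n))
for i in range(1, n):
    A[i - 1, i] = -du[i] / h
    A[i - 1, i - 1] = du[i - 1] / h

# Tikhonov term plus one heavily weighted anchor row pinning a(0).
alpha = 1e-8
w = 1.0 / h**2
A_aug = np.vstack([A, w * np.eye(n)[0]])
b_aug = np.concatenate([f, [w * a_true[0]]])
a_est = np.linalg.solve(A_aug.T @ A_aug + alpha * np.eye(n), A_aug.T @ b_aug)

print(float(np.max(np.abs(a_est - a_true))))   # should be very small
```

With exact data the recovery is essentially exact; with noisy u, the regularization parameter alpha trades data fit against stability, which is the instability-control role the abstract attributes to Tikhonov regularization.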

Relevance:

90.00%

Publisher:

Abstract:

Understanding the canopy cover of an urban environment leads to better estimates of carbon storage and more informed management decisions by urban foresters. The most commonly used method for assessing urban forest cover type extent is ground surveys, which can be both time-consuming and expensive. The analysis of aerial photos is an alternative method that is faster, cheaper, and can cover a larger number of sites, but may be less accurate. The objectives of this paper were (1) to compare three methods of cover type assessment for Los Angeles, CA: hand-delineation of aerial photos in ArcMap, supervised classification of aerial photos in ERDAS Imagine, and ground-collected data using the Urban Forest Effects (UFORE) model protocol; (2) to determine how well remote sensing methods estimate carbon storage as predicted by the UFORE model; and (3) to explore the influence of tree diameter and tree density on carbon storage estimates. Four major cover types (bare ground, fine vegetation, coarse vegetation, and impervious surfaces) were determined from 348 plots (0.039 ha each) randomly stratified according to land use. Hand-delineation was better than supervised classification at predicting ground-based measurements of cover type and UFORE model-predicted carbon storage. Most error in supervised classification resulted from shadow, which was interpreted as unknown cover type. Neither tree diameter nor tree density per plot significantly affected the relationship between carbon storage and canopy cover. The efficiency of remote sensing rather than in situ data collection gives urban forest managers the ability to quickly assess a city and plan accordingly while also preserving their often-limited budget.

Relevance:

90.00%

Publisher:

Abstract:

In-cylinder pressure transducers have been used for decades to record combustion pressure inside a running engine. However, due to the extreme operating environment, transducer design and installation must be considered carefully in order to minimize measurement error. One such error is caused by thermal shock, in which the pressure transducer experiences a high heat flux that can distort the transducer diaphragm and also change the crystal sensitivity. This research investigated the effects of thermal shock on in-cylinder pressure transducer data quality using a 2.0 L, four-cylinder, spark-ignited, direct-injected, turbocharged GM engine. Cylinder four was modified with five ports to accommodate pressure transducers from different manufacturers: an AVL GH14D, an AVL GH15D, a Kistler 6125C, and a Kistler 6054AR. The GH14D, GH15D, and 6054AR were M5-size transducers; the 6125C was a larger, 6.2 mm transducer. Both AVL pressure transducers utilized a PH03 flame arrestor. Sweeps of ignition timing (spark sweep), engine speed, and engine load were performed to study the effects of thermal shock on each pressure transducer. The project consisted of two distinct phases: experimental engine testing, and simulation using a commercially available software package. The measured cylinder pressures were compared with the simulated results to characterize data quality; this comparison was valuable because the simulation results did not include thermal shock effects. All three sets of tests showed that the peak cylinder pressure was essentially unaffected by thermal shock, and comparison of the experimental data with the simulated results showed very good correlation.
The spark sweep, performed at 1300 RPM and 3.3 bar NMEP, showed that the differences between the simulated results (no thermal shock) and the experimental data for the indicated mean effective pressure (IMEP) and the pumping mean effective pressure (PMEP) were significantly less than the published accuracies: all transducers had an IMEP percent difference of less than 0.038% and a PMEP difference of less than 0.32%, whereas Kistler and AVL state that their pressure transducers are accurate to within plus or minus 1% for IMEP (AVL 2011; Kistler 2011). In addition, the difference in average absolute exhaust pressure between the simulated results and the experimental data was greatest for the two Kistler pressure transducers; the mounting location and the lack of a flame arrestor are believed to be the cause of the increased error. For the engine speed sweep, the torque output was held constant at 203 Nm (150 ft-lbf) from 1500 to 4000 RPM. The difference in IMEP was less than 0.01% and in PMEP less than 1%, except for the AVL GH14D (5%) and the AVL GH15D (2.25%). A noticeable error in PMEP appeared as the load increased during the engine speed sweeps, as expected. The load sweep was conducted at 2000 RPM over a range of NMEP from 1.1 to 14 bar. The differences in IMEP were less than 0.08%, while the PMEP differences were below 1%, except for the AVL GH14D (1.8%) and the AVL GH15D (1.25%). In-cylinder pressure transducer data quality was effectively analyzed using a combination of experimental data and simulation results. Several criteria can be used to investigate the impact of thermal shock on data quality and to determine the best location and thermal protection for various transducers.
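The IMEP figures compared above come from integrating cylinder pressure over cylinder volume, IMEP = (∮ p dV) / Vd. A sketch on a synthetic Otto-like cycle (polytropic compression and expansion with a constant-volume pressure rise at TDC); the geometry and pressures are illustrative, not the test engine's.

```python
import numpy as np

n_poly = 1.3                              # polytropic exponent (assumed)
V_bdc, V_tdc = 5.5e-4, 5.5e-5             # cylinder volumes, m^3 (CR = 10)
Vd = V_bdc - V_tdc                        # displaced volume
p1 = 1.0e5                                # Pa at BDC
p2 = p1 * (V_bdc / V_tdc) ** n_poly       # end of compression
p3 = 2.0 * p2                             # constant-volume heat addition at TDC

V = np.linspace(V_bdc, V_tdc, 5001)       # decreasing: compression stroke
p_comp = p1 * (V_bdc / V) ** n_poly
p_exp = p3 * (V_tdc / V[::-1]) ** n_poly  # expansion back to BDC

def trapz(y, x):
    return 0.5 * np.sum((y[1:] + y[:-1]) * np.diff(x))

# Closed-loop integral: compression work is negative, expansion positive.
work = trapz(p_comp, V) + trapz(p_exp, V[::-1])
imep = work / Vd
print(imep / 1e5)   # IMEP in bar
```

On measured data the same integral is evaluated over the logged pressure trace and the crank-angle-derived volume, which is why diaphragm distortion from thermal shock feeds directly into the IMEP and PMEP errors discussed above.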

Relevance:

90.00%

Publisher:

Abstract:

The performance of memory-guided saccades with two different delays (3 s and 30 s of memorisation) was studied in eight subjects. Single-pulse transcranial magnetic stimulation (TMS) was applied simultaneously over the left and right dorsolateral prefrontal cortex (DLPFC) 1 s after target presentation. For both delays, stimulation significantly increased the percentage error in amplitude of memory-guided saccades. Furthermore, the interfering effect of TMS was significantly greater in the short-delay than in the long-delay paradigm. The results are discussed in the context of a mixed model of spatial working memory control comprising two components: first, serial information processing with a predominant role of the DLPFC during the early period of memorisation, and second, parallel information processing, independent of the DLPFC, operating during longer delays.

Relevance:

90.00%

Publisher:

Abstract:

In order to analyse the possible basis of subjective complaints following whiplash injury, horizontal eye movements were examined in subjects with persistent complaints ('symptomatic group') and subjects who had completely recovered ('recovered group'). The results for the symptomatic and recovered groups were compared with those for age-matched, healthy volunteers (control group). A battery of different saccade paradigms was employed: two were reflexive saccade tasks, comprising a gap and an overlap task, and two were intentional saccade tasks, consisting of an antisaccade and a memory-guided saccade task. In addition, the symptomatic and recovered groups underwent psychiatric evaluation in a structured clinical interview, and all groups were assessed for emotional functioning using the Beck Depression Inventory (BDI). The recovered group did not differ significantly from the control group in saccade performance or emotional functioning. The symptomatic group showed a dissociation between performance on reflexive and intentional saccade tasks: performance in reflexive saccade tasks was normal, but in intentional saccade tasks the symptomatic group showed significantly impaired inhibition of unwanted reflexive saccades, impaired saccade triggering (i.e. increased latency) and a higher percentage error in amplitude in memory-guided saccades. Based on the clinical interviews, no signs of major depression or dysthymia were found in any of the groups. Compared with the other two groups, the symptomatic group had significantly higher overall BDI scores, but these resulted from BDI dimensions that are non-specific to depression, viz. 'physiological manifestations' (e.g. fatigue, sleep disturbance) or 'performance difficulty' (e.g. work inhibition).
In summary, the pattern of eye movement disturbances in the symptomatic group, with normal performance in reflexive saccade tasks but impaired performance in intentional saccade tasks, especially impaired inhibitory function, suggests dysfunction of prefrontal and frontal cortical structures.

Relevance:

90.00%

Publisher:

Abstract:

The chronological inconsistency in the first book of Matthew of Edessa's Chronicle is well known. Some of his dates are accurate to the day, while others err by up to 50 years, with no immediately apparent pattern of error. In this talk I will examine some of these chronological puzzles by untangling the five main themes that run through the book. By demonstrating how these chronological errors may have arisen, and why certain events in the chronicle are dated much more accurately than others, light may be shed on the sources and methodologies that Matthew used to compose his chronicle.

Relevance:

90.00%

Publisher:

Abstract:

For more than 15 years, patient safety has been an issue in different domains of medicine. There is evidence on this subject and also a great need for information. First, we should be familiar with the basic terminology, such as the relationship between adverse events and errors, and understand the variations of error. In patient management, besides skills and knowledge (evidence-based medicine), the ability (competence) of healthcare professionals to act and react in unexpected situations is key to preventing and treating adverse events. Not only healthcare professionals should be involved in the process, but also healthy people, in a way they can understand, and patients, in a way that actively involves them. This paper shows how a more general view of patient safety can and should be implemented in the daily work of caregivers dealing with dialysis access in its different aspects. A key factor for advancing this subject is to be open-minded and sensitized to the topic. The reader should get an idea of how an institution can create a culture of safety.

Relevance:

90.00%

Publisher:

Abstract:

Studies have shown that the discriminability of successive time intervals depends on the presentation order of the standard (St) and the comparison (Co) stimuli. Also, this order affects the point of subjective equality. The first effect is here called the standard-position effect (SPE); the latter is known as the time-order error. In the present study, we investigated how these two effects vary across interval types and standard durations, using Hellström’s sensation-weighting model to describe the results and relate them to stimulus comparison mechanisms. In Experiment 1, four modes of interval presentation were used, factorially combining interval type (filled, empty) and sensory modality (auditory, visual). For each mode, two presentation orders (St–Co, Co–St) and two standard durations (100 ms, 1,000 ms) were used; half of the participants received correctness feedback, and half of them did not. The interstimulus interval was 900 ms. The SPEs were negative (i.e., a smaller difference limen for St–Co than for Co–St), except for the filled-auditory and empty-visual 100-ms standards, for which a positive effect was obtained. In Experiment 2, duration discrimination was investigated for filled auditory intervals with four standards between 100 and 1,000 ms, an interstimulus interval of 900 ms, and no feedback. Standard duration interacted with presentation order, here yielding SPEs that were negative for standards of 100 and 1,000 ms, but positive for 215 and 464 ms. Our findings indicate that the SPE can be positive as well as negative, depending on the interval type and standard duration, reflecting the relative weighting of the stimulus information, as is described by the sensation-weighting model.
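One minimal reading of the sensation-weighting idea is that the judged difference between two successive durations weights the first and second stimulus unequally, d = w1·ψ(t1) − w2·ψ(t2); unequal weights then shift the point of subjective equality (PSE) in opposite directions for the two presentation orders. The weights below are invented for illustration, not fitted values from the study, and ψ is taken as the identity.

```python
w1, w2 = 1.1, 1.0    # illustrative weights: first interval weighted more heavily
St = 1000.0          # standard duration, ms

# St-Co order: d = w1*St - w2*Co = 0  =>  PSE = (w1/w2) * St
pse_st_co = (w1 / w2) * St
# Co-St order: d = w1*Co - w2*St = 0  =>  PSE = (w2/w1) * St
pse_co_st = (w2 / w1) * St

print(pse_st_co, pse_co_st)   # PSEs fall on opposite sides of the 1000 ms standard
```

This order-dependent PSE shift is the time-order error; the paper's point is that the same weighting machinery also predicts order-dependent difference limens (the standard-position effect), with the sign depending on interval type and standard duration.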

Relevance:

90.00%

Publisher:

Abstract:

Few studies have investigated causal pathways linking psychosocial factors to each other and to screening mammography. Conflicting hypotheses exist in the theoretical literature regarding the role and importance of subjective norms, a person's perceived social pressure to perform the behavior and his/her motivation to comply. The Theory of Reasoned Action (TRA) hypothesizes that subjective norms directly affect intention, while the Transtheoretical Model (TTM) hypothesizes that attitudes mediate the influence of subjective norms on stage of change. No one has examined which hypothesis best predicts the effect of subjective norms on mammography intention and stage of change. Two statistical methods are available for testing mediation, sequential regression analysis (SRA) and latent variable structural equation modeling (LVSEM); however, software to apply LVSEM to dichotomous variables like intention has only recently become available, and no one has compared the methods to determine whether they yield similar results for dichotomous variables. The objectives of this study were to: (1) determine whether the effects of subjective norms on mammography intention and stage of change are mediated by pros and cons; and (2) compare mediation results from the SRA and LVSEM approaches when the outcome is dichotomous. We conducted a secondary analysis of data from a national sample of women veterans enrolled in Project H.O.M.E. (Healthy Outlook on the Mammography Experience), a behavioral intervention trial. Results showed that the TTM model described the causal pathways better than the TRA model; however, we found support for only one of the TTM causal mechanisms, with cons as the sole mediator. The mediated effect of subjective norms on intention and stage of change through cons was very small. These findings suggest that interventionists focus their efforts on reducing negative attitudes toward mammography when resources are limited. Both the SRA and LVSEM methods provided evidence for complete mediation, and the direction, magnitude, and standard errors of the parameter estimates were very similar. Because the SRA parameter estimates were not biased toward the null, we can probably assume negligible measurement error in the independent and mediator variables. Simulation studies are needed to further our understanding of how these two methods perform under different data conditions.
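The sequential-regression route to a mediation test can be sketched on simulated data. Variable names and effect sizes below are invented, and the study's dichotomous intention outcome is replaced by a continuous one so that the sketch stays within ordinary least squares.

```python
import numpy as np

# X = subjective norms, M = cons (mediator), Y = outcome.
# Data are generated with complete mediation: X affects Y only through M.
rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=n)
M = 0.8 * X + rng.normal(scale=0.5, size=n)   # a-path = 0.8
Y = 0.7 * M + rng.normal(scale=0.5, size=n)   # b-path = 0.7, no direct effect

def ols(y, *cols):
    """Least-squares slopes (intercept included in the fit, then dropped)."""
    Z = np.column_stack([np.ones(len(y)), *cols])
    return np.linalg.lstsq(Z, y, rcond=None)[0][1:]

(c_total,) = ols(Y, X)            # step 1: total effect of X on Y
(a_path,) = ols(M, X)             # step 2: X -> M
c_direct, b_path = ols(Y, X, M)   # step 3: direct effect and M -> Y jointly

# Complete mediation: c_total is close to a*b and c_direct is close to 0.
print(c_total, a_path * b_path, c_direct)
```

LVSEM fits the same paths as a structural model with latent variables; the paper's finding is that, for its data, both routes gave parameter estimates of similar direction, magnitude, and standard error.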

Relevance:

90.00%

Publisher:

Abstract:

Each year, hospitalized patients experience 1.5 million preventable injuries from medication errors, and hospitals incur an additional $3.5 billion in cost (Aspden, Wolcott, Bootman, & Cronenwett, 2007). It is believed that error reporting is one way to learn about factors contributing to medication errors. And yet, an estimated 50% of medication errors go unreported. This period of medication error pre-reporting is, with few exceptions, underexplored. The literature focuses on error prevention and management, but lacks a description of the period of introspection and inner struggle over whether to report an error, and of the resulting likelihood of reporting. Reporting makes a nurse vulnerable to reprimand, legal liability, and even threats to licensure. For some nurses this state may invoke a disparity between their belief about themselves as healers and the undeniable fact of the error. This study explored the medication error reporting experience. Its purpose was to inform nurses, educators, organizational leaders, and policy-makers about the medication error pre-reporting period, and to contribute to a framework for further investigation. From a better understanding of the factors that contribute to or detract from an individual's likelihood of reporting an error, interventions can be identified to help the nurse come to a psychologically healthy resolution and to increase the reporting of errors, so that errors can be learned from and the possibility of future similar errors reduced. The research question was: "What factors contribute to a nurse's likelihood to report an error?" The specific aims of the study were to: (1) describe participant nurses' perceptions of medication error reporting; (2) describe participants' explanations of the emotional, cognitive, and physical reactions to making a medication error; (3) identify pre-reporting conditions that make it less likely for a nurse to report a medication error; and (4) identify pre-reporting conditions that make it more likely for a nurse to report a medication error. A qualitative study was conducted to explore the medication error experience, and in particular the pre-reporting period, from the perspective of the nurse. A total of 54 registered nurses from a large private free-standing not-for-profit children's hospital in the southwestern United States participated in group interviews. The results describe the experience of the nurse, including the physical, emotional, and cognitive responses to the realization of having committed a medication error, and reveal factors that make reporting more or less likely. It is clear from this study that upon realizing that he or she has made a medication error, a nurse's foremost concern is the safety of the patient. Fear was also described by each group of nurses, including fear of physician, manager, peer, and family reactions, and of a possible resulting loss of trust. Another universal response was a struggle with guilt, shame, imperfection, blaming oneself, and questioning one's competence.

Relevance:

90.00%

Publisher:

Abstract:

Multi-center clinical trials are very common in the development of new drugs and devices. One concern in such trials is the effect of individual investigational sites enrolling small numbers of patients on the overall result: can the presence of small centers cause an ineffective treatment to appear effective when the treatment-by-center interaction is not statistically significant? In this research, simulations are used to study the effect that centers enrolling few patients may have on the analysis of clinical trial data. A multi-center clinical trial with 20 sites is simulated to investigate the effect of a new treatment in comparison to a placebo. Twelve of the 20 investigational sites are considered small, each enrolling fewer than four patients per treatment group. Three clinical trials are simulated, with sample sizes of 100, 170 and 300. The simulated data are generated with various characteristics, one in which the treatment should be considered effective and another in which it is not. Qualitative interactions are also produced within the small sites to further investigate the effect of small centers under various conditions. Standard analysis-of-variance methods and the "sometimes-pool" testing procedure are applied to the simulated data. One model investigates treatment and center effects and the treatment-by-center interaction; another investigates the treatment effect alone. These analyses are used to determine the power to detect treatment-by-center interactions and the probability of type I error. We find it is difficult to detect treatment-by-center interactions when only a few investigational sites enrolling a limited number of patients participate in the interaction. However, we find no increased risk of type I error in these situations. In a pooled analysis, when the treatment is not effective, the probability of finding a significant treatment effect in the absence of a significant treatment-by-center interaction is well within the standard limits of type I error.
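The null-case check can be illustrated with a much-simplified simulation: a pooled two-sample t-test stands in for the paper's ANOVA models, and the center sizes are illustrative rather than the exact design. Repeating the trial many times under no true treatment effect estimates the type I error rate of the pooled analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
per_arm = np.array([3] * 12 + [8] * 8)   # 12 small and 8 larger centers, per arm

reps, hits = 2000, 0
for _ in range(reps):
    center_eff = rng.normal(scale=0.5, size=20)   # center-to-center variation
    # Null hypothesis holds: treatment arm has no added effect.
    treat = np.concatenate([rng.normal(center_eff[c], 1.0, per_arm[c]) for c in range(20)])
    ctrl = np.concatenate([rng.normal(center_eff[c], 1.0, per_arm[c]) for c in range(20)])
    hits += stats.ttest_ind(treat, ctrl).pvalue < 0.05

print(hits / reps)   # empirical type I error; should hover near the nominal 0.05
```

A rejection rate near 0.05 is consistent with the paper's conclusion that small centers do not inflate type I error in the pooled analysis, even though detecting an interaction confined to those centers is hard.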

Relevance:

90.00%

Publisher:

Abstract:

A new site with Lateglacial palaeosols covered by 0.8-2.4 m of aeolian sands is presented. The buried soils were subjected to multidisciplinary analyses (pedology, micromorphology, geochronology, dendrology, palynology, macrofossils). The buried soil cover comprises a catena from relatively dry ('Nano'-Podzol, Arenosol) via moist (Histic Gleysol, Gleysol) to wet conditions (Histosol). The dry soils are similar to the so-called Usselo soil described from sites in NW Europe and central Poland. The buried soil surface covers ca. 3.4 km². Pollen analyses date this surface to the late Allerød. Due to possible contamination by younger carbon, the radiocarbon dates are too young. OSL dates indicate that the covering by aeolian sands most probably occurred during the Younger Dryas. Botanical analyses enable the reconstruction of a vegetation pattern typical of the late Allerød. Large wooden remains of pine and birch were recorded.