966 results for Algebra of Errors


Relevance:

90.00%

Publisher:

Abstract:

The spectral reflectance of the sea surface recorded by ocean colour satellite sensors has been used to estimate chlorophyll-a concentrations for decades. However, in bio-optically complex coastal waters, these estimates are compromised by the presence of several other coloured components besides chlorophyll, especially in regions affected by low-salinity waters. The present work aims to (a) describe the influence of the freshwater plume from the La Plata River on the variability of in situ remote sensing reflectance and (b) evaluate the performance of operational ocean colour chlorophyll algorithms applied to Southwestern Atlantic waters, which receive a remarkable seasonal contribution from La Plata River discharges. Data from three oceanographic cruises are used, in addition to a historical regional bio-optical dataset. Deviations between measured and estimated chlorophyll-a concentrations are examined in relation to surface salinity and turbidity gradients to investigate the source of errors in satellite estimates of pigment concentrations. We observed significant seasonal variability in surface reflectance properties that is strongly driven by La Plata River plume dynamics and arises from high levels of inorganic suspended solids and coloured dissolved materials. As expected, existing operational algorithms overestimate the concentration of chlorophyll-a, especially in waters of low salinity (S < 33.5) and high turbidity (Rrs(670) > 0.0012 sr−1). Additionally, an updated version of the regional algorithm is presented, which clearly improves chlorophyll estimation in these types of coastal environments. In general, the techniques presented here allow us to directly distinguish the bio-optical water types to be considered in algorithm studies by the ocean colour community.
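As a hedged illustration of how the thresholds above could be used, here is a minimal Python sketch of an OCx-style band-ratio chlorophyll estimator with a plume-water mask; the polynomial coefficients and function names are hypothetical placeholders, not those of the operational or regional algorithms discussed:

```python
import numpy as np

# Hypothetical 4th-order polynomial coefficients (constant term first);
# NOT the coefficients of any operational or regional algorithm.
A = [0.33, -2.99, 2.72, -1.23, -0.57]

def chl_band_ratio(rrs_blue, rrs_green):
    """Chlorophyll-a (mg m^-3) from a blue/green reflectance ratio,
    in the generic OCx polynomial form."""
    x = np.log10(np.maximum(rrs_blue, 1e-6) / np.maximum(rrs_green, 1e-6))
    return 10.0 ** np.polyval(A[::-1], x)  # polyval wants highest degree first

def plume_affected(salinity, rrs670):
    """Flag pixels where band-ratio estimates are expected to degrade:
    low-salinity (S < 33.5) or high-turbidity (Rrs(670) > 0.0012 sr^-1) waters."""
    return (salinity < 33.5) | (rrs670 > 0.0012)
```

Masking low-salinity, high-turbidity pixels before (or instead of) applying the band ratio mirrors the paper's point that these waters require a dedicated regional algorithm.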

Relevance:

90.00%

Publisher:

Abstract:

The quality of temperature and humidity retrievals from the infrared SEVIRI sensors on the geostationary Meteosat Second Generation (MSG) satellites is assessed by means of a one-dimensional variational (1D-VAR) algorithm. The study is performed with the aim of improving the spatial and temporal resolution of available observations to feed analysis systems designed for high-resolution regional-scale numerical weather prediction (NWP) models. The non-hydrostatic forecast model COSMO (COnsortium for Small scale MOdelling) in the ARPA-SIM operational configuration is used to provide background fields. Only clear-sky observations over sea are processed. An optimised 1D-VAR set-up comprising the two water-vapour channels and the three window channels is selected. It maximises the reduction of errors in the model backgrounds while ensuring ease of operational implementation through accurate bias-correction procedures and correct radiative transfer simulations. The 1D-VAR retrieval quality is first quantified in relative terms, using statistics to estimate the reduction in the background model errors. Additionally, the absolute retrieval accuracy is assessed by comparing the analysis with independent radiosonde and satellite observations. The inclusion of satellite data brings a substantial reduction in the warm and dry biases present in the forecast model. Moreover, it is shown that the retrieval profiles generated by the 1D-VAR are well correlated with the radiosonde measurements. Subsequently, the 1D-VAR technique is applied to two three-dimensional case studies: a false-alarm case that occurred in Friuli-Venezia Giulia on 8 July 2004 and a heavy-precipitation case that occurred in the Emilia-Romagna region between 9 and 12 April 2005. The impact of satellite data for these two events is evaluated in terms of increments in the column-integrated water vapour and saturation water vapour, in the 2-metre temperature and specific humidity, and in the surface temperature. To improve the 1D-VAR technique, a method to calculate flow-dependent model error covariance matrices is also assessed. The approach employs members of an ensemble forecast system generated by perturbing physical parameterisation schemes inside the model. The improved set-up applied to the case of 8 July 2004 shows a substantially neutral impact.
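For reference, a 1D-VAR retrieval of this kind minimises the standard variational cost function (notation chosen here; the thesis may use different symbols):

```latex
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
              + \tfrac{1}{2}\,\bigl[\mathbf{y}-H(\mathbf{x})\bigr]^{\mathrm{T}}\mathbf{R}^{-1}\bigl[\mathbf{y}-H(\mathbf{x})\bigr],
```

where x is the temperature/humidity profile being retrieved, x_b the COSMO background profile, B the background error covariance (the flow-dependent matrices assessed at the end replace a static B), y the SEVIRI brightness temperatures in the five selected channels, H the radiative transfer operator, and R the observation error covariance.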

Relevance:

90.00%

Publisher:

Abstract:

We present a nonlinear technique to invert strong-motion records with the aim of obtaining the final slip and rupture velocity distributions on the fault plane. In this thesis, the ground motion simulation is obtained by evaluating the representation integral in the frequency domain. The Green’s tractions are computed using the discrete wave-number integration technique, which provides the full wave-field in a 1D layered propagation medium. The representation integral is computed through a finite-element technique based on a Delaunay triangulation of the fault plane. The rupture velocity is defined on a coarser regular grid, and rupture times are computed by integration of the eikonal equation. For the inversion, the slip distribution is parameterized by 2D overlapping Gaussian functions, which easily relate the spectrum of the possible solutions to the minimum resolvable wavelength determined by the source-station distribution and the data processing. The inverse problem is solved by a two-step procedure aimed at separating the computation of the rupture velocity from the evaluation of the slip distribution, the latter being a linear problem when the rupture velocity is fixed. The nonlinear step is solved by optimization of an L2 misfit function between synthetic and real seismograms, and the solution is searched for using the Neighbourhood Algorithm. The linear step is instead solved with the conjugate gradient method. The developed methodology has been applied to the M7.2 Iwate-Miyagi Nairiku, Japan, earthquake. The estimated seismic moment is 2.63 × 10^26 dyne·cm, which corresponds to a moment magnitude MW 6.9, while the mean rupture velocity is 2.0 km/s. A large slip patch extends from the hypocenter to the southern shallow part of the fault plane. A second relatively large slip patch is found in the northern shallow part. Finally, we give a quantitative estimate of the errors associated with the parameters.
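The quoted moment and magnitude are consistent with the standard Hanks-Kanamori relation (moment in dyne·cm):

```latex
M_W = \tfrac{2}{3}\log_{10} M_0 - 10.7,
\qquad
M_0 = 2.63\times 10^{26}\ \mathrm{dyne\cdot cm}
\;\Rightarrow\;
M_W = \tfrac{2}{3}(26.42) - 10.7 \approx 6.9 .
```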

Relevance:

90.00%

Publisher:

Abstract:

The main part of this thesis describes a method of calculating the massless two-loop two-point function which allows expanding the integral up to an arbitrary order in the dimensional regularization parameter epsilon by rewriting it as a double Mellin-Barnes integral. Closing the contour and collecting the residues then transforms this integral into a form that enables us to utilize S. Weinzierl's computer library nestedsums. We were able to show that multiple zeta values and rational numbers are sufficient for expanding the massless two-loop two-point function to all orders in epsilon. We then use the Hopf algebra of Feynman diagrams and its antipode to investigate the appearance of Riemann's zeta function in counterterms of Feynman diagrams in massless Yukawa theory and massless QED. The class of Feynman diagrams we consider consists of graphs built from primitive one-loop diagrams and the non-planar vertex correction, where the vertex corrections depend on only one external momentum. We show the absence of powers of pi in the counterterms of the non-planar vertex correction and of diagrams built by shuffling it with the one-loop vertex correction. We also find that some coefficients of zeta functions are invariant under a change of momentum flow through these vertex corrections.
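The rewriting rests on the standard Mellin-Barnes splitting formula:

```latex
\frac{1}{(A+B)^{\lambda}}
  = \frac{1}{\Gamma(\lambda)}\,\frac{1}{2\pi i}
    \int_{-i\infty}^{+i\infty} \! dz\;
    \Gamma(\lambda+z)\,\Gamma(-z)\,\frac{B^{z}}{A^{\lambda+z}},
```

where the contour separates the poles of Γ(−z) from those of Γ(λ+z). Applying it twice to the two-loop integrand yields the double Mellin-Barnes integral, and closing the contours turns the residues into exactly the nested sums handled by nestedsums.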

Relevance:

90.00%

Publisher:

Abstract:

This thesis focuses on techniques that enable reliable transmission of multimedia content in streaming and broadcasting applications, targeting video content in particular. The design of efficient error-control mechanisms to enhance the reliability of video transmission systems has been addressed, considering cross-layer and multi-layer/multi-dimensional channel coding techniques to cope with bit errors as well as packet erasures. Mechanisms for unequal time interleaving have been designed as a viable solution to reduce the impact of errors and erasures by acting on the time diversity of the data flow, thus enhancing robustness against correlated channel impairments. To account for the factors that affect the physical-layer channel when evaluating the performance of FEC schemes, an ad hoc error-event model has been devised. In addition, the impact of error correction/protection techniques on the quality perceived by consumers of video service applications has been studied, together with techniques for objective and subjective quality evaluation. The applicability and value of the proposed techniques have been tested under the practical constraints and requirements of real system implementations.
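The unequal time-interleaving mechanisms are specific to the thesis, but the underlying building block is the classic block interleaver. A minimal Python sketch (function names are illustrative):

```python
def block_interleave(symbols, rows, cols):
    """Classic block interleaver: write row by row, read column by column.
    Spreads a burst of up to `rows` consecutive channel errors across
    `rows` different codewords. Minimal sketch; real systems pad/stream."""
    assert len(symbols) == rows * cols, "sketch assumes one full block"
    return [symbols[r * cols + c] for c in range(cols) for r in range(rows)]

def block_deinterleave(symbols, rows, cols):
    """Inverse permutation: write column by column, read row by row."""
    out = [None] * (rows * cols)
    i = 0
    for c in range(cols):
        for r in range(rows):
            out[r * cols + c] = symbols[i]
            i += 1
    return out

data = list(range(12))
assert block_deinterleave(block_interleave(data, 3, 4), 3, 4) == data
```

An unequal scheme would vary `rows` (the interleaving depth, and hence the latency) per priority class of the video stream, which is the time-diversity trade-off the thesis exploits.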

Relevance:

90.00%

Publisher:

Abstract:

Modern imaging technologies, such as computed tomography (CT), represent a great challenge in forensic pathology. The field of forensics has experienced a rapid increase in the use of these new techniques to support investigations of critical cases, as indicated by the implementation of CT scanning by different forensic institutions worldwide. Advances in CT imaging techniques over the past few decades have led some authors to propose that virtual autopsy, a radiological method applied to post-mortem analysis, is a reliable alternative to traditional autopsy, at least in certain cases. The authors investigate the occurrence and causes of errors and mistakes in diagnostic imaging applied to virtual autopsy. A case of suicide by gunshot wound was submitted to full-body CT scanning before autopsy. We compared the initial examination of the sectional images with the autopsy findings and found a preliminary misdiagnosis: a peritoneal lesion caused by the gunshot was missed owing to a radiologist's error. We then discuss an emerging issue, the risk of diagnostic failure in virtual autopsy due to radiologist error, which parallels what occurs in clinical radiology practice.

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: Physiological data obtained with the pulmonary artery catheter (PAC) are susceptible to errors in measurement and interpretation. Little attention has been paid to the relevance of errors in hemodynamic measurements performed in the intensive care unit (ICU). The aim of this study was to assess the errors related to the technical aspects (zeroing and reference level) and the actual measurement (curve interpretation) of the pulmonary artery occlusion pressure (PAOP). METHODS: Forty-seven participants in a special ICU training program and 22 ICU nurses were tested without prior announcement. All participants had previously been exposed to the clinical use of the method. The first task was to set up a pressure measurement system for the PAC (zeroing and reference level) and the second to measure the PAOP. RESULTS: The median difference from the reference mid-axillary zero level was −3 cm (−8 to +9 cm) for physicians and −1 cm (−5 to +1 cm) for nurses. The median difference from the reference PAOP was 0 mmHg (−3 to 5 mmHg) for physicians and 1 mmHg (−1 to 15 mmHg) for nurses. When PAOP values were adjusted for the differences from the reference transducer level, the median differences from the reference PAOP values were 2 mmHg (−6 to 9 mmHg) for physicians and 2 mmHg (−6 to 16 mmHg) for nurses. CONCLUSIONS: Measurement of the PAOP is susceptible to substantial error as a result of practical mistakes. Comparison of results between ICUs or practitioners is therefore not possible.

Relevance:

90.00%

Publisher:

Abstract:

Medical errors originating in health care facilities are a significant source of preventable morbidity, mortality, and healthcare costs. Voluntary error report systems that collect information on the causes and contributing factors of medical errors, regardless of the resulting harm, may be useful for developing effective harm prevention strategies. Some patient safety experts question the utility of data from errors that did not lead to harm to the patient, also called near misses. A near miss (a.k.a. close call) is an unplanned event that did not result in injury to the patient; only a fortunate break in the chain of events prevented injury. We use data from a large voluntary reporting system of 836,174 medication errors from 1999 to 2005 to provide evidence that the causes and contributing factors of errors that result in harm are similar to the causes and contributing factors of near misses. We develop Bayesian hierarchical models for estimating the log odds of selecting a given cause (or contributing factor) of error given that harm has occurred and the log odds of selecting the same cause given that harm did not occur. The posterior distribution of the correlation between these two vectors of log odds is used as a measure of the evidence supporting the use of data from near misses and their causes and contributing factors to prevent medical errors. In addition, we identify the causes and contributing factors that have the highest or lowest log-odds ratio of harm versus no harm. These causes and contributing factors should also be a focus in the design of prevention strategies. This paper provides important evidence on the utility of data from near misses, which constitute the vast majority of errors in our data.
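A simplified empirical version of this comparison can be sketched in a few lines of Python; the counts below are invented for illustration, and the paper itself places hierarchical Bayesian priors on the log odds rather than using raw frequencies:

```python
import numpy as np

# For each cause, compute the log odds that it is cited among harm
# reports and among near-miss reports, then correlate the two vectors.
harm_counts = np.array([120, 40, 15, 300, 60])       # cause cited, harm (made up)
harm_total = 2000
near_counts = np.array([1500, 520, 210, 3900, 800])  # cause cited, near miss (made up)
near_total = 26000

def log_odds(k, n):
    p = (k + 0.5) / (n + 1.0)  # small continuity correction
    return np.log(p / (1 - p))

lo_harm = log_odds(harm_counts, harm_total)
lo_near = log_odds(near_counts, near_total)
# Correlation near 1 => harm and near-miss cause profiles are similar,
# which is the paper's argument for using near-miss data in prevention.
print(np.corrcoef(lo_harm, lo_near)[0, 1])
```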

Relevance:

90.00%

Publisher:

Abstract:

Direction-of-arrival (DOA) estimation is susceptible to errors introduced by the presence of real ground and resonant-size scatterers in the vicinity of the antenna array. To compensate for these errors, pre-calibration and auto-calibration techniques are presented. The effects of real-ground constituent parameters on the mutual coupling (MC) of wire-type antenna arrays for DOA estimation are investigated. This is accomplished by pre-calibration of the antenna array over the real ground using the finite element method (FEM). The mutual impedance matrix is pre-estimated and used to remove the perturbations in the received terminal voltage. The unperturbed terminal voltage is incorporated into the MUSIC algorithm to estimate DOAs. First, the MC of quarter-wave monopole antenna arrays is investigated. This work augments an existing MC compensation technique for ground-based antennas and proposes a reduction in MC for antennas over finite ground as compared to perfect ground. A factor-of-4 decrease in both the real and imaginary parts of the MC is observed when considering a poor ground versus a perfectly conducting one for quarter-wave monopoles in the receiving mode. A simulated result showing the compensation of errors in direction-of-arrival (DOA) estimation with an actual realization of the environment is also presented. Second, the effects on the received MC of λ/2 dipole arrays placed near real earth are investigated. As a rule of thumb, estimation of mutual coupling can be divided into two regions of antenna height, that is, very near ground 0
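A minimal Python sketch of the compensation idea, under simplifying assumptions (uniform linear array, and a made-up tridiagonal coupling matrix standing in for the FEM-derived mutual impedance matrix):

```python
import numpy as np

M, N, d = 6, 200, 0.5              # elements, snapshots, spacing (wavelengths)
true_deg = [-20.0, 35.0]           # true DOAs
rng = np.random.default_rng(0)

def steering(theta_deg):
    n = np.arange(M)[:, None]
    return np.exp(-2j * np.pi * d * n * np.sin(np.deg2rad(theta_deg)))

A = steering(np.array(true_deg))                     # M x K steering matrix
S = rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))
C = np.eye(M) + 0.15 * np.eye(M, k=1) + 0.15 * np.eye(M, k=-1)  # toy MC matrix
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
X = C @ (A @ S) + noise                              # coupled snapshots

Xc = np.linalg.solve(C, X)                           # remove coupling: C^-1 X
R = Xc @ Xc.conj().T / N                             # sample covariance
w, V = np.linalg.eigh(R)                             # ascending eigenvalues
En = V[:, : M - 2]                                   # noise subspace (K=2 known)
grid = np.linspace(-90.0, 90.0, 1801)
P = 1.0 / np.linalg.norm(En.conj().T @ steering(grid), axis=0) ** 2
loc = [i for i in range(1, len(grid) - 1) if P[i] > P[i - 1] and P[i] >= P[i + 1]]
loc.sort(key=lambda i: P[i], reverse=True)
print(np.sort(grid[loc[:2]]))                        # peaks near -20 and 35 deg
```

Without the `np.linalg.solve(C, X)` step, the coupling matrix distorts the array manifold and the MUSIC peaks shift or smear, which is precisely the error source the pre-calibration targets.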

Relevance:

90.00%

Publisher:

Abstract:

Background: Dementia is a multifaceted disorder that impairs cognitive functions, such as memory, language, and the executive functions necessary to plan, organize, and prioritize tasks required for goal-directed behaviors. In most cases, individuals with dementia experience difficulties interacting with physical and social environments. The purpose of this study was to establish the ecological validity and initial construct validity of a fire evacuation Virtual Reality Day-Out Task (VR-DOT) environment, based on performance profiles, as a screening tool for early dementia. Objective: The objectives were (1) to examine the relationships among the performances of 3 groups of participants in the VR-DOT and traditional neuropsychological tests employed to assess executive functions, and (2) to compare the performance of participants with mild Alzheimer’s-type dementia (AD) to those with amnestic single-domain mild cognitive impairment (MCI) and healthy controls in the VR-DOT and traditional neuropsychological tests used to assess executive functions. We hypothesized that the 2 cognitively impaired groups would have distinct performance profiles and show significantly impaired independent functioning in activities of daily living (ADL) compared to the healthy controls. Methods: The study population included 3 groups: 72 healthy control elderly participants, 65 amnestic MCI participants, and 68 mild AD participants. A natural user interface framework based on a fire evacuation VR-DOT environment was used to assess the physical and cognitive abilities of seniors over 3 years. VR-DOT focuses on the subtle errors and patterns in performing everyday activities and has the advantage of not depending on a subjective rating of an individual person. We further assessed functional capacity with neuropsychological tests (including measures of attention, memory, working memory, executive functions, language, and depression). We also evaluated performance in finger tapping, grip strength, stride length, gait speed, and chair stands, both separately and while performing VR-DOTs, in order to correlate these measures with VR-DOT performance, because performance while navigating a virtual environment is a valid and reliable indicator of cognitive decline in elderly persons. Results: The mild AD group was more impaired than the amnestic MCI group, and both were more impaired than healthy controls. The novel VR-DOT functional index correlated with standard cognitive and functional measurements, such as mini-mental state examination (MMSE; rho=0.26, P=.01) and Bristol Activities of Daily Living (ADL) scale scores (rho=0.32, P=.001). Conclusions: Functional impairment is a defining characteristic of predementia and is partly dependent on the degree of cognitive impairment. The novel virtual reality measures of functional ability seem more sensitive to functional impairment than qualitative measures in predementia, thus accurately differentiating these groups from healthy controls. We conclude that VR-DOT is an effective tool for discriminating predementia and mild AD from controls by detecting differences in terms of errors, omissions, and perseverations while measuring ADL functional ability.

Relevance:

90.00%

Publisher:

Abstract:

In situ diffusion experiments are performed in geological formations at underground research laboratories to overcome the limitations of laboratory diffusion experiments and investigate scale effects. Tracer concentrations are monitored at the injection interval during the experiment (dilution data) and measured from host rock samples around the injection interval at the end of the experiment (overcoring data). Diffusion and sorption parameters are derived from the inverse numerical modeling of the measured tracer data. The identifiability and the uncertainties of tritium and Na-22(+) diffusion and sorption parameters are studied here by synthetic experiments having the same characteristics as the in situ diffusion and retention (DR) experiment performed on Opalinus Clay. Contrary to previous identifiability analyses of in situ diffusion experiments, which used either dilution or overcoring data at approximate locations, our analysis of the parameter identifiability relies simultaneously on dilution and overcoring data, accounts for the actual position of the overcoring samples in the claystone, uses realistic values of the standard deviation of the measurement errors, relies on model identification criteria to select the most appropriate hypothesis about the existence of a borehole disturbed zone and addresses the effect of errors in the location of the sampling profiles. The simultaneous use of dilution and overcoring data provides accurate parameter estimates in the presence of measurement errors, allows the identification of the right hypothesis about the borehole disturbed zone and diminishes other model uncertainties such as those caused by errors in the volume of the circulation system and the effective diffusion coefficient of the filter. The proper interpretation of the experiment requires the right hypothesis about the borehole disturbed zone. A wrong assumption leads to large estimation errors. The use of model identification criteria helps in the selection of the best model. Small errors in the depth of the overcoring samples lead to large parameter estimation errors. Therefore, attention should be paid to minimize the errors in positioning the depth of the samples. The results of the identifiability analysis do not depend on the particular realization of random numbers.
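For illustration, a model-identification criterion such as AIC can rank the "borehole disturbed zone" hypothesis against the simpler alternative; this Python sketch assumes hypothetical fit results and is not necessarily the criterion used in the study:

```python
import math

def aic(rss, n, k):
    """Akaike information criterion for a least-squares fit with n residuals
    and k estimated parameters (Gaussian measurement errors assumed)."""
    return n * math.log(rss / n) + 2 * k

n = 180                               # dilution + overcoring data points (made up)
aic_bdz = aic(rss=2.1, n=n, k=5)      # model including disturbed-zone parameters
aic_nobdz = aic(rss=3.4, n=n, k=3)    # simpler model without them
print("prefer BDZ model" if aic_bdz < aic_nobdz else "prefer simpler model")
```

The penalty term 2k guards against choosing the disturbed-zone model merely because its extra parameters fit noise, which matches the abstract's warning that the wrong hypothesis leads to large estimation errors.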

Relevance:

90.00%

Publisher:

Abstract:

When tilted sideways, participants misperceive the visual vertical assessed by means of a luminous line in otherwise complete darkness. A recent modeling approach (De Vrijer et al., 2009) claimed that these typical patterns of errors (known as A- and E-effects) could be explained by assuming that participants behave in a Bayes-optimal manner. In this study, we experimentally manipulate participants’ prior information about body-in-space orientation and measure the effect of this manipulation on the subjective visual vertical (SVV). Specifically, we explore the effects of veridical and misleading instructions about body tilt orientations on the SVV. We used a psychophysical 2AFC SVV task at roll tilt angles of 0 degrees, 16 degrees, and 4 degrees CW and CCW. Participants were tilted to 4 degrees under different instruction conditions: in one condition, participants received veridical instructions as to their tilt angle, whereas in another condition, participants received the misleading instruction that their body position was perfectly upright. Our results indicate systematic differences between the instruction conditions at 4 degrees CW and CCW. Participants did not simply use an ego-centric reference frame in the misleading condition; instead, participants’ estimates of the SVV seem to lie between their head’s Z-axis and the estimate of the SVV as measured in the veridical condition. All participants displayed A-effects at roll tilt angles of 16 degrees CW and CCW. We discuss our results in the context of the Bayesian model by De Vrijer et al. (2009), and claim that this pattern of results is consistent with a manipulation of the precision of a prior distribution over body-in-space orientations. Furthermore, we introduce a Bayesian generalized linear model for estimating the parameters of participants’ psychometric functions, which allows us to jointly estimate group-level and individual-level parameters under all experimental conditions simultaneously, rather than relying on the traditional two-step approach to obtaining group-level parameter estimates.
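The Bayesian account can be reduced to a one-line Gaussian sketch (notation ours, not De Vrijer et al.'s): with a tilt likelihood centred on the sensed tilt θ_oto with variance σ_oto², and a prior over body-in-space orientation centred on upright (0°) with variance σ_prior², the posterior mean tilt estimate is

```latex
\hat{\theta} \;=\; \frac{\sigma_{\mathrm{prior}}^{2}}{\sigma_{\mathrm{prior}}^{2}+\sigma_{\mathrm{oto}}^{2}}\,\theta_{\mathrm{oto}},
```

which is shrunk toward upright. Underestimated body tilt produces the A-effect, and the misleading "you are upright" instruction plausibly acts by increasing the prior's precision (decreasing σ_prior), strengthening the shrinkage, consistent with the precision-manipulation interpretation above.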

Relevance:

90.00%

Publisher:

Abstract:

PURPOSE To investigate the likelihood of speaking up about patient safety in oncology and to clarify the effect of clinical and situational context factors on the likelihood of voicing concerns. PATIENTS AND METHODS 1013 nurses and doctors in oncology rated four clinical vignettes describing coworkers' errors and rule violations in a self-administered factorial survey (65% response rate). Multiple regression analysis was used to model the likelihood of speaking up as an outcome of vignette attributes, responders' evaluations of the situation, and personal characteristics. RESULTS Respondents reported a high likelihood of speaking up about patient safety, but the variation between and within types of errors and rule violations was substantial. Staff without managerial function reported significantly higher decision difficulty and discomfort with speaking up. Based on the information presented in the vignettes, 74%-96% would speak up to a supervisor failing to check a prescription, 45%-81% would point out a missed hand disinfection to a coworker, 82%-94% would speak up to nurses who violate a safety rule in medication preparation, and 59%-92% would question a doctor violating a safety rule in lumbar puncture. Several vignette attributes predicted the likelihood of speaking up. Perceived potential harm, anticipated discomfort, and decision difficulty were significant predictors of the likelihood of speaking up. CONCLUSIONS Clinicians' willingness to speak up about patient safety is considerably affected by contextual factors. Physicians and nurses without managerial function report substantial discomfort with speaking up. Oncology departments should provide staff with clear guidance and training on when and how to voice safety concerns.

Relevance:

90.00%

Publisher:

Abstract:

Statement of the problem and public health significance. Hospitals were designed to be a safe haven and respite from disease and illness. However, a large body of evidence points to preventable errors in hospitals as the eighth leading cause of death among Americans. Twelve percent of Americans, or over 33.8 million people, are hospitalized each year. This population represents a significant portion of at-risk citizens exposed to hospital medical errors. Since the number of annual deaths due to hospital medical errors is estimated to exceed 44,000, the magnitude of this tragedy makes it a significant public health problem. Specific aims. The specific aims of this study were threefold. First, this study aimed to analyze the state of the states' mandatory hospital medical error reporting six years after the release of the influential IOM report, "To Err is Human." The second aim was to identify barriers to the reporting of medical errors by hospital personnel. The third aim was to identify hospital safety measures implemented to reduce medical errors and enhance patient safety. Methods. A descriptive, longitudinal, retrospective design was used to address the first objective. The study data came from the twenty-one states with mandatory hospital reporting programs that report aggregate hospital error data accessible to the public via state websites. The data analysis included calculations of the expected number of medical errors for each state according to IOM rates. Where possible, a comparison was made between state-reported data and the calculated IOM expected number of errors. A literature review was performed to achieve the second study aim, identifying barriers to reporting medical errors. The final aim was accomplished through telephone interviews of principal patient safety/quality officers from five Texas hospitals with more than 700 beds. Results. The state medical error data suggest vast underreporting of hospital medical errors to the states. The telephone interviews suggest that hospitals are working to reduce medical errors and create safer environments for patients. The literature review suggests that the underreporting of medical errors at the state level stems from underreporting of errors at the delivery level.
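A back-of-the-envelope version of the expected-error calculation described above can be sketched from the figures quoted in this abstract; the per-state admission counts below are placeholders, and the exact IOM-rate method used in the study may differ:

```python
# IOM figures quoted in the abstract: >= 44,000 annual deaths from hospital
# error, and ~33.8 million annual US hospitalizations (12% of the population).
IOM_LOW_DEATHS = 44_000
US_HOSPITALIZATIONS = 33_800_000
death_rate = IOM_LOW_DEATHS / US_HOSPITALIZATIONS   # ~0.13% per admission

state_admissions = {"Texas": 2_600_000, "Minnesota": 600_000}  # hypothetical
for state, n in state_admissions.items():
    print(f"{state}: expected deaths from hospital error ~ {n * death_rate:,.0f}")
```

Comparing such expected counts with the totals states actually report is what reveals the vast underreporting noted in the results.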

Relevance:

90.00%

Publisher:

Abstract:

The long-term rate of racemization for amino acids preserved in planktonic foraminifera was determined by using independently dated sediment cores from the Arctic Ocean. The racemization rates for aspartic acid (Asp) and glutamic acid (Glu) in the common taxon, Neogloboquadrina pachyderma, were calibrated for the last 150 ka using 14C ages and the emerging Quaternary chronostratigraphy of Arctic Ocean sediments. An analysis of errors indicates realistic age uncertainties of about ±12% for Asp and ±17% for Glu. Fifty individual tests are sufficient to analyze multiple subsamples, identify outliers, and derive robust sample mean values. The new age equation can be applied to verify and refine age models for sediment cores elsewhere in the Arctic Ocean, a critical region for understanding the dynamics of global climate change.
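The underlying kinetics are commonly written in the first-order reversible form for amino acids whose equilibrium D/L ratio is close to 1 (approximately the case for Asp and Glu); this is the textbook form, which the study's calibration may refine for the taxon:

```latex
\ln\!\left(\frac{1+\mathrm{D/L}}{1-\mathrm{D/L}}\right)_{t}
- \ln\!\left(\frac{1+\mathrm{D/L}}{1-\mathrm{D/L}}\right)_{t=0}
= 2kt,
```

so a calibrated rate constant k for each amino acid converts a measured D/L ratio in N. pachyderma tests directly into a sample age t, with the ±12% (Asp) and ±17% (Glu) uncertainties propagating from the calibration.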