848 results for Errors and omission
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This study aimed to assess measurements of temperature and relative humidity obtained with a HOBO data logger under various conditions of exposure to solar radiation, comparing them with those obtained with a temperature/relative humidity probe and a copper-constantan thermocouple psychrometer, which are considered the standards for such measurements. Data were collected over a 6-day period (from 25 March to 1 April, 2010), during which the equipment was monitored continuously and simultaneously. We employed the following combinations of equipment and conditions: a HOBO data logger in full sunlight; a HOBO data logger shielded within a white plastic cup with windows for air circulation; a HOBO data logger shielded within a gill-type shelter (a multi-plate plastic prototype); a copper-constantan thermocouple psychrometer exposed to natural ventilation and protected from sunlight; and a temperature/relative humidity probe under a commercial multi-plate radiation shield. Comparisons between the measurements obtained with the various devices were made on the basis of statistical indicators: linear regression, with coefficient of determination; index of agreement; maximum absolute error; and mean absolute error. The prototype multi-plate (gill-type) shelter used to protect the HOBO data logger was found to provide the best protection against the effects of solar radiation on measurements of temperature and relative humidity. The precision and accuracy of a device that measures temperature and relative humidity depend on an efficient shelter that minimizes the interference caused by solar radiation, thereby avoiding erroneous analysis of the data obtained.
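The agreement indicators named in this abstract are standard and easy to reproduce. Below is a minimal Python sketch, assuming the usual definitions of the coefficient of determination, Willmott's index of agreement and the absolute-error statistics (the study's exact formulations are not reproduced here), with made-up sensor readings:

```python
import numpy as np

def agreement_stats(reference, test):
    """Compare a test sensor series against a reference series.

    Returns the coefficient of determination (r^2) of the linear fit,
    Willmott's index of agreement (d), the maximum absolute error,
    and the mean absolute error.
    """
    reference = np.asarray(reference, dtype=float)
    test = np.asarray(test, dtype=float)

    # Coefficient of determination of the linear regression test ~ reference
    r = np.corrcoef(reference, test)[0, 1]
    r2 = r ** 2

    # Willmott's index of agreement (1 = perfect agreement)
    ref_mean = reference.mean()
    d = 1.0 - np.sum((test - reference) ** 2) / np.sum(
        (np.abs(test - ref_mean) + np.abs(reference - ref_mean)) ** 2
    )

    abs_err = np.abs(test - reference)
    return {"r2": r2, "d": d,
            "max_abs_error": abs_err.max(),
            "mean_abs_error": abs_err.mean()}

# Illustrative values: psychrometer (reference) vs. shielded logger, in deg C
psychrometer = [21.3, 22.8, 25.1, 27.4, 26.0, 23.2]
logger       = [21.6, 23.1, 25.6, 28.1, 26.4, 23.5]
print(agreement_stats(psychrometer, logger))
```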
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Assessment of the suitability of anthropogenic landscapes for wildlife species is crucial for setting priorities for biodiversity conservation. This study aimed to analyse the environmental suitability of a highly fragmented region of the Brazilian Atlantic Forest, one of the world's 25 recognized biodiversity hotspots, for forest bird species. Eight forest bird species were selected for the analyses, based on point counts (n = 122) conducted in April-September 2006 and January-March 2009. Six additional variables (landscape diversity, distance from forest and streams, aspect, elevation and slope) were modelled in Maxent for (1) actual and (2) simulated land cover, based on the forest expansion required by existing Brazilian forest legislation. Models were evaluated by bootstrap or jackknife methods and their performance was assessed by AUC, omission error, binomial probability or p value. All predictive models were statistically significant, with high AUC values and low omission errors. A small proportion of the actual landscape (24.41 +/- 6.31%) was suitable for forest bird species. The simulated landscapes led to an increase of c. 30% in total suitable areas. On average, models predicted a small increase (23.69 +/- 6.95%) in the area of suitable native forest for bird species. Being close to forest increased the environmental suitability of landscapes for all bird species; landscape diversity was also a significant factor for some species. In conclusion, this study demonstrates that species distribution modelling (SDM) successfully predicted bird distribution across a heterogeneous landscape at fine spatial resolution, as all models were biologically relevant and statistically significant. The use of landscape variables as predictors contributed significantly to the results, particularly for species distributions over small extents and at fine scales. This is the first study to evaluate the environmental suitability of the remaining Brazilian Atlantic Forest for bird species in an agricultural landscape, and provides important additional data for regional environmental planning.
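Two of the evaluation measures mentioned here, AUC and omission error, can be illustrated with a small sketch. The code below is not Maxent; it only assumes generic suitability scores at withheld presence points and at random background points, plus an arbitrary binarisation threshold:

```python
import numpy as np

def auc_and_omission(presence_scores, background_scores, threshold):
    """Evaluate a presence-only distribution model.

    presence_scores:   model suitability at withheld presence points
    background_scores: model suitability at random background points
    threshold:         suitability cut-off used to binarise the map
    """
    presence_scores = np.asarray(presence_scores, float)
    background_scores = np.asarray(background_scores, float)

    # Rank-based AUC: probability that a presence point scores higher
    # than a background point (ties count as 0.5).
    diff = presence_scores[:, None] - background_scores[None, :]
    auc = (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size

    # Omission error: fraction of presence points predicted unsuitable.
    omission = np.mean(presence_scores < threshold)
    return auc, omission

auc, omission = auc_and_omission(
    presence_scores=[0.81, 0.64, 0.72, 0.45, 0.90],
    background_scores=np.random.default_rng(0).uniform(0, 1, 1000),
    threshold=0.5,
)
print(f"AUC = {auc:.2f}, omission error = {omission:.2f}")
```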
Abstract:
A recent review of the homology concept in cladistics is critiqued in light of the historical literature. Homology as a notion relevant to the recognition of clades remains equivalent to synapomorphy. Some symplesiomorphies are homologies inasmuch as they represent synapomorphies of more inclusive taxa; others are complementary character states that do not imply any shared evolutionary history among the taxa that exhibit the state. Undirected character-state change (as characters optimized on an unrooted tree) is a necessary but not sufficient test of homology, because the addition of a root may alter parsimonious reconstructions. Primary and secondary homology are defended as realistic representations of discovery procedures in comparative biology, recognizable even in Direct Optimization. The epistemological relationship between homology as evidence and common ancestry as explanation is again emphasized. An alternative definition of homology is proposed. (c) The Willi Hennig Society 2012.
Abstract:
Robust analysis of vector fields has been established as an important tool for deriving insights from the complex systems these fields model. Traditional analysis and visualization techniques rely primarily on computing streamlines through numerical integration. The inherent numerical errors of such approaches are usually ignored, leading to inconsistencies that cause unreliable visualizations and can ultimately prevent in-depth analysis. We propose a new representation for vector fields on surfaces that replaces numerical integration through triangles with maps from the triangle boundaries to themselves. This representation, called edge maps, permits a concise description of flow behaviors and is equivalent to computing all possible streamlines at a user-defined error threshold. Independent of this error, streamlines computed using edge maps are guaranteed to be consistent up to floating point precision, enabling the stable extraction of features such as the topological skeleton. Furthermore, our representation explicitly stores spatial and temporal errors, which we use to produce more informative visualizations. This work describes the construction of edge maps, the error quantification, and a refinement procedure to adhere to a user-defined error bound. Finally, we introduce new visualizations using the additional information provided by edge maps to indicate the uncertainty involved in computing streamlines and topological structures.
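To give a flavour of the representation, here is a deliberately simplified toy in Python: each triangle's boundary is parametrised by arc length, and flow through the triangle is encoded as linear maps from entry intervals to exit intervals. The two-triangle mesh, the interval maps and the `trace` helper are invented for illustration and do not reproduce the authors' construction or its stored error bounds:

```python
from dataclasses import dataclass

@dataclass
class IntervalMap:
    """One piece of an edge map: flow entering the triangle boundary in
    [src_lo, src_hi) exits in [dst_lo, dst_hi), linearly interpolated."""
    src_lo: float
    src_hi: float
    dst_lo: float
    dst_hi: float

    def __call__(self, s: float) -> float:
        t = (s - self.src_lo) / (self.src_hi - self.src_lo)
        return self.dst_lo + t * (self.dst_hi - self.dst_lo)

def trace(edge_maps, triangle, s, steps):
    """Follow the flow from triangle to triangle using only boundary maps.

    edge_maps[triangle] is a list of (IntervalMap, next_triangle) pairs and
    s is the arc-length parameter on the current triangle's boundary.
    """
    path = [(triangle, s)]
    for _ in range(steps):
        for m, nxt in edge_maps.get(triangle, []):
            if m.src_lo <= s < m.src_hi:
                s, triangle = m(s), nxt
                path.append((triangle, s))
                break
        else:  # no outgoing map here (e.g. the flow leaves the surface)
            break
    return path

# Toy mesh: two triangles whose shared edge passes the flow back and forth
edge_maps = {
    0: [(IntervalMap(0.0, 0.5, 0.6, 0.9), 1)],
    1: [(IntervalMap(0.6, 0.9, 0.1, 0.4), 0)],
}
print(trace(edge_maps, triangle=0, s=0.25, steps=4))
```

Because tracing only composes boundary-to-boundary maps, repeated traversals of the same triangles return identical parameters, which is the consistency property the abstract emphasises.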
Abstract:
Evidence from appetitive Pavlovian and instrumental conditioning studies suggests that the amygdala is involved in modulation of responses correlated with motivational states and, therefore, in the modulation of processes probably underlying reinforcement omission effects. The present study aimed to clarify whether or not the mechanisms related to reinforcement omission effects of different magnitudes depend on the basolateral complex and central nucleus of the amygdala. Rats were trained on a fixed-interval 12 s with limited hold 6 s signaled schedule in which correct responses were always followed by one of two reinforcement magnitudes. Bilateral lesions of the basolateral complex and central nucleus were made after acquisition of stable performance. After postoperative recovery, the training was changed from 100% to 50% reinforcement schedules. The results showed that lesions of the basolateral complex and central nucleus did not eliminate or reduce reinforcement omission effects, but did interfere with them. Responding after reinforcement omission was higher in both the basolateral complex and central nucleus lesioned groups than in their respective sham-lesioned groups; thus, the lesioned rats were more sensitive to the omission effect. Moreover, the basolateral complex lesions prevented the magnitude effect on reinforcement omission effects: basolateral complex lesioned rats showed no differential performance following omission of the larger and smaller reinforcement magnitudes. Thus, the basolateral complex is involved in incentive processes relative to omission of different reinforcement magnitudes. Therefore, it is possible that reinforcement omission effects are modulated by brain circuitry involving the amygdala. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
Nanoindentation is a valuable tool for characterization of biomaterials due to its ability to measure local properties in heterogeneous, small or irregularly shaped samples. However, applying nanoindentation to compliant, hydrated biomaterials leads to many challenges, including adhesion between the nanoindenter tip and the sample. Although adhesion leads to overestimation of the modulus of compliant samples when analyzing nanoindentation data using traditional analysis techniques, most studies of biomaterials have ignored its effects. This paper demonstrates two methods for managing adhesion in nanoindentation analysis, the nano-JKR force curve method and the surfactant method, through application to two biomedically relevant compliant materials, poly(dimethyl siloxane) (PDMS) elastomers and poly(ethylene glycol) (PEG) hydrogels. The nano-JKR force curve method accounts for adhesion during data analysis using equations based on the Johnson-Kendall-Roberts (JKR) adhesion model, while the surfactant method eliminates adhesion during data collection, allowing data analysis using traditional techniques. In this study, indents performed in air or water resulted in adhesion between the tip and the sample, while testing the same materials submerged in Optifree Express contact lens solution eliminated tip-sample adhesion in most samples. Modulus values from the two methods were within 7% of each other, despite different hydration conditions and evidence of adhesion. Using surfactant also did not significantly alter the properties of the tested material, allowed accurate modulus measurements using commercial software, and facilitated nanoindentation testing in fluids. This technique shows promise for more accurate and faster determination of modulus values from nanoindentation of compliant, hydrated biological samples. Copyright 2013 Elsevier Ltd. All rights reserved.
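The study's nano-JKR force curve method is not reproduced here, but the underlying Johnson-Kendall-Roberts relations are standard and can be sketched as follows; the tip radius, reduced modulus and pull-off force used in the example are arbitrary illustrative values, not the paper's data:

```python
import numpy as np

def jkr_curve(E_r, R, w, a):
    """JKR force-displacement relation, parametrised by contact radius a.

    E_r : reduced modulus, Pa
    R   : spherical tip radius, m
    w   : work of adhesion, J/m^2
    a   : array of contact radii, m
    Returns (force in N, indentation depth in m).
    """
    a = np.asarray(a, float)
    force = 4.0 * E_r * a**3 / (3.0 * R) - np.sqrt(8.0 * np.pi * w * E_r * a**3)
    depth = a**2 / R - np.sqrt(2.0 * np.pi * w * a / E_r)
    return force, depth

def work_of_adhesion_from_pulloff(pull_off_force, R):
    """Invert the JKR pull-off relation |P_off| = (3/2) * pi * w * R."""
    return 2.0 * abs(pull_off_force) / (3.0 * np.pi * R)

# Illustrative numbers: a soft elastomer probed with a large spherical tip
E_r, R = 2.5e6, 50e-6                      # 2.5 MPa, 50 um tip (assumed)
w = work_of_adhesion_from_pulloff(pull_off_force=-3e-6, R=R)
a = np.linspace(0.2e-6, 3e-6, 5)
F, d = jkr_curve(E_r, R, w, a)
for ai, Fi, di in zip(a, F, d):
    print(f"a = {ai*1e6:.2f} um  F = {Fi*1e9:8.1f} nN  depth = {di*1e9:8.1f} nm")
```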
Abstract:
OBJECTIVES: To analyse the frequency of and identify risk factors for patient-reported medical errors in Switzerland. The joint effect of risk factors on error-reporting probability was modelled for hypothetical patients. METHODS: A representative population sample of Swiss citizens (n = 1306) was surveyed as part of the Commonwealth Fund's 2010 International Survey of the General Public's Views of their Health Care System's Performance in Eleven Countries. Data on personal background, utilisation of health care, coordination-of-care problems and reported errors were assessed. Logistic regression analysis was conducted to identify risk factors for patients' reports of medical mistakes and medication errors. RESULTS: 11.4% of participants reported at least one error in their care in the previous two years (8% medical errors, 5.3% medication errors). Poor coordination of care was frequently experienced: 7.8% reported that test results or medical records were not available, 17.2% received conflicting information from care providers and 11.5% reported that tests were ordered although they had already been done. Age (OR = 0.98, p = 0.014), poor health (OR = 2.95, p = 0.007), utilisation of emergency care (OR = 2.45, p = 0.003), inpatient stay (OR = 2.31, p = 0.010) and poor care coordination (OR = 5.43, p < 0.001) were important predictors of reporting an error. For high utilisers of care who combine multiple risk factors, the predicted probability of reporting an error rises to 0.8. CONCLUSIONS: Patient safety remains a major challenge for the Swiss health care system. Beyond the associated health-related and economic burden, the widespread experience of medical error in some subpopulations also has the potential to erode trust in the health care system as a whole.
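The joint effect of several risk factors in a logistic model can be sketched as follows. The odds ratios are those reported above, but the baseline probability (the model intercept) is not given in the abstract and is assumed here purely for illustration, so the printed value only roughly mirrors the reported 0.8:

```python
import math

# Odds ratios reported in the abstract (per risk factor)
ODDS_RATIOS = {
    "age_per_year": 0.98,
    "poor_health": 2.95,
    "emergency_care": 2.45,
    "inpatient_stay": 2.31,
    "poor_coordination": 5.43,
}

def error_report_probability(baseline_probability, age_years, **risk_factors):
    """Joint probability of reporting an error under a logistic model.

    baseline_probability is an assumed reference prevalence (the study's
    intercept is not reported in the abstract); risk_factors are booleans
    keyed like ODDS_RATIOS; age_years is measured from the sample mean.
    """
    logit = math.log(baseline_probability / (1 - baseline_probability))
    logit += age_years * math.log(ODDS_RATIOS["age_per_year"])
    for name, present in risk_factors.items():
        if present:
            logit += math.log(ODDS_RATIOS[name])
    return 1 / (1 + math.exp(-logit))

# Hypothetical high utiliser of care combining several risk factors
p = error_report_probability(
    baseline_probability=0.05,   # assumed intercept, not from the paper
    age_years=0,
    poor_health=True, emergency_care=True,
    inpatient_stay=True, poor_coordination=True,
)
print(f"Predicted probability of reporting an error: {p:.2f}")
```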
Abstract:
Using a weighted up-down procedure, in each of eight conditions 28 participants compared durations of auditory (noise bursts) or visual (LED flashes) intervals, filled or unfilled (empty intervals delimited by 3-ms markers), with or without feedback. Standards (Sts) were 100 and 1000 ms, and the ISI was 900 ms. Presentation orders, intermixed, were St-Comparison (Co) and Co-St. Time-order errors (TOEs) were positive for St = 100 ms and negative for St = 1000 ms. Weber fractions (WFs, JND/St) were lowered by feedback. For visual-filled and visual-empty, WFs were highest for St = 100 ms. For auditory-filled and visual-empty, St interacted with Order: the lowest WFs occurred for St-Co with St = 1000 ms, but for Co-St with St = 100 ms. The lowest average WFs occurred with St-Co for visual-filled, but with Co-St for visual-empty. The results refute the generalization of better discrimination with St-Co than with Co-St (the "type-B effect") and support the notion of sensation weighting: flexibly differential impact weights of the compared durations in generating the response.
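A weighted up-down procedure adapts the comparison level so that the track converges on a chosen percent-correct point. The sketch below follows the usual Kaernbach-style rule with a simulated observer; the step sizes, the observer model and the resulting Weber fractions are invented for illustration and are not the paper's data:

```python
import random

def weighted_up_down(standard_ms, target_p=0.75, start_delta=40.0,
                     step_down=2.0, trials=400, true_jnd=None, seed=1):
    """Weighted up-down staircase for duration discrimination.

    After a correct response the comparison-standard difference (delta) is
    reduced by step_down; after an error it is increased by
    step_down * target_p / (1 - target_p), so the track converges on the
    delta that yields target_p correct responses.
    """
    rng = random.Random(seed)
    step_up = step_down * target_p / (1 - target_p)
    true_jnd = true_jnd if true_jnd is not None else 0.1 * standard_ms
    delta, track = start_delta, []
    for _ in range(trials):
        # Simulated observer: probability correct grows with delta / true JND
        p_correct = 0.5 + 0.5 * min(delta / (2 * true_jnd), 1.0)
        correct = rng.random() < p_correct
        delta = max(1.0, delta - step_down if correct else delta + step_up)
        track.append(delta)
    # Estimate the JND from the second half of the track
    return sum(track[len(track) // 2:]) / (len(track) // 2)

for standard in (100.0, 1000.0):
    jnd = weighted_up_down(standard)
    print(f"St = {standard:6.0f} ms: JND ~ {jnd:5.1f} ms, "
          f"Weber fraction ~ {jnd / standard:.3f}")
```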
Abstract:
The COSMIC-2 mission is a follow-on mission of the Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) with an upgraded payload for improved radio occultation (RO) applications. The objective of this paper is to develop a near-real-time (NRT) orbit determination system, called the NRT National Chiao Tung University (NCTU) system, to support COSMIC-2 in atmospheric applications and to verify the orbit product of COSMIC. The system is capable of automatic determination of the NRT GPS clocks and of the LEO orbit and clock. To assess the NRT (NCTU) system, we use eight days of COSMIC data (March 24-31, 2011), which contain a total of 331 GPS observation sessions and 12 393 RO observable files. The parallel scheduling for independent GPS and LEO estimations and automatic time matching improves the computational efficiency by 64% compared to sequential scheduling. Orbit difference analyses suggest a 10-cm accuracy for the COSMIC orbits from the NRT (NCTU) system, consistent with the NRT University Corporation for Atmospheric Research (UCAR) system. The mean velocity accuracy of the NRT orbits of COSMIC is 0.168 mm/s, corresponding to an error of about 0.051 μrad in the bending angle. The rms differences in the NRT COSMIC clock and in the GPS clocks between the NRT (NCTU) and the postprocessing products are 3.742 and 1.427 ns, respectively. The GPS clocks determined from a partial ground GPS network [from NRT (NCTU)] and a full one [from NRT (UCAR)] result in mean rms frequency stabilities of 6.1E-12 and 2.7E-12, respectively, corresponding to range fluctuations of 5.5 and 2.4 cm and bending angle errors of 3.75 and 1.66 μrad.
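The reported error conversions can be reproduced to first order with two simple relations: bending-angle error is approximately the velocity error divided by the projected LEO-GPS velocity, and clock-induced range fluctuation is approximately the Allan deviation times c times the averaging time. The 3.3 km/s projected velocity and the 30-s averaging interval used below are assumed nominal values, not figures taken from the paper:

```python
C = 299_792_458.0  # speed of light, m/s

def bending_angle_error(velocity_error_m_s, tangential_velocity_m_s=3.3e3):
    """First-order mapping from orbit velocity error to excess-Doppler
    bending-angle error: alpha_err ~ v_err / v_tangential.
    The ~3.3 km/s projected LEO-GPS velocity is an assumed nominal value."""
    return velocity_error_m_s / tangential_velocity_m_s

def range_fluctuation(allan_deviation, averaging_time_s=30.0):
    """Clock-induced range fluctuation over one smoothing interval:
    delta_rho ~ sigma_y(tau) * c * tau (tau = 30 s assumed here)."""
    return allan_deviation * C * averaging_time_s

print(f"0.168 mm/s      -> {bending_angle_error(0.168e-3) * 1e6:.3f} urad")
print(f"sigma_y 6.1e-12 -> {range_fluctuation(6.1e-12) * 100:.1f} cm")
print(f"sigma_y 2.7e-12 -> {range_fluctuation(2.7e-12) * 100:.1f} cm")
```

With these assumed nominal values the conversions land on the 0.051 μrad, 5.5 cm and 2.4 cm figures quoted above.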
Abstract:
In many field or laboratory situations, well-mixed reservoirs like, for instance, injection or detection wells and gas distribution or sampling chambers define boundaries of transport domains. Exchange of solutes or gases across such boundaries can occur through advective or diffusive processes. First, we analyzed situations where the inlet region consists of a well-mixed reservoir in a systematic way by interpreting them in terms of injection type. Second, we discussed the mass balance errors that seem to appear in the case of resident injections. Mixing cells (MC) can be coupled mathematically in different ways to a domain where advective-dispersive transport occurs: by assuming a continuous solute flux at the interface (flux injection, MC-FI), or by assuming a continuous resident concentration (resident injection). In the latter case, the flux leaving the mixing cell can be defined in two ways: either as the value when the interface is approached from the mixing-cell side (MC-RI-), or as the value when it is approached from the column side (MC-RI+). Solutions of these injection types with constant or, in one case, distance-dependent transport parameters were compared to each other as well as to a solution of a two-layer system, where the first layer was characterized by a large dispersion coefficient. These solutions differ mainly at small Peclet numbers. For most real situations, the model for resident injection MC-RI+ is considered to be relevant. This type of injection was modeled with a constant or with an exponentially varying dispersion coefficient within the porous medium. A constant dispersion coefficient will be appropriate for gases because of the Eulerian nature of the usually dominating gaseous diffusion coefficient, whereas the asymptotically growing dispersion coefficient will be more appropriate for solutes due to the Lagrangian nature of mechanical dispersion, which evolves only with the fluid flow. Assuming a continuous resident concentration at the interface between a mixing cell and a column, as in the case of the MC-RI+ model, entails a flux discontinuity. This flux discontinuity arises inherently from the definition of a mixing cell: the mixing process is included in the balance equation, but does not appear in the description of the flux through the mixing cell. There, only convection appears because of the homogeneous concentration within the mixing cell. Thus, the solute flux through a mixing cell in close contact with a transport domain is generally underestimated. This leads to (apparent) mass balance errors, which are often reported for similar situations and erroneously used to judge the validity of such models. Finally, the mixing cell model MC-RI+ defines a universal basis regarding the type of solute injection at a boundary. Depending on the mixing cell parameters, it represents, in its limits, flux as well as resident injections. (C) 1998 Elsevier Science B.V. All rights reserved.
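The flux discontinuity discussed above can be made concrete with a small numerical sketch: a well-mixed cell whose balance contains only convective outflow feeds its resident concentration to the inlet of an advection-dispersion column (an MC-RI+ style coupling). The discretisation and every parameter value below are illustrative assumptions, not the paper's analytical solutions:

```python
import numpy as np

def mixing_cell_column(v=1.0, D=0.05, L=1.0, V_mc=0.2, A=1.0,
                       c0=1.0, n=100, t_end=1.0):
    """Explicit finite-volume sketch of a mixing cell feeding a 1-D
    advection-dispersion column via a continuous resident concentration.

    The cell (volume V_mc, initially at concentration c0) loses mass only
    by convection (v * A * c_mc), while the column inlet receives the full
    advective-dispersive flux evaluated with c_mc as the inlet value; the
    mismatch is the flux discontinuity discussed in the text.
    """
    dx = L / n
    dt = 0.4 * min(dx / v, dx * dx / (2.0 * D))   # crude stability limit
    c = np.zeros(n)           # resident concentration in the column cells
    c_mc, t = c0, 0.0
    while t < t_end:
        padded = np.concatenate(([c_mc], c, [c[-1]]))   # ghost values
        grad = np.diff(padded) / dx                     # gradients at faces
        flux = v * padded[:-1] - D * grad               # upwind + Fickian
        c += dt / dx * (flux[:-1] - flux[1:])
        c_mc += dt * (-v * A * c_mc) / V_mc             # cell balance
        t += dt
    return c_mc, c

c_mc, profile = mixing_cell_column()
print(f"mixing cell: {c_mc:.3f}  column inlet: {profile[0]:.3f}  "
      f"outlet: {profile[-1]:.3f}")
```

In this sketch the dispersive part of the inlet flux is never subtracted from the cell, so the combined cell-plus-column inventory drifts slightly above the initial mass, which is the kind of apparent mass balance error the abstract warns against over-interpreting.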