968 results for Implicit calibration
Abstract:
Leaf water potential (Ψl) is a good indicator of the water status of plants, and continuous monitoring of it can be useful in research and field applications such as irrigation scheduling. Changes in stem diameter (Sd) were used for monitoring Ψl of pot-grown sorghum [Sorghum bicolor (L.) Moench] plants in a glasshouse. This method requires occasional calibration of Sd values against Ψl. Predicted values of Ψl based on a single calibration showed a good correlation with measured Ψl values over a period of 13 d before and after the calibration. The correlation can be further improved with shorter time intervals.
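As a rough illustration of the calibration step described in this abstract, the sketch below fits a simple linear relation between stem diameter and leaf water potential on one calibration day and then predicts Ψl from later Sd readings. The linear form and all numbers are assumptions for illustration, not values from the study.

```python
# Minimal sketch of a single Sd-vs-psi_l calibration (assumed linear relation;
# all numbers are illustrative, not the study's data).
import numpy as np

sd_cal = np.array([10.02, 10.00, 9.97, 9.93, 9.90])   # stem diameter, mm
psi_cal = np.array([-0.4, -0.7, -1.1, -1.5, -1.8])     # measured psi_l, MPa

slope, intercept = np.polyfit(sd_cal, psi_cal, 1)       # least-squares line

def predict_psi(sd):
    """Predict leaf water potential (MPa) from stem diameter (mm)."""
    return slope * sd + intercept

print(predict_psi(9.95))
```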
Abstract:
In this paper, an attempt was made to investigate a fundamental problem related to the flexural waves excited by rectangular transducers. Because of the disadvantages of the Green's function approach for solving this problem, a direct and effective method is proposed using a multiple integral transform method and a contour integration technique. The explicit frequency-domain solutions obtained from this newly developed method are convenient for understanding transducer behavior and for the theoretical optimization and experimental calibration of rectangular transducers. The time-domain solutions can then be obtained easily by using the fast Fourier transform technique. (C) 2001 Elsevier Science B.V. All rights reserved.
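The final step mentioned here, converting a known frequency-domain solution to the time domain with the fast Fourier transform, is sketched below under simplifying assumptions. The single-pole frequency response used is a placeholder, not the flexural-wave solution derived in the paper.

```python
# Sketch: time-domain response recovered from a closed-form frequency-domain
# solution via FFT. The response H(f) below is an assumed placeholder.
import numpy as np

fs = 1e6                                  # sampling rate, Hz
n = 4096
t = np.arange(n) / fs
freqs = np.fft.rfftfreq(n, d=1/fs)

pulse = np.exp(-((t - 50e-6) / 5e-6) ** 2)           # input pulse (time domain)
H = 1.0 / (1.0 + 1j * freqs / 20e3)                  # placeholder frequency response

response = np.fft.irfft(np.fft.rfft(pulse) * H, n)   # time-domain output
print(response[:5])
```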
Abstract:
Expansion tubes operating at total flow enthalpies of 100 MJ kg⁻¹ or more have characteristic test times of 30-50 μs. Under these conditions, the response time of the Pitot pressure measuring device is critical when performing flow calibration studies. The conventional technique of using a commercial pressure transducer protected by shielding has not always proven effective, because of the relatively large (and variable) response time caused by the shielding. A device called the stress wave bar gauge has been designed, calibrated, and shown to be an effective way to measure Pitot pressure, with a response time of only 2-3 μs.
Abstract:
Pasminco Century Mine has developed a geophysical logging system to provide new data for ore mining/grade control and the generation of Short Term Models for mine planning. Previous work indicated the applicability of petrophysical logging for lithology prediction; however, the automation of the method was not considered reliable enough for the development of a mining model. A test survey was undertaken using two diamond-drilled control holes and eight percussion holes. All holes were logged with natural gamma, magnetic susceptibility and density. Calibration of the LogTrans auto-interpretation software using only natural gamma and magnetic susceptibility indicated that both lithology and stratigraphy could be predicted. Development of a capability to enforce stratigraphic order within LogTrans increased the reliability and accuracy of interpretations. After the completion of a feasibility program, Century Mine has invested in a dedicated logging vehicle to log blast holes as well as for use in infill drilling programs. Future refinement of the system may lead to the development of GPS-controlled excavators for mining ore.
Abstract:
This paper describes a rainfall simulator developed for field and laboratory studies that gives great flexibility in the plot size covered, is highly portable, can be used on steep slopes, and is economical in its water use. The simulator uses Veejet 80100 nozzles mounted on a manifold, with the nozzles controlled to sweep to and fro across a plot width of 1.5 m. Effective rainfall intensity is controlled by the frequency with which the nozzles sweep. Spatial uniformity of rainfall on the plots is high, with coefficients of variation (CV) on the body of the plot being 8-10%. Use of the simulator for erosion and infiltration measurements is discussed.
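A minimal sketch of the intensity-control principle described above: with sweeping nozzles, effective rainfall intensity scales with how often the spray passes over the plot. The depth delivered per sweep and the sweep rates below are assumed illustrative values, not the simulator's specifications.

```python
# Effective intensity = depth delivered per nozzle pass x sweep frequency
# (all numbers are assumptions for illustration).
depth_per_sweep_mm = 0.05          # water depth per sweep over the plot, mm
for sweeps_per_hour in (400, 800, 1600):
    intensity = depth_per_sweep_mm * sweeps_per_hour   # mm/h
    print(f"{sweeps_per_hour} sweeps/h -> {intensity:.0f} mm/h")
```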
Abstract:
The IWA Anaerobic Digestion Modelling Task Group was established in 1997 at the 8th World Congress on Anaerobic Digestion (Sendai, Japan) with the goal of developing a generalised anaerobic digestion model. The structured model includes multiple steps describing biochemical as well as physicochemical processes. The biochemical steps include disintegration of homogeneous particulates to carbohydrates, proteins and lipids; extracellular hydrolysis of these particulate substrates to sugars, amino acids, and long-chain fatty acids (LCFA), respectively; acidogenesis from sugars and amino acids to volatile fatty acids (VFAs) and hydrogen; acetogenesis of LCFA and VFAs to acetate; and separate methanogenesis steps from acetate and from hydrogen/CO2. The physicochemical equations describe ion association and dissociation, and gas-liquid transfer. Implemented as a differential and algebraic equation (DAE) set, there are 26 dynamic state concentration variables and 8 implicit algebraic variables per reactor vessel or element. Implemented as differential equations (DE) only, there are 32 dynamic state concentration variables.
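As a loose illustration of the model structure summarised above, the toy sketch below implements a short chain of first-order conversions (hydrolysis, acidogenesis, methanogenesis) as an ODE set, in the spirit of the "differential equations only" formulation in which all states are dynamic. It is not ADM1 itself; the three-state reduction and the rate constants are assumptions for illustration.

```python
# Toy anaerobic-digestion chain as an ODE set (not ADM1; illustrative only).
from scipy.integrate import solve_ivp

k_hyd, k_ac, k_ch4 = 0.3, 0.8, 1.2      # assumed first-order rate constants, 1/d

def rhs(t, y):
    particulate, sugars, acetate = y
    hydrolysis = k_hyd * particulate     # particulates -> sugars
    acidogenesis = k_ac * sugars         # sugars -> acetate
    methanogenesis = k_ch4 * acetate     # acetate -> methane (leaves the system)
    return [-hydrolysis,
            hydrolysis - acidogenesis,
            acidogenesis - methanogenesis]

sol = solve_ivp(rhs, (0, 30), [10.0, 0.0, 0.0], dense_output=True)
print(sol.y[:, -1])
```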
Abstract:
This paper discusses the design and characterisation of a short, and hence portable, impact load cell for in-situ quantification of ore breakage properties under impact loading conditions. Much literature has been published in the past two decades on impact load cells for ore breakage testing. It has been conclusively shown that such machines yield significant quantitative energy-fragmentation information about industrial ores. However, the documented load cells are all laboratory systems that are not adapted for in-situ testing because of their dimensions and operating requirements. The authors report on a new portable impact load cell designed specifically for in-situ testing. The load cell is 1.5 m in height and weighs 30 kg. Its physical and operating characteristics are detailed in the paper, including physical dimensions, calibration and signal deconvolution. Emphasis is placed on the deconvolution issue, which is significant for such a short load cell. Finally, it is shown that the short load cell is quantitatively as accurate as its larger laboratory analogues. (C) 2002 Elsevier Science B.V. All rights reserved.
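A hedged sketch of the signal-deconvolution step emphasised in this abstract is given below: recovering an impact force history from a measured gauge signal, given an assumed impulse response, via regularised frequency-domain division. The impulse response and signals are synthetic placeholders, not the load cell's actual characteristics.

```python
# Regularised frequency-domain deconvolution (Wiener-style); all signals are
# synthetic placeholders used only to illustrate the technique.
import numpy as np

n = 1024
t = np.arange(n)
h = np.exp(-t / 40.0) * np.sin(2 * np.pi * t / 25.0)   # assumed impulse response
force_true = np.exp(-((t - 200) / 15.0) ** 2)           # synthetic impact force
measured = np.convolve(force_true, h)[:n]               # what the gauge records

H = np.fft.rfft(h, n)
Y = np.fft.rfft(measured, n)
eps = 1e-3 * np.max(np.abs(H)) ** 2                     # regularisation term
force_est = np.fft.irfft(Y * np.conj(H) / (np.abs(H) ** 2 + eps), n)
print(np.argmax(force_est))   # recovered peak should sit near sample 200
```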
Abstract:
Background: Diagnosis of the HIV-associated lipodystrophy syndrome is based on clinical assessment, in the absence of a consensus on case definition and reference methods. Three bedside methods were compared with respect to their diagnostic value for lipodystrophy. Patients and Methods: Consecutive HIV-infected outpatients (n = 278) were investigated, 128 of whom also had data from 1997 available. Segmental bioelectrical impedance analysis (BIA) was performed, and waist, hip and thigh circumferences were measured. Changes in seven body regions were rated by physicians and patients using linear analogue scale assessment (LASA). Diagnostic cut-off values were sought by receiver operating characteristic analysis. Results: Lipodystrophy was diagnosed in 85 patients (31%). BIA demonstrated higher fat-free mass in patients with lipodystrophy, but not after controlling for body mass index and sex. Segmental BIA was not superior to whole-body BIA in detecting lipodystrophy. Fat-free mass increased from 1997 to 1999 independently of lipodystrophy. Waist-hip and waist-thigh ratios were higher in patients with lipodystrophy. BIA, anthropometry and LASA did not provide sufficient diagnostic cut-off values for lipodystrophy. Agreement between methods, and between patient and physician ratings, was poor. Conclusion: These methods do not fulfil the urgent need for quantitative diagnostic tools for lipodystrophy. BIA estimates of fat-free mass may be biased by lipodystrophy, indicating a need for re-calibration in HIV-infected populations. (C) 2001 Harcourt Publishers Ltd.
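The cut-off search mentioned above can be illustrated with a small sketch: scan candidate thresholds of a continuous marker (for example a waist-hip ratio) and choose the one maximising the Youden index (sensitivity + specificity - 1). The data below are simulated using the study's group sizes; they are not the study's measurements.

```python
# ROC-style cut-off search by Youden index on simulated marker values.
import numpy as np

rng = np.random.default_rng(0)
marker = np.concatenate([rng.normal(0.95, 0.05, 85),     # lipodystrophy group
                         rng.normal(0.90, 0.05, 193)])   # remaining patients
label = np.concatenate([np.ones(85), np.zeros(193)])

def youden(cut):
    pred = marker >= cut
    sens = pred[label == 1].mean()        # sensitivity at this threshold
    spec = (~pred)[label == 0].mean()     # specificity at this threshold
    return sens + spec - 1

best_cut = max(np.unique(marker), key=youden)
print(best_cut, youden(best_cut))
```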
Abstract:
Two studies investigated interactions between health providers and patients, using Semin and Fiedler's linguistic category model. In Study 1 the linguistic category model was used to examine perceptions of the levels of linguistic intergroup bias in descriptions of conversations with health professionals in hospitals. Results indicated a favourable linguistic bias toward health professionals in satisfactory conversations, but low levels of linguistic intergroup bias in unsatisfactory conversations. In Study 2, the language of patients and health professionals in videotaped interactions was examined for levels of linguistic intergroup bias. Interpersonally salient interactions showed less linguistic intergroup bias than did intergroup ones. Results also indicated that health professionals have high levels of control in all types of medical encounters with patients. Nevertheless, the extent to which patients are able to interact with health professionals as individuals, rather than only as professionals, is a key determinant of satisfaction with the interaction.
Abstract:
As marketers and researchers we understand quality from the consumer's perspective, and throughout the contemporary service quality literature there is an emphasis on what the consumer is looking for, or at least that is the intention. Through examining the underlying assumptions of dominant service quality theories, an implicit dualistic ontology (in which subject and object are considered independent) is highlighted and argued to effectively negate the necessary consumer orientation. This fundamental assumption and its implications are discussed, following a critical review of dominant service quality models. Consequently, we propose an alternative approach to service quality research that aims towards a more genuine understanding of the consumer's perspective on quality experienced within a service context. Essentially, contemporary service quality research is suggested to be limited by its inherent third-person perspective, and the interpretive, specifically phenomenographic, approach put forward here is suggested as a means of achieving a first-person perspective on service quality.
Abstract:
An acceleration-compensated transducer was developed to enable the direct measurement of skin friction in hypervelocity impulse facilities. The gauge incorporated a measurement and acceleration element that employed direct shear of a piezoelectric ceramic. The design integrated techniques to maximize rise time and shear response while minimizing the effects of acceleration, pressure, heat transfer, and electrical interference. The arrangement resulted in a transducer natural frequency near 40 kHz. The transducer was calibrated for shear and acceleration in separate bench tests and was calibrated for pressure within an impulse facility. Uncertainty analyses identified only small experimental errors in the shear and acceleration calibration techniques. Although significant errors were revealed in the method of pressure calibration, total skin-friction measurement errors as low as +/-7-12% were established. The transducer was successfully utilized in a shock tunnel, and sample measurements are presented for flow conditions that simulate a flight Mach number near 8.
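A brief sketch of the kind of uncertainty combination that typically underlies total error figures such as those above: independent error contributions added in quadrature to give a combined measurement uncertainty. The individual percentages are illustrative assumptions, not the paper's reported values.

```python
# Combine independent calibration uncertainties in quadrature (values assumed).
import math

errors_percent = {"shear calibration": 3.0,
                  "acceleration calibration": 2.0,
                  "pressure calibration": 6.0}

total = math.sqrt(sum(e ** 2 for e in errors_percent.values()))
print(f"combined uncertainty ~ +/-{total:.1f}%")
```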
Abstract:
As a component of archaeological investigations on the central Queensland coast, a series of five marine shell specimens live-collected between A.D. 1904 and A.D. 1929 and 11 shell/charcoal paired samples from archaeological contexts were radiocarbon dated to determine local ΔR values. The object of the study was to assess the potential influence of localized variation in the marine reservoir effect on the accurate determination of the age of marine and estuarine shell from archaeological deposits in the area. Results indicate that the routinely applied ΔR value of -5 +/- 35 for northeast Australia is erroneously calculated. The determined values suggest a minor revision to Reimer and Reimer's (2000) recommended value for northeast Australia from ΔR = +11 +/- 5 to +12 +/- 7, and specifically for central Queensland to ΔR = +10 +/- 7, for near-shore open marine environments. In contrast, data obtained from estuarine shell/charcoal pairs demonstrate a general lack of consistency, suggesting estuary-specific patterns of variation in terrestrial carbon input and exchange with the open ocean. Preliminary data indicate that in some estuaries, at some time periods, a ΔR value of more than -155 +/- 55 may be appropriate. In estuarine contexts in central Queensland, a localized estuary-specific correction factor is recommended to account for geographical and temporal variation in C-14 activity. (C) 2002 Wiley Periodicals.
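For illustration, a regional ΔR of the kind reported above is commonly obtained as an inverse-variance weighted mean of the individual paired determinations, as sketched below. The values and uncertainties are placeholders, not the study's data.

```python
# Inverse-variance weighted mean of individual Delta-R determinations
# (illustrative placeholder values).
import numpy as np

delta_r = np.array([5.0, 18.0, 12.0, 9.0, 16.0])     # years
sigma = np.array([30.0, 35.0, 28.0, 40.0, 33.0])     # 1-sigma uncertainties

w = 1.0 / sigma ** 2
mean = np.sum(w * delta_r) / np.sum(w)
err = np.sqrt(1.0 / np.sum(w))
print(f"weighted mean Delta-R = {mean:.0f} +/- {err:.0f}")
```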
Abstract:
In this paper we construct predictor-corrector (PC) methods based on the trivial predictor and stochastic implicit Runge-Kutta (RK) correctors for solving stochastic differential equations. Using colored rooted tree theory and stochastic B-series, the order condition theorem is derived for constructing stochastic RK methods based on PC implementations. We also present detailed order conditions for the PC methods using stochastic implicit RK correctors with strong global order 1.0 and 1.5. A two-stage implicit RK method with strong global order 1.0 and a four-stage implicit RK method with strong global order 1.5 are constructed as correctors. The mean-square stability properties and numerical results of the PC methods based on these two implicit RK correctors are reported.
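A minimal sketch of the predictor-corrector structure described in this abstract is given below: the trivial predictor supplies the starting value, and a drift-implicit Euler-Maruyama step solved by fixed-point iteration acts as the corrector. This toy scheme only illustrates the PC mechanics; it is not one of the order-1.0 or order-1.5 stochastic implicit RK correctors constructed in the paper.

```python
# Trivial predictor + drift-implicit Euler-Maruyama corrector for an SDE
# dY = f(Y) dt + g(Y) dW (toy illustration of the PC structure only).
import numpy as np

def pc_step(y, h, dW, f, g, iters=3):
    y_new = y                                   # trivial predictor: Y^(0) = y_n
    for _ in range(iters):                      # corrector: fixed-point iteration
        y_new = y + h * f(y_new) + g(y) * dW    # drift-implicit Euler-Maruyama
    return y_new

# Example SDE: dY = -2 Y dt + 0.5 dW (linear drift, additive noise)
rng = np.random.default_rng(1)
y, h = 1.0, 0.01
for _ in range(1000):
    y = pc_step(y, h, rng.normal(0.0, np.sqrt(h)),
                f=lambda x: -2.0 * x, g=lambda x: 0.5)
print(y)
```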
Abstract:
A central problem in visual perception concerns how humans perceive stable and uniform object colors despite variable lighting conditions (i.e. color constancy). One solution is to 'discount' variations in lighting across object surfaces by encoding color contrasts, and utilize this information to 'fill in' properties of the entire object surface. Implicit in this solution is the caveat that the color contrasts defining object boundaries must be distinguished from the spurious color fringes that occur naturally along luminance-defined edges in the retinal image (i.e. optical chromatic aberration). In the present paper, we propose that the neural machinery underlying color constancy is complemented by an 'error-correction' procedure which compensates for chromatic aberration, and suggest that error-correction may be linked functionally to the experimentally induced illusory colored aftereffects known as McCollough effects (MEs). To test these proposals, we develop a neural network model which incorporates many of the receptive-field (RF) profiles of neurons in primate color vision. The model is composed of two parallel processing streams which encode complementary sets of stimulus features: one stream encodes color contrasts to facilitate filling-in and color constancy; the other stream selectively encodes (spurious) color fringes at luminance boundaries, and learns to inhibit the filling-in of these colors within the first stream. Computer simulations of the model illustrate how complementary color-spatial interactions between error-correction and filling-in operations (a) facilitate color constancy, (b) reveal functional links between color constancy and the ME, and (c) reconcile previously reported anomalies in the local (edge) and global (spreading) properties of the ME. We discuss the broader implications of these findings by considering the complementary functional roles performed by RFs mediating color-spatial interactions in the primate visual system. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
The notion of implicature was first introduced by Paul Grice (1967, 1989), who defined it essentially as what is communicated less what is said. This definition contributed in part to the proliferation of a large number of different species of implicature by neo-Griceans. Relevance theorists have responded to this by proposing a shift back to the distinction between "explicit" and "implicit" meaning (corresponding to "explicature" and "implicature," respectively). However, they appear to have pared down the concept of implicature too much, ignoring phenomena that may be better treated as implicatures in their overgeneralization of the concept of explicature. These problems have their roots in the fact that explicit and implicit meaning intuitively overlap and thus do not provide a suitable basis for distinguishing implicature from other types of pragmatic phenomena. An alternative conceptualization of implicature, based on the concept of "implying" with which Grice originally associated his notion of implicature, is thus proposed. From this definition, it emerges that implicature constitutes something else inferred by the addressee that is not literally said by the speaker. Instead, it is meant in addition to what the speaker literally says and is consequently defeasible, like all other types of pragmatic phenomena.