969 results for Non-intrusive
Abstract:
The Brix content of pineapple fruit can be non-invasively predicted from the second derivative of near infrared reflectance spectra. Correlations obtained with a NIRSystems 6500 spectrophotometer through multiple linear regression and modified partial least squares analyses in a post-dispersive configuration were comparable with those from a pre-dispersive configuration in terms of accuracy (e.g. coefficient of determination, R2, 0.73; standard error of cross validation, SECV, 1.01 °Brix). The effective depth of sample assessed was slightly greater with the post-dispersive technique (about 20 mm for pineapple fruit), as expected given its higher incident light intensity relative to the pre-dispersive configuration. The effects of environmental variables such as temperature, humidity and external light, and of instrumental variables such as the number of scans averaged to form a spectrum, were considered with respect to the accuracy and precision of the absorbance measurement at 876 nm, a key term in the calibration for Brix, and of the predicted Brix. The application of post-dispersive near infrared technology to in-line assessment of intact fruit in a packing shed environment is discussed.
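The calibration pipeline described here (second derivative of the reflectance spectra followed by multiple linear regression at a few key wavelengths) can be sketched as follows. This is a minimal illustration only: the Savitzky-Golay smoothing parameters, the wavelength grid, the chosen bands and the placeholder data are assumptions, not the settings reported in the study.

```python
# Hypothetical sketch: second-derivative NIR preprocessing followed by a
# multiple linear regression calibration for Brix. Smoothing window,
# polynomial order, wavelength grid and data are illustrative assumptions.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.linear_model import LinearRegression

def second_derivative(log_inv_R, window=15, polyorder=3):
    """Return d2(log 1/R) for each spectrum (rows = samples, cols = wavelengths)."""
    return savgol_filter(log_inv_R, window_length=window, polyorder=polyorder,
                         deriv=2, axis=1)

wavelengths = np.arange(760, 2500, 2)            # assumed 2 nm sampling grid
spectra = np.random.rand(85, wavelengths.size)   # placeholder log(1/R) spectra
brix = np.random.uniform(10, 20, 85)             # placeholder reference Brix values

d2 = second_derivative(spectra)
key_bands = [876, 866, 832]                      # example calibration wavelengths (nm)
cols = [np.argmin(np.abs(wavelengths - w)) for w in key_bands]
model = LinearRegression().fit(d2[:, cols], brix)
print("Calibration R^2:", model.score(d2[:, cols], brix))
```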
Abstract:
The potential of near infra-red (NIR) spectroscopy for non-invasive measurement of fruit quality of pineapple (Ananas comosus var. Smooth Cayenne) and mango (Mangifera indica var. Kensington) fruit was assessed. A remote reflectance fibre optic probe, placed in contact with the fruit skin surface in a light-proof box, was used to deliver monochromatic light to the fruit, and to collect NIR reflectance spectra (760–2500 nm). The probe illuminated and collected reflected radiation from an area of about 16 cm2. The NIR spectral attributes were correlated with pineapple juice Brix and with mango flesh dry matter (DM) measured from fruit flesh directly underlying the scanned area. The highest correlations for both fruit were found using the second derivative of the spectra (d2 log 1/R) and an additive calibration equation. Multiple linear regression (MLR) on pineapple fruit spectra (n = 85) gave a calibration equation using d2 log 1/R at wavelengths of 866, 760, 1232 and 832 nm with a multiple coefficient of determination (R2) of 0.75, and a standard error of calibration (SEC) of 1.21 °Brix. Modified partial least squares (MPLS) regression analysis yielded a calibration equation with R2 = 0.91, SEC = 0.69, and a standard error of cross validation (SECV) of 1.09 °Brix. For mango, MLR gave a calibration equation using d2 log 1/R at 904, 872, 1660 and 1516 nm with R2 = 0.90, SEC = 0.85% DM and a bias of 0.39. Using MPLS analysis, a calibration equation with R2 = 0.98, SEC = 0.54 and SECV = 1.19 was obtained. We conclude that NIR technology offers the potential to assess fruit sweetness in intact whole pineapple and DM in mango fruit to within 1 °Brix and 1% DM, respectively, and could be used for the grading of fruit in fruit packing sheds.
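A standard error of cross validation (SECV) of the kind quoted above is obtained by predicting each sample from a calibration built without it. The sketch below illustrates that idea with ordinary partial least squares and leave-one-out cross-validation; the study used modified PLS (MPLS), and the number of latent variables and the data here are placeholders.

```python
# Hypothetical sketch of a PLS calibration with leave-one-out cross-validation
# to estimate an SECV. Standard PLS stands in for the MPLS used in the study;
# spectra and reference values are placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict, LeaveOneOut

X = np.random.rand(85, 700)          # placeholder d2 log(1/R) spectra
y = np.random.uniform(10, 20, 85)    # placeholder Brix reference values

pls = PLSRegression(n_components=8)  # assumed number of latent variables
y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut()).ravel()
secv = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"SECV: {secv:.2f} °Brix")
```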
Abstract:
The Tasmanian Cancer Registry carried out population-based surveillance of non-melanoma skin cancer (NMSC) from 1978 to 1987. A total of 8,651 NMSC were recorded in 7,160 individuals, representing an age-standardized rate of 161/100,000 per year. Ninety-four percent of cases were based on histological diagnosis. Incidence of basal-cell carcinoma (BCC) was higher than the incidence of squamous-cell carcinoma (SCC). The incidence of NMSC was twice as high in men as in women. Incidence increased substantially with age, more markedly for SCC than BCC. For most body sites, BCC was more frequent, but on highly exposed sites such as the backs of hands, lower limbs in women and ears in men, the incidence of SCC was higher. There was an overall increase of 7% per year in the age-standardized incidence rate of NMSC. The increase was more marked for BCC than for SCC, and was consistent across age groups and both sexes. A first NMSC during the study period was associated with a 12-fold increase among men and a 15-fold increase among women in the risk of development of a new NMSC within 5 years, when compared with the NMSC incidence recorded for the population as a whole.
Abstract:
A modified least mean fourth (LMF) adaptive algorithm applicable to non-stationary signals is presented. The performance of the proposed algorithm is studied by simulation for non-stationarities in bandwidth, centre frequency and gain of a stochastic signal. These non-stationarities are in the form of linear, sinusoidal and jump variations of the parameters. The proposed LMF adaptation is found to have better parameter tracking capability than the LMS adaptation for the same speed of convergence.
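For orientation, a minimal sketch of the standard least mean fourth (LMF) update is given below; the non-stationary modification proposed in the paper is not reproduced, and the step size and filter length are arbitrary choices.

```python
# Minimal sketch of a standard least mean fourth (LMF) adaptive filter.
# The cost is E[e^4], so the stochastic-gradient update uses e**3 where
# LMS would use e. Step size and filter length are arbitrary.
import numpy as np

def lmf_filter(x, d, n_taps=8, mu=1e-3):
    """Adapt weights w so that the filter output tracks the desired signal d."""
    w = np.zeros(n_taps)
    y = np.zeros_like(d, dtype=float)
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]      # most recent samples first
        y[n] = w @ u
        e = d[n] - y[n]
        w += mu * (e ** 3) * u         # LMF update; LMS would use mu * e * u
    return w, y
```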
Abstract:
- Introduction: There is limited understanding of how young adults' driving behaviour varies according to long-term substance involvement. It is possible that regular users of amphetamine-type stimulants (i.e. ecstasy (MDMA) and methamphetamine) may have a greater predisposition to engage in drink/drug driving compared to non-users. We compare offence rates, and self-reported drink/drug driving rates, for stimulant users and non-users in Queensland, and examine contributing factors.
- Methods: The Natural History Study of Drug Use is a prospective longitudinal study using population screening to recruit a probabilistic sample of amphetamine-type stimulant users and non-users aged 19-23 years. At the 4½-year follow-up, consent was obtained to extract data from participants' Queensland driver records (ATS users: n=217, non-users: n=135). Prediction models of offence rates in stimulant users were developed, controlling for factors such as aggression and delinquency.
- Results: Stimulant users were more likely than non-users to have had a drink-driving offence (8.7% vs. 0.8%, p < 0.001). Further, about 26% of ATS users and 14% of non-users self-reported driving under the influence of alcohol during the last 12 months. Among stimulant users, drink-driving was independently associated with last-month high-volume alcohol consumption (incidence rate ratio (IRR): 5.70, 95% CI: 2.24-14.52), depression (IRR: 1.28, 95% CI: 1.07-1.52), low income (IRR: 3.57, 95% CI: 1.12-11.38), and male gender (IRR: 5.40, 95% CI: 2.05-14.21).
- Conclusions: Amphetamine-type stimulant use is associated with an increased long-term risk of drink-driving, due to a number of behavioural and social factors. Inter-sectoral approaches which target long-term behaviours may reduce offending rates.
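Incidence rate ratios of the kind reported above are typically obtained from count-regression models of offence rates. The abstract does not state the exact model used; the sketch below assumes a Poisson GLM with an exposure offset, and the variable names and data are invented for illustration.

```python
# Hypothetical sketch: obtaining incidence rate ratios (IRRs) from a count
# regression of offences. A Poisson GLM with an exposure offset is a common
# choice; column names and values are placeholders, not study data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "offences": [0, 1, 0, 2, 0, 3],
    "years_licensed": [4.5, 4.0, 4.5, 3.5, 4.5, 4.0],
    "high_volume_alcohol": [0, 1, 0, 1, 0, 1],
    "male": [1, 1, 0, 1, 0, 1],
})

X = sm.add_constant(df[["high_volume_alcohol", "male"]])
model = sm.GLM(df["offences"], X, family=sm.families.Poisson(),
               offset=np.log(df["years_licensed"])).fit()
irr = np.exp(model.params)     # exponentiated coefficients are the IRRs
print(irr)
```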
Models as epistemic artefacts: Toward a non-representationalist account of scientific representation
Abstract:
Analogue and digital techniques for linearization of the non-linear input-output relationship of transducers are briefly reviewed. The condition required for linearizing a non-linear function y = f(x) using a non-linear analogue-to-digital converter is explained. A simple technique to construct a non-linear digital-to-analogue converter, based on 'segments of equal digital interval', is described. The technique was used to build an N-DAC which can be employed in a successive approximation or counter-ramp type ADC to linearize the non-linear transfer function of a thermistor-resistor combination. The possibility of achieving an order of magnitude higher accuracy in the measurement of temperature is shown.
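The 'segments of equal digital interval' idea can be illustrated in software: the digital code range is split into equal-width segments, and each segment boundary stores a point on the non-linear transfer curve, so piecewise-linear interpolation approximates the sensor characteristic. The beta-model thermistor parameters and segment count below are assumed values for illustration, not those used in the paper.

```python
# Hypothetical illustration of a piecewise-linear N-DAC built from breakpoints
# stored at equal digital intervals of the temperature code. Thermistor
# parameters are assumed (beta model), not taken from the paper.
import numpy as np

def thermistor_divider_voltage(temp_c, beta=3950.0, r0=10e3, t0=298.15, r_fixed=10e3):
    """Output of a thermistor-resistor divider, normalised to the supply voltage."""
    t_k = temp_c + 273.15
    r_t = r0 * np.exp(beta * (1.0 / t_k - 1.0 / t0))
    return r_fixed / (r_fixed + r_t)

# Breakpoint table over equal digital intervals of the temperature code (0-100 °C).
codes = np.linspace(0, 100, 9)                    # 8 equal digital segments
breakpoints = thermistor_divider_voltage(codes)   # analogue value at each breakpoint

def nonlinear_dac(code):
    """Piecewise-linear N-DAC: interpolate between the stored breakpoints."""
    return np.interp(code, codes, breakpoints)

# A successive-approximation loop comparing the measured divider voltage with
# nonlinear_dac(code) would then read temperature out on a linear code scale.
```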
Abstract:
An analysis has been carried out to study the non-Darcy natural convection flow of Newtonian fluids on a vertical cone embedded in a saturated porous medium with power-law variation of the wall temperature/concentration or heat/mass flux and suction/injection with the streamwise distance x. Both non-similar and self-similar solutions have been obtained. The effects of the non-Darcy parameter, the ratio of the buoyancy forces due to mass and heat diffusion, the variation of wall temperature/concentration or heat/mass flux, and suction/injection on the Nusselt and Sherwood numbers have been studied.
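The abstract does not reproduce the governing equations; for orientation, typical forms assumed in such analyses are a Forchheimer-extended (non-Darcy) momentum law and power-law wall variations with the streamwise distance x, as sketched below (illustrative only).

```latex
% Typical assumed forms (not quoted from the paper): Forchheimer non-Darcy
% momentum law and power-law wall temperature/concentration variation.
\[
  \nabla p \;=\; -\frac{\mu}{K}\,\mathbf{u} \;-\; \frac{c_F\,\rho}{\sqrt{K}}\,\lvert\mathbf{u}\rvert\,\mathbf{u},
  \qquad
  T_w(x) - T_\infty = A\,x^{\lambda},
  \qquad
  C_w(x) - C_\infty = B\,x^{\lambda}.
\]
```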
Abstract:
Recovering the motion of a non-rigid body from a set of monocular images permits the analysis of dynamic scenes in uncontrolled environments. However, the extension of factorisation algorithms for rigid structure from motion to the low-rank non-rigid case has proved challenging. This stems from the comparatively hard problem of finding a linear “corrective transform” which recovers the projection and structure matrices from an ambiguous factorisation. We argue that this greater difficulty is due to the need to find multiple solutions to a non-trivial problem, casting a number of previous approaches as alleviating this issue by either a) introducing constraints on the basis, making the problems nonidentical, or b) incorporating heuristics to encourage a diverse set of solutions, making the problems inter-dependent. While it has previously been recognised that finding a single solution to this problem is sufficient to estimate cameras, we show that it is possible to bootstrap this partial solution to find the complete transform in closed form. However, we acknowledge that our method minimises an algebraic error and is thus inherently sensitive to deviation from the low-rank model. We compare our closed-form solution for non-rigid structure with known cameras to the closed-form solution of Dai et al. [1], which we find to produce only coplanar reconstructions. We therefore recommend that 3D reconstruction error always be measured relative to a trivial reconstruction such as a planar one.
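The ambiguous factorisation referred to above is the shared starting point of these methods: a 2F x P measurement matrix W is factored into rank 3K, after which the factors are only defined up to an invertible 3K x 3K corrective transform G. The sketch below shows only this setup, not the paper's closed-form recovery of G.

```python
# Minimal sketch of the ambiguous low-rank factorisation in non-rigid
# structure from motion: W ~= M_hat @ S_hat, with M_hat and S_hat defined
# only up to a corrective transform G (W = (M_hat G)(G^-1 S_hat)).
import numpy as np

def ambiguous_factorisation(W, K):
    """Return M_hat (2F x 3K) and S_hat (3K x P) with W approximately M_hat @ S_hat."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    r = 3 * K
    M_hat = U[:, :r] * np.sqrt(s[:r])           # scale columns by sqrt singular values
    S_hat = np.sqrt(s[:r])[:, None] * Vt[:r]
    return M_hat, S_hat
```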
Abstract:
In this study I consider what kind of perspective on the mind-body problem is taken, and can be taken, by a philosophical position called non-reductive physicalism. Many positions fall under this label. The form of non-reductive physicalism which I discuss is in essential respects the position taken by Donald Davidson (1917-2003) and Georg Henrik von Wright (1916-2003). I defend their positions and discuss the unrecognized similarities between their views. Non-reductive physicalism combines two theses: (a) Everything that exists is physical; (b) Mental phenomena cannot be reduced to the states of the brain. This means that, according to non-reductive physicalism, the mental aspect of humans (be it a soul, mind, or spirit) is an irreducible part of the human condition. Davidson and von Wright also claim that, in some important sense, the mental aspect of a human being does not reduce to the physical aspect, that there is a gap between these aspects that cannot be closed. I claim that their arguments for this conclusion are convincing. I also argue that whereas von Wright and Davidson give interesting arguments for the irreducibility of the mental, their physicalism is unwarranted. These philosophers do not give good reasons for believing that reality is thoroughly physical. Notwithstanding the materialistic consensus in contemporary philosophy of mind, the ontology of mind is still uncharted territory where real breakthroughs are not to be expected until a radically new ontological position is developed. The third main claim of this work is that the problem of mental causation cannot be solved from the Davidsonian - von Wrightian perspective. The problem of mental causation is the problem of how mental phenomena like beliefs can cause physical movements of the body. As I see it, the essential point of non-reductive physicalism - the irreducibility of the mental - and the problem of mental causation are closely related. If mental phenomena do not reduce to causally effective states of the brain, then what justifies the belief that mental phenomena have causal powers? If mental causes do not reduce to physical causes, then how can we tell when - or whether - the mental causes in terms of which human actions are explained are actually effective? I argue that this - how to decide when mental causes really are effective - is the real problem of mental causation. The motivation to explore and defend a non-reductive position stems from the belief that reductive physicalism leads to serious ethical problems. My claim is that Davidson's and von Wright's ultimate reason for defending a non-reductive view traces back to their belief that a reductive understanding of human nature would be a narrow and possibly harmful perspective. The final conclusion of my thesis is that von Wright's and Davidson's positions provide a starting point from which the current scientistic philosophy of mind can be critically explored further in the future.
Abstract:
This paper presents an approach, based on Lean production philosophy, for rationalising the processes involved in the production of specification documents for construction projects. Current construction literature erroneously depicts the process for the creation of construction specifications as a linear one. This traditional understanding of the specification process often culminates in process wastes. On the contrary, the evidence suggests that, though generalised, the activities involved in producing specification documents are nonlinear. Drawing on the outcome of participant observation, this paper presents an optimised approach for representing construction specifications. Consequently, the actors typically involved in producing specification documents are identified, the processes suitable for automation are highlighted, and the central role of tacit knowledge is integrated into a conceptual template of construction specifications. By applying the transformation, flow, value (TFV) theory of Lean production, the paper argues that value creation can be realised by eliminating the wastes associated with the traditional preparation of specification documents, with a view to integrating specifications into digital models such as Building Information Models (BIM). Therefore, the paper presents an approach for applying the TFV theory as a method for optimising current approaches to generating construction specifications, based on a revised specification-writing model.
Abstract:
The Queensland Shark Control Program (QSCP) aims to protect swimmers at ten beach areas on the east coast of Queensland between Cairns (17°S) and the Gold Coast (28°S). Since its inception in 1962, it has deployed shark nets and baited drumlines in a 'mixed gear strategy' that adapts the type of gear to the characteristics of a site (e.g. extreme tidal range, high-energy wave action, or proximity of turtle breeding areas). The policy has provided swimmer protection, and the incidental capture of non-target species has been lower than that resulting from deployment of nets alone (Dudley 1997; Gribble et al. 1998b). The QSCP is the only major public-safety shark-control program to routinely use mixed gear. Both the New South Wales (Holt 1998) and KwaZulu-Natal (Dudley 1998) programs use nets exclusively, although the KwaZulu-Natal program has recently tested drumlines on an experimental basis (Dudley 1998; Dudley, personal communication).
Abstract:
Non-thermal plasma (NTP) has been introduced over the last few years as a promising after-treatment system for removing nitrogen oxides and particulate matter from diesel exhaust. NTP technology has not yet been commercialised, due to its high energy consumption. Therefore, it is important to seek out new methods to improve NTP performance. Residence time is a crucial parameter in the treatment of engine exhaust emissions. In this paper, different electrode shapes are analysed and the corresponding residence time and NOx removal efficiency are studied. An axisymmetric laminar model is used to obtain the residence time distribution numerically with the FLUENT software. If the mean residence time in an NTP reactor increases, there is a corresponding increase in the reaction time and, consequently, the pollutant removal efficiency increases. Three different screw thread electrodes and a rod electrode are examined. The results show the advantage of screw thread electrodes in comparison with the rod electrode. Furthermore, among the screw thread electrodes, the electrode with a thread width of 1 mm has the highest NOx removal due to its longer residence time and greater number of micro-discharges. The results show that the residence time of the screw thread electrode with a thread width of 1 mm is 21% greater than that of the rod electrode.
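The mean residence time referred to above is the first moment of the residence time distribution, t_mean = ∫ t·E(t) dt / ∫ E(t) dt. The sketch below shows this calculation on a placeholder E(t) curve; in practice E(t) would come from the CFD tracer simulation.

```python
# Hypothetical sketch: mean residence time from a residence time distribution
# (RTD) curve. The time grid and E(t) values are placeholders, not CFD output.
import numpy as np

t = np.linspace(0.0, 0.05, 200)              # time, s (placeholder grid)
E = np.exp(-(t - 0.02) ** 2 / 2e-5)          # placeholder RTD curve, arbitrary units

t_mean = np.trapz(t * E, t) / np.trapz(E, t)
print(f"Mean residence time: {t_mean * 1e3:.2f} ms")
```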
Abstract:
Many statistical forecast systems are available to interested users. In order to be useful for decision-making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanisms and their statistical manifestation have been firmly established, the forecasts must also provide some quantitative evidence of 'quality'. However, the quality of statistical climate forecast systems (forecast quality) is an ill-defined and frequently misunderstood property. Often, providers and users of such forecast systems are unclear about what 'quality' entails and how to measure it, leading to confusion and misinformation. Here we present a generic framework to quantify aspects of forecast quality using an inferential approach to calculate nominal significance levels (p-values), which can be obtained either by directly applying non-parametric statistical tests such as Kruskal-Wallis (KW) or Kolmogorov-Smirnov (KS), or by using Monte Carlo methods (in the case of forecast skill scores). Once converted to p-values, these forecast quality measures provide a means to objectively evaluate and compare temporal and spatial patterns of forecast quality across datasets and forecast systems. Our analysis demonstrates the importance of providing p-values rather than adopting arbitrarily chosen significance levels such as p < 0.05 or p < 0.01, which is still common practice. This is illustrated by applying non-parametric tests (such as KW and KS) and skill scoring methods (LEPS and RPSS) to the 5-phase Southern Oscillation Index classification system using historical rainfall data from Australia, the Republic of South Africa and India. The selection of quality measures is solely based on their common use and does not constitute endorsement. We found that non-parametric statistical tests can be adequate proxies for skill measures such as LEPS or RPSS. The framework can be implemented anywhere, regardless of dataset, forecast system or quality measure. Eventually such inferential evidence should be complemented by descriptive statistical methods in order to fully assist in operational risk management.
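The core of the inferential step, obtaining a p-value from a non-parametric test applied to data grouped by forecast phase, can be sketched as follows. The five SOI phase groups and the rainfall values below are placeholders; only the use of Kruskal-Wallis and two-sample Kolmogorov-Smirnov tests from scipy reflects the tests named above.

```python
# Hypothetical sketch: turning forecast-quality questions into p-values by
# applying non-parametric tests to rainfall grouped by SOI phase.
# Groups and values are placeholders, not the historical datasets used.
import numpy as np
from scipy.stats import kruskal, ks_2samp

rng = np.random.default_rng(0)
# Placeholder: seasonal rainfall totals for each of the 5 SOI phases
rain_by_phase = [rng.gamma(shape=2.0, scale=60.0, size=30) for _ in range(5)]

h_stat, p_kw = kruskal(*rain_by_phase)                       # do any phases differ?
d_stat, p_ks = ks_2samp(rain_by_phase[0], rain_by_phase[4])  # contrast two phases
print(f"Kruskal-Wallis p = {p_kw:.3f}, KS (phase 1 vs 5) p = {p_ks:.3f}")
```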