853 results for on-line
Abstract:
This paper addresses the problem of tracking line segments in on-line handwriting acquired through a digitizer tablet. The approach is based on Kalman filtering to model linear portions of on-line handwriting, particularly handwritten numerals, and to detect abrupt changes in writing direction that signal a model change. It uses a Kalman filter framework constrained by a normalized line equation, in which quadratic terms are linearized through a first-order Taylor expansion. Modeling is carried out under the assumption that the state is deterministic and time-invariant, while detection relies on a double-thresholding mechanism that tests for a violation of this assumption. The first threshold is based on the kinetics of the pen trajectory; the second takes into account the angular jump between the previously observed writing direction and the current one. The proposed method enables real-time processing. To illustrate the methodology, some results obtained on handwritten numerals are presented.
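As a rough illustration of the technique described above, the following minimal sketch (not the authors' implementation) combines an extended Kalman filter constrained by the normalized line equation x·cosθ + y·sinθ = ρ, linearized to first order, with two assumed thresholds (pen speed and angular jump) that trigger a model change. All parameter values and the class name are hypothetical.

```python
# Minimal sketch of a line-constrained Kalman filter for pen-trajectory
# segmentation. State: normalized line parameters (rho, theta); the nonlinear
# residual of the line equation is linearized with a first-order Taylor
# expansion, and two assumed thresholds (speed and angular jump) reset the model.
import numpy as np

class LineSegmentTracker:
    def __init__(self, meas_var=1.0, speed_thresh=5.0, angle_thresh=np.radians(30)):
        self.R = meas_var                 # variance of the scalar line-equation residual
        self.speed_thresh = speed_thresh  # threshold on pen kinetics (assumed value)
        self.angle_thresh = angle_thresh  # threshold on direction jump (assumed value)
        self.x = None                     # state [rho, theta]
        self.P = None                     # state covariance
        self.prev_pt = None

    def _init_segment(self, p0, p1):
        d = p1 - p0
        theta = np.arctan2(d[1], d[0]) + np.pi / 2.0   # normal direction of the new line
        rho = p1 @ np.array([np.cos(theta), np.sin(theta)])
        self.x = np.array([rho, theta])
        self.P = np.eye(2) * 10.0

    def update(self, pt):
        """Feed one pen sample; returns True if a new segment was started."""
        pt = np.asarray(pt, dtype=float)
        if self.prev_pt is None:
            self.prev_pt = pt
            return True
        if self.x is None:
            self._init_segment(self.prev_pt, pt)
            self.prev_pt = pt
            return True

        rho, theta = self.x
        # Change detection: pen kinetics and angular jump w.r.t. the current line.
        step = pt - self.prev_pt
        speed = np.linalg.norm(step)
        direction = np.arctan2(step[1], step[0])
        line_dir = theta - np.pi / 2.0
        dtheta = np.abs(np.angle(np.exp(1j * (direction - line_dir))))
        dtheta = min(dtheta, np.pi - dtheta)           # line direction is sign-ambiguous
        if speed > self.speed_thresh or dtheta > self.angle_thresh:
            self._init_segment(self.prev_pt, pt)       # abrupt change -> new model
            self.prev_pt = pt
            return True

        # EKF measurement update under a deterministic, time-invariant state
        # (no process noise). Residual of the normalized line equation:
        h = pt[0] * np.cos(theta) + pt[1] * np.sin(theta) - rho
        H = np.array([-1.0, -pt[0] * np.sin(theta) + pt[1] * np.cos(theta)])
        S = H @ self.P @ H + self.R
        K = self.P @ H / S
        self.x = self.x + K * (0.0 - h)
        self.P = self.P - np.outer(K, H @ self.P)
        self.prev_pt = pt
        return False
```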
Abstract:
The cheese industry has continually sought a robust method to monitor milk coagulation. Measurement of whey separation is also critical to control cheese moisture content, which affects quality. The objective of this study was to demonstrate that an online optical sensor detecting light backscatter in a vat could be applied to monitor both coagulation and syneresis during cheesemaking. A prototype sensor having a large field of view (LFV) relative to curd particle size was constructed. Temperature, cutting time, and calcium chloride addition were varied to evaluate the response of the sensor over a wide range of coagulation and syneresis rates. The LFV sensor response was related to casein micelle aggregation and curd firming during coagulation and to changes in curd moisture and whey fat contents during syneresis. The LFV sensor has potential as an online, continuous sensor technology for monitoring both coagulation and syneresis during cheesemaking.
Abstract:
The objective of this study was to investigate a novel light backscatter sensor, with a large field of view relative to curd size, for continuous on-line monitoring of coagulation and syneresis to improve control of curd moisture content. A three-level, central composite design was employed to study the effects of temperature, cutting time, and CaCl2 addition on cheesemaking parameters. The sensor signal was recorded and analyzed. The light backscatter ratio followed a sigmoidal increase during coagulation and decreased asymptotically after gel cutting. Curd yield and curd moisture content were predicted from the time to the maximum slope of the first derivative of the light backscatter ratio during coagulation and from the decrease in the sensor response during syneresis. Whey fat was affected by coagulation kinetics and cutting time, suggesting that curd rheological properties at cutting are dominant factors determining fat losses. The proposed technology shows potential for on-line monitoring of coagulation and syneresis.
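To make the derivative-based parameters above more concrete, the sketch below (assumed signal processing, not the authors' code) extracts a simple timing feature from the first derivative of a sampled backscatter-ratio signal and the post-cutting decrease in the response. The exact parameter definitions and smoothing used in the study may differ; the function names and window size are illustrative.

```python
# Illustrative extraction of coagulation/syneresis features from a sampled
# light backscatter ratio: time of the maximum of the first derivative and the
# decrease in the response after gel cutting (assumed definitions).
import numpy as np

def time_of_max_derivative(t, backscatter_ratio, smooth_window=9):
    """Return the time at which the first derivative of the (smoothed)
    light backscatter ratio reaches its maximum."""
    t = np.asarray(t, float)
    r = np.asarray(backscatter_ratio, float)
    r = np.convolve(r, np.ones(smooth_window) / smooth_window, mode="same")
    dr_dt = np.gradient(r, t)                 # first derivative of the ratio
    return t[np.argmax(dr_dt)]

def syneresis_response_decrease(t, backscatter_ratio, t_cut):
    """Decrease in sensor response after gel cutting (assumed definition:
    value at cutting minus the value at the end of the record)."""
    t = np.asarray(t, float)
    r = np.asarray(backscatter_ratio, float)
    after_cut = r[t >= t_cut]
    return after_cut[0] - after_cut[-1]
```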
Abstract:
The potential of a fibre optic sensor, detecting light backscatter in a cheese vat during coagulation and syneresis, to predict curd moisture, fat losses and curd yield was examined. Temperature, cutting time and calcium levels were varied to assess the strength of the predictions over a range of processing conditions. Equations were developed using a combination of independent variables, milk compositional parameters, and light backscatter parameters. Fat losses, curd yield and curd moisture content were predicted with standard errors of prediction (SEP) of ±2.65 g 100 g⁻¹ (R² = 0.93), ±0.95% (R² = 0.90) and ±1.43% (R² = 0.94), respectively. These results were used to develop a model for predicting curd moisture as a function of time during syneresis (SEP = ±1.72%; R² = 0.95). By monitoring coagulation and syneresis, this sensor technology could be employed to control curd moisture content, thereby improving process control during cheese manufacture.
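Prediction equations of the kind reported above are typically ordinary least-squares fits evaluated by R² and SEP. The sketch below is a hypothetical, self-contained example with synthetic data standing in for milk composition and backscatter parameters; the predictor names and coefficients are not from the paper.

```python
# Hypothetical example: fit a linear prediction equation and score it with R^2
# and the standard error of prediction (SEP) on held-out samples.
import numpy as np

def fit_ols(X, y):
    """Least-squares coefficients with an intercept term."""
    Xb = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return beta

def predict(beta, X):
    return np.column_stack([np.ones(len(X)), X]) @ beta

def sep_and_r2(y_true, y_pred):
    resid = y_true - y_pred
    sep = np.sqrt(np.sum((resid - resid.mean()) ** 2) / (len(resid) - 1))
    r2 = 1.0 - np.sum(resid ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    return sep, r2

# Synthetic stand-ins for milk composition and light backscatter parameters.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))
y = 55 + X @ np.array([1.2, -0.8, 2.0, 0.5]) + rng.normal(scale=1.0, size=60)
beta = fit_ols(X[:40], y[:40])
print(sep_and_r2(y[40:], predict(beta, X[40:])))
```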
Abstract:
Measuring the retention, or residence time, of dosage forms on biological tissue is commonly a qualitative exercise, in which no numerical values describing retention are recorded. The result is an assessment that depends on the user's interpretation of visual observations. This research paper outlines the development of a methodology to quantitatively measure, both by image analysis and by spectrophotometric techniques, the retention of material on biological tissues, using the retention of polymer solutions on ocular tissue as an example. Both methods have been shown to be repeatable, with the spectrophotometric measurement generating data reliably and quickly for further analysis.
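As a purely hypothetical illustration of spectrophotometric quantification of retention (not the method developed in the paper), the sketch below assumes the applied polymer carries a dye, converts rinse absorbance to concentration via a Beer-Lambert calibration, and reports the percentage of applied material remaining on the tissue after each rinse. All function names and units are assumptions.

```python
# Hypothetical spectrophotometric retention calculation: calibration curve plus
# cumulative mass balance over successive rinses.
import numpy as np

def calibration_slope(concentrations, absorbances):
    """Beer-Lambert calibration: absorbance ~ slope * concentration (zero intercept)."""
    c = np.asarray(concentrations, float)
    a = np.asarray(absorbances, float)
    return (c @ a) / (c @ c)

def percent_retained(applied_mass, rinse_absorbances, rinse_volume, slope):
    """Cumulative percentage of applied material still on the tissue after each rinse."""
    conc = np.asarray(rinse_absorbances, float) / slope   # concentration recovered in rinse
    washed_off = np.cumsum(conc * rinse_volume)           # mass removed so far
    return 100.0 * (applied_mass - washed_off) / applied_mass
```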
Abstract:
We investigated the on-line processing of unaccusative and unergative sentences in a group of eight Greek-speaking individuals diagnosed with Broca aphasia and a group of language-unimpaired subjects used as the baseline. The processing of unaccusativity refers to the reactivation of the postverbal trace by retrieving the mnemonic representation of the verb’s syntactically defined antecedent provided in the early part of the sentence. Our results demonstrate that the Broca group showed selective reactivation of the antecedent for the unaccusatives. We consider several interpretations for our data, including explanations focusing on the transitivization properties of nonactive and active voice-alternating unaccusatives, the costly procedure claimed to underlie the parsing of active nonvoice-alternating unaccusatives, and the animacy of the antecedent modulating the syntactic choices of the patients.
Abstract:
This paper introduces a new adaptive nonlinear equalizer relying on a radial basis function (RBF) model, designed according to the minimum bit error rate (MBER) criterion, in the setting of an intersymbol interference channel corrupted by co-channel interference. The proposed algorithm is referred to as the on-line mixture of Gaussians estimator aided MBER (OMG-MBER) equalizer. Specifically, a mixture-of-Gaussians probability density function (PDF) estimator is used to model the PDF of the decision variable, and a novel on-line PDF update algorithm is derived to track the incoming data. With the aid of this on-line, sample-by-sample updated PDF estimator, the adaptive nonlinear equalizer is capable of updating its parameters sample by sample so as to directly minimize the RBF equalizer's achievable bit error rate (BER). The proposed OMG-MBER equalizer significantly outperforms the existing on-line nonlinear MBER equalizer, known as the least bit error rate equalizer, in terms of both convergence speed and achievable BER, as confirmed in our simulation study.
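The sketch below shows one generic way to update a mixture-of-Gaussians PDF estimate of a scalar decision variable sample by sample, using forgetting-factor updates of weights, means and variances via responsibilities. It is an assumed update rule for illustration, not the paper's OMG algorithm; the class name, learning rate and initialization are hypothetical.

```python
# Sample-by-sample mixture-of-Gaussians PDF estimator for a scalar decision
# variable (illustrative online-EM-style updates).
import numpy as np

class OnlineGaussianMixture:
    def __init__(self, n_components=4, lr=0.05, init_spread=1.0):
        self.w = np.full(n_components, 1.0 / n_components)   # mixing weights
        self.mu = np.linspace(-init_spread, init_spread, n_components)
        self.var = np.full(n_components, init_spread)
        self.lr = lr                                          # forgetting-factor / step size

    def _component_pdf(self, y):
        return np.exp(-0.5 * (y - self.mu) ** 2 / self.var) / np.sqrt(2 * np.pi * self.var)

    def update(self, y):
        """Incorporate one new decision-variable sample y."""
        resp = self.w * self._component_pdf(y)
        resp /= resp.sum() + 1e-12                            # responsibilities
        self.w += self.lr * (resp - self.w)
        self.w /= self.w.sum()
        self.mu += self.lr * resp * (y - self.mu)
        self.var += self.lr * resp * ((y - self.mu) ** 2 - self.var)
        self.var = np.maximum(self.var, 1e-6)                 # keep variances positive

    def pdf(self, y):
        """Current estimate of the decision-variable PDF at y."""
        return float(self.w @ self._component_pdf(y))
```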
Abstract:
The present study compared production and on-line comprehension of definite articles and third-person direct object clitic pronouns in Greek-speaking typically developing sequential bilingual (L2-TD) children and monolingual children with specific language impairment (L1-SLI). Twenty Turkish-Greek L2-TD children, 16 Greek L1-SLI children, and 31 L1-TD Greek children participated in a production task examining definite articles and clitic pronouns, and in an on-line comprehension task involving grammatical sentences with definite articles and clitics as well as sentences with grammatical violations induced by omitted articles and clitics. The results showed that the L2-TD children were sensitive to the grammatical violations despite low production accuracy. In contrast, the children with SLI were not sensitive to clitic omission in the on-line task, despite high production accuracy. These results support a dissociation between production and on-line comprehension in L2 children and point to impaired grammatical representations and a lack of automaticity in children with SLI. They also suggest that on-line comprehension tasks may complement production tasks by differentiating between the language profiles of L2-TD children and children with SLI.
Resumo:
We report on the first real-time ionospheric predictions network and its capabilities to ingest a global database and forecast F-layer characteristics and "in situ" electron densities along the track of an orbiting spacecraft. A global network of ionosonde stations reported around-the-clock observations of F-region heights and densities, and an on-line library of models provided forecasting capabilities. Each model was tested against the incoming data; relative accuracies were intercompared to determine the best overall fit to the prevailing conditions; and the best-fit model was used to predict ionospheric conditions on an orbit-to-orbit basis for the 12-hour period following a twice-daily model test and validation procedure. It was found that the best-fit model often provided averaged (i.e., climatologically based) accuracies better than 5% in predicting the heights and critical frequencies of the F-region peaks in the latitudinal domain of the TSS-1R flight path. There was a sharp contrast, however, in model-measurement comparisons involving predictions of actual, unaveraged, along-track densities at the 295 km orbital altitude of TSS-1R. In this case, extrema in the first-principles models varied by as much as an order of magnitude in density predictions, and the best-fit models were found to disagree with the "in situ" observations of Ne by as much as 140%. The discrepancies are interpreted as a manifestation of the difficulty of accurately and self-consistently modeling the external controls of solar and magnetospheric inputs and the spatial and temporal variabilities in electric fields, thermospheric winds, plasmaspheric fluxes, and chemistry.
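The model-selection step described above, scoring each model in a library against the latest observations and forecasting with the best fit, can be sketched as follows. The interfaces (a dict of model callables, foF2 observations, an RMS relative-error score) are assumptions for illustration, not the network's actual software.

```python
# Illustrative best-fit model selection against incoming ionosonde observations.
import numpy as np

def rms_relative_error(predicted, observed):
    predicted = np.asarray(predicted, float)
    observed = np.asarray(observed, float)
    return np.sqrt(np.mean(((predicted - observed) / observed) ** 2))

def select_best_model(models, stations, observed_foF2):
    """models: dict name -> callable(station) returning predicted foF2 (assumed API).
    Returns (best_name, score) for the model with the lowest RMS relative error."""
    scores = {
        name: rms_relative_error([model(s) for s in stations], observed_foF2)
        for name, model in models.items()
    }
    best = min(scores, key=scores.get)
    return best, scores[best]
```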
On-line Gaussian mixture density estimator for adaptive minimum bit-error-rate beamforming receivers
Abstract:
We develop an on-line Gaussian mixture density estimator (OGMDE) in the complex-valued domain to facilitate an adaptive minimum bit-error-rate (MBER) beamforming receiver for multiple-antenna space-division multiple access systems. Specifically, the novel OGMDE adaptively models the probability density function of the beamformer's output by tracking the incoming data sample by sample. With the aid of the proposed OGMDE, our adaptive beamformer is capable of updating the beamformer's weights sample by sample to directly minimize the achievable bit error rate (BER). We show that this OGMDE-based MBER beamformer outperforms the existing on-line MBER beamformer, known as the least BER beamformer, in terms of both convergence speed and achievable BER.
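For readers unfamiliar with sample-by-sample MBER-style adaptation, the sketch below shows an assumed update of this general family for a BPSK beamformer: a single Gaussian kernel stands in for the output PDF estimate, and the weights follow a stochastic gradient that pushes the signed output away from the decision boundary. This is not the paper's OGMDE algorithm; the update form, symbols (mu, rho, w, x) and initialization are illustrative.

```python
# Illustrative sample-by-sample MBER-style beamformer adaptation for BPSK.
import numpy as np

def mber_style_update(w, x, b, mu=0.01, rho=0.5):
    """One adaptation step.
    w   : complex weight vector (n_antennas,)
    x   : complex received snapshot (n_antennas,)
    b   : transmitted BPSK symbol (+1/-1) of the desired user (training)
    mu  : step size, rho : kernel width (assumed tuning parameters)."""
    y = np.vdot(w, x)                     # beamformer output w^H x
    ys = b * y.real                       # signed decision variable
    kernel = np.exp(-ys ** 2 / (2 * rho ** 2)) / (np.sqrt(2 * np.pi) * rho)
    # Stochastic-gradient step of a kernel-estimated BER (illustrative form).
    return w + mu * kernel * b * x

def train(X, B, n_antennas, mu=0.01, rho=0.5):
    """Adapt over a training block of snapshots X (k x n) and symbols B (k,)."""
    w = np.zeros(n_antennas, dtype=complex)
    w[0] = 1.0 + 0j                       # simple initialization
    for x, b in zip(X, B):
        w = mber_style_update(w, x, b, mu, rho)
    return w
```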
Abstract:
A new online method to analyse the water isotopes of speleothem fluid inclusions using a wavelength-scanned cavity ring-down spectroscopy (WS-CRDS) instrument is presented. This novel technique allows hydrogen and oxygen isotopes to be measured simultaneously for a released aliquot of water. To do so, we designed a new, simple line that allows on-line water extraction and isotope analysis of speleothem samples. The specificity of the method lies in the fact that the fluid inclusions are released onto a standard-water background, which mainly improves the δD robustness. To saturate the line, a peristaltic pump continuously injects standard water into the line, which is permanently heated to 140 °C and flushed with dry nitrogen gas. This permits instantaneous and complete vaporisation of the standard water, resulting in an artificial water background with well-known δD and δ18O values. The speleothem sample is placed in a copper tube attached to the line and, after system stabilisation, is crushed using a simple hydraulic device to liberate the water from the speleothem fluid inclusions. The released water is carried by the nitrogen/standard-water gas stream directly to a Picarro L1102-i for isotope determination. To test the accuracy and reproducibility of the line and to measure standard water during speleothem measurements, a syringe injection unit was added to the line. Peak evaluation is carried out much as in gas chromatography to obtain the δD and δ18O isotopic compositions of the measured water aliquots. Precision is better than 1.5‰ for δD and 0.4‰ for δ18O over an extended range (−210 to 0‰ for δD and −27 to 0‰ for δ18O), depending primarily on the amount of water released from the speleothem fluid inclusions and secondarily on the isotopic composition of the sample. The results show that WS-CRDS technology is suitable for speleothem fluid inclusion measurements and gives results comparable to the isotope ratio mass spectrometry (IRMS) technique.
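The peak-evaluation step above can be illustrated with an assumed data model: the analyser reports a water mixing ratio and delta values as time series, and the sample's delta is recovered by an amount-weighted mass balance of the peak above the constant standard-water background, analogous to peak integration in gas chromatography. The function name, variable names and baseline estimate below are assumptions, not the published procedure.

```python
# Illustrative mass-balance peak evaluation over a standard-water background.
import numpy as np

def sample_delta(h2o_ppm, delta_meas, delta_background, peak_mask):
    """h2o_ppm       : measured water mixing ratio time series
    delta_meas       : measured delta (dD or d18O) time series
    delta_background : known delta of the standard-water background
    peak_mask        : boolean mask selecting the crushing peak
    Returns the mass-balance estimate of the fluid-inclusion water delta."""
    h2o_all = np.asarray(h2o_ppm, float)
    h2o = h2o_all[peak_mask]
    d = np.asarray(delta_meas, float)[peak_mask]
    background = np.median(h2o_all[~peak_mask])          # baseline water level
    excess = np.clip(h2o - background, 0.0, None)        # sample contribution
    if excess.sum() == 0:
        raise ValueError("no water released above background")
    # Measured delta is a two-component mixture:
    # d = (background * d_bg + excess * d_sample) / h2o
    d_sample = (d * h2o - delta_background * background) / excess
    return float(np.average(d_sample, weights=excess))
```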
Abstract:
The present article examines production and on-line processing of definite articles in Turkish-speaking sequential bilingual children acquiring English and Dutch as second languages (L2) in the UK and in the Netherlands, respectively. Thirty-nine 6–8-year-old L2 children and 48 monolingual (L1) age-matched children participated in two separate studies examining the production of definite articles in English and Dutch in conditions manipulating semantic context, that is, the anaphoric and the bridging contexts. Sensitivity to article omission was examined in the same groups of children using an on-line processing task involving article use in the same semantic contexts as in the production task. The results indicate that both L2 children and L1 controls are less accurate when definiteness is established by keeping track of the discourse referents (anaphoric) than when it is established via world knowledge (bridging). Moreover, despite variable production, all groups of children were sensitive to the omission of definite articles in the on-line comprehension task. This suggests that the errors of omission are not due to the lack of abstract syntactic representations, but could result from processes implicated in the spell-out of definite articles. The findings are in line with the idea that variable production in child L2 learners does not necessarily indicate lack of abstract representations (Haznedar and Schwartz, 1997).
Abstract:
Obesity prevalence is increasing. The management of this condition requires a detailed analysis of global risk factors in order to develop personalised advice. This study aimed to identify current dietary patterns and habits in a Spanish population interested in personalised nutrition and to investigate associations with weight status. Self-reported dietary and anthropometric data from the Spanish participants in the Food4Me study were used in a multidimensional exploratory analysis to define specific dietary profiles. Two opposing factors were obtained according to food-group intake: Factor 1, characterised by more frequent consumption of foods traditionally considered unhealthy, and Factor 2, in which consumption of "Mediterranean diet" foods was prevalent. Factor 1 showed a direct relationship with BMI (β = 0.226; r² = 0.259; p < 0.001), while the association with Factor 2 was inverse (β = −0.037; r² = 0.230; p = 0.348). Four categories were defined (Prudent, Healthy, Western, and Compensatory) by classifying the sample into higher or lower adherence to each factor and combining the possibilities. The Western and Compensatory dietary patterns, characterised by consumption of energy-dense foods, showed positive associations with overweight prevalence. Further analysis showed that prevention of overweight must focus on limiting the intake of known deleterious foods rather than exclusively promoting healthy products.
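The analysis pipeline described above, reducing food-group intake to two dietary-pattern factors and relating factor scores to BMI, can be sketched generically as follows. PCA is used here as a stand-in for the multidimensional exploratory analysis, and the data and variable names are hypothetical, not the Food4Me dataset or the authors' code.

```python
# Illustrative dietary-pattern analysis: factor extraction from food-group
# intake and a simple regression of BMI on each factor score.
import numpy as np

def factor_scores(intake, n_factors=2):
    """Column-standardize the food-group intake matrix and project it onto its
    first principal components, used here as dietary-pattern factors."""
    Z = (intake - intake.mean(axis=0)) / intake.std(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:n_factors].T

def regress_bmi_on_factor(bmi, factor):
    """Simple OLS of BMI on one factor score; returns (beta, r_squared)."""
    X = np.column_stack([np.ones(len(factor)), factor])
    beta, *_ = np.linalg.lstsq(X, bmi, rcond=None)
    pred = X @ beta
    ss_res = np.sum((bmi - pred) ** 2)
    ss_tot = np.sum((bmi - bmi.mean()) ** 2)
    return beta[1], 1.0 - ss_res / ss_tot
```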