817 results for Error of measurement
Abstract:
This paper investigates the identification and output tracking control of a class of Hammerstein systems over a wireless network within an integrated framework, where the statistical characteristics of the wireless network are modelled using the inverse Gaussian cumulative distribution function. In the proposed framework, a new networked identification algorithm is proposed to compensate for the influence of wireless network delays and thereby obtain a more accurate Hammerstein system model. The identified model, together with a model-based approach, is then used to design an output tracking controller. Mean square stability conditions are given in terms of linear matrix inequalities (LMIs), and the optimal controller gains can be obtained by solving the corresponding LMI optimization problem. Illustrative numerical simulation examples demonstrate the effectiveness of the proposed method.
Abstract:
This paper is concerned with the analysis of the stability of delayed recurrent neural networks. In contrast to the widely used Lyapunov–Krasovskii functional approach, a new method is developed within the integral quadratic constraints framework. To achieve this, several lemmas are first given to propose integral quadratic separators to characterize the original delayed neural network. With these, the network is then reformulated as a special form of feedback-interconnected system by choosing proper integral quadratic constraints. Finally, new stability criteria are established based on the proposed approach. Numerical examples are given to illustrate the effectiveness of the new approach.
Abstract:
This paper investigates camera control for capturing bottle cap target images in the fault-detection system of an industrial production line. The main purpose is to identify the targeted bottle caps accurately in real time from the images. This is achieved by combining iterative learning control and Kalman filtering to reduce the effect of various disturbances introduced into the detection system. A mathematical model, together with a physical simulation platform, is established based on the actual production requirements, and the convergence properties of the model are analyzed. It is shown that the proposed method enables accurate real-time control of the camera and, further, the gain range of the learning rule is also obtained. The numerical simulation and experimental results confirm that the proposed method can reduce the effect not only of repeatable disturbances but also of non-repeatable ones.
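The abstract does not give the learning rule itself; as a minimal sketch of the iterative learning control idea it builds on, here is a P-type update on a hypothetical static scalar plant (the plant model, gains, and contraction condition |1 − γg| < 1 below are illustrative assumptions, not the paper's combined ILC/Kalman scheme):

```python
def ilc_p_type(target, learn_gain, plant_gain, trials):
    """P-type iterative learning control on a static scalar plant y = plant_gain * u.

    Each trial applies u, measures the tracking error e = target - y, and
    updates the input: u_{k+1} = u_k + learn_gain * e_k. The error contracts
    by a factor |1 - learn_gain * plant_gain| per trial.
    """
    u = [0.0] * len(target)
    for _ in range(trials):
        y = [plant_gain * ui for ui in u]            # run one trial
        e = [t - yi for t, yi in zip(target, y)]     # tracking error
        u = [ui + learn_gain * ei for ui, ei in zip(u, e)]
    return u
```

With learn_gain = 0.5 and plant_gain = 1.5, the per-trial contraction factor is 0.25, so the tracking error vanishes geometrically over repeated trials.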
Abstract:
Highway structures such as bridges are subject to continuous degradation, primarily due to ageing and environmental factors. A rational transport policy requires the monitoring of this transport infrastructure to provide adequate maintenance and guarantee the required levels of transport service and safety. In Europe, this is now a legal requirement: a European Directive requires all member states of the European Union to implement a Bridge Management System. However, the process is expensive, requiring the installation of sensing equipment and data acquisition electronics on the bridge. This paper investigates the use of an instrumented vehicle fitted with accelerometers on its axles to monitor the dynamic behaviour of bridges as an indicator of their structural condition. This approach eliminates the need for any on-site installation of measurement equipment. A simplified half-car vehicle-bridge interaction model is used in theoretical simulations to test the possibility of extracting the dynamic parameters of the bridge from the spectra of the vehicle accelerations. The effects of vehicle speed, vehicle mass and bridge span length on the detection of the bridge dynamic parameters are investigated. The algorithm is highly sensitive to the condition of the road profile, so simulations are carried out for both smooth and rough profiles.
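Extracting bridge parameters from the spectra of vehicle accelerations amounts to peak-picking in the frequency domain. A minimal sketch using a naive DFT on a synthetic single-tone signal (the sampling rate, candidate grid, and test tone are assumptions for illustration, not the paper's algorithm):

```python
import cmath
import math

def dominant_frequency(samples, fs, candidates):
    """Return the candidate frequency (Hz) with the largest DFT magnitude.

    Evaluates a naive DFT of `samples` (sampled at `fs` Hz) at each candidate
    frequency and picks the peak -- the simplest form of spectral peak-picking.
    """
    best_f, best_mag = None, -1.0
    for f in candidates:
        z = sum(x * cmath.exp(-2j * math.pi * f * n / fs)
                for n, x in enumerate(samples))
        if abs(z) > best_mag:
            best_f, best_mag = f, abs(z)
    return best_f
```

In practice an FFT would replace the naive DFT, but the peak-picking logic is the same.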
Abstract:
Immunohistochemistry (IHC) is a widely available and highly utilised tool in diagnostic histopathology and is used to guide treatment options as well as provide prognostic information. IHC is subject to qualitative and subjective assessment, which has been criticised for a lack of stringency, whereas PCR-based molecular diagnostic validations are, by comparison, regarded as very rigorous. It is essential that IHC tests are validated through evidence-based procedures. With the move to ISO 15189 (2012), not just the accuracy, specificity and reproducibility of each test need to be determined and managed, but also the degree of uncertainty and the delivery of such tests. The recent update to ISO 15189 (2012) states that it is appropriate to consider the potential uncertainty of measurement of the value obtained in the laboratory and how that may impact on prognostic or predictive thresholds. In order to highlight the problems surrounding IHC validity, we reviewed the measurement of Ki67 and p53 in the literature. Both of these biomarkers have been incorporated into clinical care by pathology laboratories worldwide. The variation seen appears excessive, even when measuring centrally stained slides from the same cases. We therefore propose in this paper to establish the basis on which IHC laboratories can achieve the same level of robust validation seen in molecular pathology laboratories, with the principles applied to all routine IHC tests.
Abstract:
The contemporary literature investigating the construct broadly known as time perspective is replete with methodological and conceptual concerns. These concerns focus on the reliability and factorial validity of measurement tools, and the sample-specific modification of scales. These issues continue to hamper the development of this potentially useful psychological construct. An emerging body of evidence has supported the six-factor structure of scores on the Adolescent Time Inventory-Time Attitudes Scale, as well as their reliability. The present study utilized data from the first wave of a longitudinal study in the United Kingdom to examine the reliability, validity, and cross-cultural invariance of the scale. Results showed that the hypothesized six-factor model provided the best fit for the data; all alpha and omega estimates were > .70; scores on ATI-TA factors related meaningfully to self-efficacy scores; and the factor structure was invariant across both research sites. Results are discussed in the context of the extant temporal literature.
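The alpha estimates mentioned above can be computed in a few lines. A sketch of Cronbach's alpha from an items-by-respondents score matrix (illustrative only; the study's data and its omega estimates are not reproduced here):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item score lists, one list per item,
    aligned across respondents. Population variances are used throughout."""
    k = len(items)                 # number of items
    n = len(items[0])              # number of respondents

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # total score per respondent across all items
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

When all items are perfectly correlated (e.g. identical columns of scores), alpha equals 1; values above .70 are the conventional reliability threshold cited in the abstract.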
Abstract:
The fractional calculus of variations and fractional optimal control are generalizations of the corresponding classical theories, that allow problem modeling and formulations with arbitrary order derivatives and integrals. Because of the lack of analytic methods to solve such fractional problems, numerical techniques are developed. Here, we mainly investigate the approximation of fractional operators by means of series of integer-order derivatives and generalized finite differences. We give upper bounds for the error of proposed approximations and study their efficiency. Direct and indirect methods in solving fractional variational problems are studied in detail. Furthermore, optimality conditions are discussed for different types of unconstrained and constrained variational problems and for fractional optimal control problems. The introduced numerical methods are employed to solve some illustrative examples.
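As a concrete instance of approximating a fractional operator by generalized finite differences, here is a sketch of the classical Grünwald–Letnikov scheme (first-order accurate; the test function and derivative order below are illustrative choices, not taken from the paper):

```python
import math

def gl_fractional_derivative(f, alpha, t, n_steps):
    """Grünwald-Letnikov approximation of the order-alpha derivative of f at t.

    Uses the recurrence c_k = c_{k-1} * (k - 1 - alpha) / k for the
    coefficients (-1)^k * binom(alpha, k), then sums h^{-alpha} * sum_k c_k f(t - k h).
    """
    h = t / n_steps
    c = 1.0           # c_0
    total = f(t)      # k = 0 term
    for k in range(1, n_steps + 1):
        c *= (k - 1 - alpha) / k
        total += c * f(t - k * h)
    return total / h ** alpha

# Half-derivative of f(t) = t at t = 1; the exact value is 2 / sqrt(pi).
approx = gl_fractional_derivative(lambda t: t, 0.5, 1.0, 4000)
```

The scheme's error decays linearly in the step size h, which is one reason the paper studies higher-order approximations and error bounds.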
Abstract:
This thesis describes the design and implementation of a reliable centimeter-level indoor positioning system fully compatible with a conventional smartphone. The proposed system takes advantage of the smartphone audio I/O and processing capabilities to perform acoustic ranging in the audio band using non-invasive audio signals, and it has been developed with applications that require high accuracy in mind, such as augmented reality, virtual reality, gaming and audio guides. The system works in a distributed operation mode, i.e. each smartphone is able to obtain its own position using only acoustic signals. To support the positioning system, a Wireless Sensor Network (WSN) of synchronized acoustic beacons is used. To keep the infrastructure in sync we have developed an Automatic Time Synchronization and Syntonization (ATSS) protocol with a standard deviation of the sync offset error below 1.25 μs. Using an improved Time Difference of Arrival (TDoA) estimation approach (which takes advantage of the beacon signals’ periodicity) and by performing Non-Line-of-Sight (NLoS) mitigation, we were able to obtain very stable and accurate position estimates with an absolute mean error of less than 10 cm in 95% of the cases and a mean standard deviation of 2.2 cm for a position refresh period of 350 ms.
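Once TDoA range differences are in hand, position follows from nonlinear least squares. A minimal 2-D sketch using a hand-rolled Gauss-Newton solver (the beacon layout and solver are assumptions for illustration; the thesis's improved TDoA estimator and NLoS mitigation are not reproduced here):

```python
import math

def tdoa_position(beacons, tdoa_dists, guess, iters=50):
    """Estimate a 2-D position from TDoA range differences relative to beacon 0.

    Residuals r_i = (|p - b_i| - |p - b_0|) - tdoa_dists[i-1] are driven to
    zero by Gauss-Newton steps, solving the 2x2 normal equations directly.
    """
    x, y = guess
    for _ in range(iters):
        d = [math.hypot(x - bx, y - by) for bx, by in beacons]
        J, r = [], []
        for i in range(1, len(beacons)):
            r.append((d[i] - d[0]) - tdoa_dists[i - 1])
            J.append(((x - beacons[i][0]) / d[i] - (x - beacons[0][0]) / d[0],
                      (y - beacons[i][1]) / d[i] - (y - beacons[0][1]) / d[0]))
        # normal equations: (J^T J) delta = -J^T r
        a = sum(jx * jx for jx, _ in J)
        b = sum(jx * jy for jx, jy in J)
        cc = sum(jy * jy for _, jy in J)
        gx = sum(jx * ri for (jx, _), ri in zip(J, r))
        gy = sum(jy * ri for (_, jy), ri in zip(J, r))
        det = a * cc - b * b
        if abs(det) < 1e-12:
            break
        x -= (cc * gx - b * gy) / det
        y -= (a * gy - b * gx) / det
    return x, y
```

With noiseless range differences and a reasonable beacon geometry, the solver recovers the true position; real deployments must additionally handle noise and NLoS outliers, as the thesis does.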
Abstract:
Phenylketonuria is an inborn error of metabolism involving, in most cases, deficient activity of phenylalanine hydroxylase. Neonatal diagnosis and a prompt special diet (low-phenylalanine and natural-protein-restricted diets) are essential to the treatment. The lack of data concerning the phenylalanine content of processed foodstuffs is an additional limitation for an already very restrictive diet. Our goals were to quantify protein (Kjeldahl method) and the content of 18 amino acids (HPLC/fluorescence) in 16 dishes specifically conceived for phenylketonuric patients, and to compare the most relevant results with those of several international food composition databases. As might be expected, all the meals contained low protein levels (0.67–3.15 g/100 g), with the highest occurring in boiled rice and potatoes. These foods also contained the highest amounts of phenylalanine (158.51 and 62.65 mg/100 g, respectively). In contrast to the other amino acids, it was possible to predict phenylalanine content based on protein alone. Slight deviations were observed when comparing results with the different food composition databases.
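The reported ability to predict phenylalanine from protein alone suggests a simple proportional model. A sketch of fitting a least-squares slope through the origin (the data in the test are made up for illustration; the paper's actual regression coefficients are not given in the abstract):

```python
def fit_phe_from_protein(protein_g, phe_mg):
    """Least-squares slope through the origin predicting phenylalanine (mg/100 g)
    from protein (g/100 g): minimizes sum (phe - slope * protein)^2."""
    num = sum(p * f for p, f in zip(protein_g, phe_mg))
    den = sum(p * p for p in protein_g)
    return num / den
```

The fitted slope can then be multiplied by the protein content of a new dish to estimate its phenylalanine content.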
Abstract:
INTRODUCTION AND OBJECTIVES: Recurrent syncope has a significant impact on quality of life. The development of measurement scales to assess this impact that are easy to use in clinical settings is crucial. The objective of the present study is a preliminary validation of the Impact of Syncope on Quality of Life questionnaire for the Portuguese population. METHODS: The instrument underwent a process of translation, validation, analysis of cultural appropriateness and cognitive debriefing. A population of 39 patients with a history of recurrent syncope (>1 year) who underwent tilt testing, aged 52.1 ± 16.4 years (21-83), 43.5% male, most in active employment (n=18) or retired (n=13), constituted a convenience sample. The resulting Portuguese version is similar to the original, with 12 items in a single aggregate score, and underwent statistical validation, with assessment of reliability, validity and stability over time. RESULTS: With regard to reliability, the internal consistency of the scale is 0.9. Assessment of convergent and discriminant validity showed statistically significant results (p<0.01). Regarding stability over time, a test-retest of this instrument at six months after tilt testing with 22 patients of the sample who had not undergone any clinical intervention found no statistically significant changes in quality of life. CONCLUSIONS: The results indicate that this instrument is of value for assessing quality of life in patients with recurrent syncope in Portugal.
Abstract:
17β-hydroxysteroid dehydrogenase 10 (HSD10) deficiency is a rare X-linked inborn error of isoleucine catabolism. Although this protein has been genetically implicated in Alzheimer's disease pathogenesis, studies of amyloid-β peptide (Aβ) in patients with HSD10 deficiency have not been previously reported. We found, in a severely affected child with HSD10 deficiency, undetectable levels of Aβ in the cerebrospinal fluid, together with low expression of brain-derived neurotrophic factor, α-synuclein, and serotonin metabolites. Confirmation of these findings in other patients would help elucidate the mechanisms of synaptic dysfunction in this disease and highlight the role of Aβ in both early and late periods of life.
Abstract:
Patient adherence to medications has been an issue challenging healthcare professionals for decades. Adherence rates, causes of non-adherence, barriers and enablers to medication taking, interventions to promote adherence, and the impact of non-adherence on health outcomes, have been extensively studied. In light of this, the area of adherence research has progressed conceptually and practically. This special issue contains a range of articles which focus on different aspects of adherence, from standardising terminology and methods of measurement, to non-adherence in a broad range of patient populations, and to interventions to promote adherence.
Abstract:
Cerebral metabolism is compartmentalized between neurons and glia. Although glial glycolysis is thought to largely sustain the energetic requirements of neurotransmission while oxidative metabolism takes place mainly in neurons, this hypothesis is a matter of debate. The compartmentalization of cerebral metabolic fluxes can be determined by (13)C nuclear magnetic resonance (NMR) spectroscopy upon infusion of (13)C-enriched compounds, especially glucose. Rats under light α-chloralose anesthesia were infused with [1,6-(13)C]glucose and (13)C enrichment in the brain metabolites was measured by (13)C NMR spectroscopy with high sensitivity and spectral resolution at 14.1 T. This allowed us to determine (13)C enrichment curves of amino acid carbons with high reproducibility and to reliably estimate cerebral metabolic fluxes (mean error of 8%). We further found that TCA cycle intermediates are not required for flux determination in mathematical models of brain metabolism. Neuronal tricarboxylic acid cycle rate (V(TCA)) and neurotransmission rate (V(NT)) were 0.45 ± 0.01 and 0.11 ± 0.01 μmol/g/min, respectively. Glial V(TCA) was found to be 38 ± 3% of total cerebral oxidative metabolism, accounting for more than half of neuronal oxidative metabolism. Furthermore, the glial anaplerotic pyruvate carboxylation rate (V(PC)) was 0.069 ± 0.004 μmol/g/min, i.e., 25 ± 1% of the glial TCA cycle rate. These results support a role of glial cells as active partners of neurons during synaptic transmission beyond glycolytic metabolism.
Abstract:
Pyrethroids and pyrethrins are neurotoxic insecticides that have also been associated with adverse effects on the immune, hormonal and reproductive systems. They are widely used in agriculture, but also in horticulture, pest extermination and the treatment of human and animal parasitic infestations (scabies, lice). There is therefore an environmental-health interest in knowing the extent of human exposure to these pesticides. Measuring the urinary metabolites of pyrethroids and pyrethrins appears to be an approach of choice for this purpose since, in theory, it provides an overall picture of exposure. Traditionally, however, and for the sake of simplicity, the volume-based (or creatinine-adjusted) concentrations of these biomarkers in spot urine samples have been determined, and the effect of using these units on the validity of estimated absorbed daily doses had never been verified. The general objective of this thesis was therefore to develop, apply and validate a new biomarker measurement and analysis strategy to improve the precision and reliability of individual and population assessments of exposure to pyrethroids and pyrethrins. The specific objectives were: i) to characterize human exposure to these contaminants in urban and rural regions of Quebec, and ii) to compare the validity of the proposed urinary biomarker measurement and analysis strategy with the more usual methods used to estimate individual and population exposure and absorbed doses of pyrethroids. Adults and children recruited from the populations of the Island of Montreal and the Montérégie collected their urine over a period of at least twelve hours and completed a questionnaire documenting potential sources of exposure.
The quantities of pyrethroid and pyrethrin metabolites (pmol) measured in the urine were adjusted to a period of exactly twelve hours and for body weight. The quantities excreted in the urban region were compared with those excreted in the rural region, and the individual and population data were compared with those that would have been obtained if volume-based or creatinine-adjusted concentrations had been measured. The results show that exposure to these pesticides is ubiquitous, since more than 90% of participants excreted the main pyrethroid and pyrethrin metabolites at levels above the analytical detection threshold. The results suggest that diet could be the origin of this baseline level, since the other known sources of exposure were only rarely reported. In Quebec, exposure in rural areas appeared slightly higher than in urban areas, and certain exposure factors, such as the use of domestic pesticides, were reported more frequently in rural areas. Finally, it was demonstrated that measuring volume-based or creatinine-adjusted biomarker concentrations is an approach that can lead to substantial biases (up to 500% error and more), particularly when assessing individual exposure. Public health and environmental health authorities using urinary biomarkers to assess exposure to pyrethroids and pyrethrins (as well as to other molecules with similar elimination half-lives) should direct their efforts toward measuring the quantities excreted over a period of at least twelve hours to obtain the most valid possible picture of exposure.
It would also be desirable to better document the toxicokinetics of these molecules in humans in order to establish with greater confidence the relationship between excreted biomarker quantities and absorbed doses of these contaminants.
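The adjustment described (scaling measured metabolite amounts to exactly twelve hours and to body weight) is straightforward arithmetic. A sketch, assuming simple linear pro-rating over the collection period (the exact adjustment procedure used in the thesis is not detailed in the abstract):

```python
def excretion_12h_per_kg(amount_pmol, collection_hours, body_weight_kg):
    """Scale a measured urinary metabolite amount to a 12 h period,
    per kg of body weight, by linear pro-rating of the collection window."""
    return amount_pmol * (12.0 / collection_hours) / body_weight_kg
```

For example, 3000 pmol collected over 15 h by a 60 kg participant corresponds to 40 pmol per 12 h per kg.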
Abstract:
The instrument most frequently used to measure elbow range of motion is the universal goniometer, yet there is no consensus on it: several authors question its reliability and validity. This study therefore details, in three stages, a much more precise and accurate alternative: a radiographic measurement method. A modelling study first identified the potential sources of error of this radiographic method, never before used for the elbow. The method was then used to assess the validity of the goniometer. To this end, 51 volunteers participated in a clinical study in which the two methods were compared. Finally, the radiographic measurement shed light on the influence that various demographic factors can have on elbow range of motion. The radiographic method proved robust, and some easily avoidable sources of error were identified. In the clinical study, the measurement error attributable to the goniometer was ±10.3° when measuring the elbow in extension and ±7.0° in flexion. The study also revealed an association between range of motion and various factors, the most important of which are age, sex, BMI and the circumference of the arm and forearm. In conclusion, the goniometer's error can be tolerated in the clinic, but its use is not recommended in research, where a measurement error on the order of 10° is unacceptable. The radiographic method, being more precise and accurate, is thus a far better alternative.