958 results for Quantitative EEG analysis
Abstract:
The purpose of this study was threefold. First, the study was designed to illustrate the use of data and information collected in food safety surveys in a quantitative risk assessment; the focus here was on the food service industry, but similar data from other parts of the food chain could be incorporated in the same way. The second objective was to quantitatively describe, and better understand, the role that the food service industry plays in the safety of food. The third objective was to illustrate the additional decision-making information that becomes available when uncertainty and variability are incorporated into the modelling of such systems.
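The separation of uncertainty and variability mentioned as the third objective is typically realised with a two-dimensional (second-order) Monte Carlo simulation. The Python sketch below is purely illustrative — the Beta prior, the 10% cooking-failure rate and the sample sizes are assumptions, not values from the study — but it shows the structure: an outer loop over uncertain parameters, an inner loop over serving-to-serving variability.

```python
import random

def simulate_risk(n_uncertainty=200, n_variability=500, seed=1):
    """Two-dimensional Monte Carlo: the outer loop samples uncertain
    parameters, the inner loop samples variability across servings."""
    rng = random.Random(seed)
    mean_risks = []
    for _ in range(n_uncertainty):
        # Uncertain parameter: true contamination prevalence (illustrative Beta prior)
        prevalence = rng.betavariate(2, 18)
        hits = 0
        for _ in range(n_variability):
            # Variability: is this serving contaminated, and does the hazard survive cooking?
            contaminated = rng.random() < prevalence
            survives = rng.random() < 0.10  # assumed cooking-failure rate
            if contaminated and survives:
                hits += 1
        mean_risks.append(hits / n_variability)
    mean_risks.sort()
    # Median and 95th percentile of the uncertainty distribution of mean risk
    return mean_risks[n_uncertainty // 2], mean_risks[int(0.95 * n_uncertainty)]

median, p95 = simulate_risk()
print(median, p95)
```

Reporting a percentile interval rather than a single point estimate is exactly the extra decision-making information the abstract refers to.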
Abstract:
Concerns about reduced productivity and land degradation in the Mitchell grasslands of central western Queensland were addressed through a range monitoring program designed to interpret condition and trend. Botanical and edaphic parameters were recorded along piosphere and grazing gradients, and across fenceline impact areas, to maximise the detection of changes resulting from grazing. The Degradation Gradient Method was used in conjunction with State and Transition Models to develop models of rangeland dynamics and condition. States were found to be ordered along a degradation gradient, indicator species were developed according to rainfall trends, and transitions were determined from field data and the available literature. Astrebla spp. abundance declined with declining range condition and increasing grazing pressure, while annual grasses and forbs increased in dominance under poor range condition. Soil erosion increased and litter decreased with decreasing range condition. An approach to quantitatively defining states within a variable rainfall environment, based upon a time-series ordination analysis, is described. The derived model could provide the interpretive framework necessary to integrate on-ground monitoring, remote sensing and geographic information systems to trace states and transitions at the paddock scale. However, further work is needed to determine the full catalogue of states and transitions and to refine the model for application at the paddock scale.
Abstract:
Enzymic catalysis proceeds via intermediates formed in the course of substrate conversion. Here, we directly detect key intermediates in thiamin diphosphate (ThDP)-dependent enzymes during catalysis using H-1 NMR spectroscopy. The quantitative analysis of the relative intermediate concentrations allows the determination of the microscopic rate constants of individual catalytic steps. As demonstrated for pyruvate decarboxylase (PDC), this method, in combination with site-directed mutagenesis, enables the assignment of individual side chains to single steps in catalysis. In PDC, two independent proton relay systems and the stereochemical control of the enzymic environment account for proficient catalysis proceeding via intermediates at carbon 2 of the enzyme-bound cofactor. The application of this method to other ThDP-dependent enzymes provides insight into their specific chemical pathways.
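For a linear pathway at steady state, the flux through every step equals the overall turnover, so each step's net rate constant can be recovered from the NMR-derived fractional occupancy of the preceding intermediate. A minimal Python sketch of this flux-balance relation follows; the kcat value and intermediate fractions are illustrative, not measured values from the study.

```python
def step_rate_constants(kcat, fractions):
    """Flux balance for a linear steady-state pathway: the net flux through
    each step, k_i * f_i, equals the overall turnover kcat, so the net
    forward rate constant of each step is k_i = kcat / f_i, where f_i is
    the fractional occupancy of the preceding intermediate (from NMR)."""
    return {name: kcat / f for name, f in fractions.items()}

# Illustrative occupancies only (hypothetical intermediate names)
ks = step_rate_constants(50.0, {"LThDP": 0.5, "HEThDP": 0.25, "carbanion": 0.25})
print(ks)
```

The most populated intermediate thus marks the slowest step, which is how relative NMR intensities translate into microscopic rate constants.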
Abstract:
Primary objective: To investigate the speed and accuracy of tongue movements exhibited during speech by a sample of children with dysarthria following severe traumatic brain injury (TBI), using electromagnetic articulography (EMA). Methods and procedures: Four children, aged between 12.75 and 17.17 years, with dysarthria following TBI were assessed using the AG-100 electromagnetic articulography system (Carstens Medizinelektronik). The movement trajectories of receiver coils affixed to each child's tongue were examined during consonant productions, together with a range of quantitative kinematic parameters. The children's results were individually compared against the mean values obtained by a group of eight control children (mean age 14.67 years, SD 1.60). Main outcomes and results: All four TBI children were perceived to exhibit reduced rates of speech and increased word durations. Objective EMA analysis revealed that two of the TBI children exhibited significantly longer consonant durations than the control group, resulting from different underlying mechanisms relating to speed generation capabilities and distances travelled. The other two TBI children did not exhibit increased initial consonant movement durations, suggesting that the vowels and/or final consonants may have contributed to the increased word durations. Conclusions and clinical implications: The finding of different underlying articulatory kinematic profiles has important implications for the treatment of speech rate disturbances in children with dysarthria following TBI.
Abstract:
Autism Spectrum Disorder (ASD) is characterised by a range of cognitive and neurobehavioural disturbances, and its worldwide prevalence is estimated at 1 child with ASD for every 160 typically developing (TD) children. Individuals with ASD have difficulty interpreting other people's emotions and expressing their own feelings. Emotions can be associated with the manifestation of physiological signals, among which brain signals have received considerable attention. Detecting the brain signals of children with ASD may help clarify their emotions and expressions. Much current research integrates robotics into the pedagogical treatment of ASD through interaction with children with the disorder, stimulating social skills such as imitation and communication. Assessing the mental states of children with ASD during interaction with a mobile robot is promising and constitutes an innovative approach. The objectives of this work were therefore to record brain signals from children with ASD and from TD children, as a control group, in order to study their emotional states and to assess their mental states during interaction with a mobile robot, and also to evaluate the children's interaction with the robot using quantitative scales. The technique chosen for recording brain signals was electroencephalography (EEG), which uses electrodes placed non-invasively and painlessly on the child's scalp. The methods for evaluating the effectiveness of robotics in this interaction were based on two international quantitative scales: Goal Attainment Scaling (GAS) and the System Usability Scale (SUS). The results showed that, using EEG, it was possible to classify the emotional states of TD children and children with ASD and to analyse brain activity at the start of the interaction with the robot through the alpha and beta rhythms.
The GAS and SUS assessments indicated that the mobile robot can be considered a potential therapeutic tool for children with ASD.
Abstract:
Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low-speed damageability) is one of the most important attributes. In order to fulfil increasing requirements within the framework of shorter cycle times and rising pressure to reduce costs, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process depends strongly on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness, which in reality can vary locally; however, a constant thickness value is almost always defined throughout the entire part to limit model complexity. On the other hand, correct thickness consideration is one key enabler for precise fracture analysis within FEM. Thus, the availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms, based on ray tracing and on nearest-neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, together with a thorough identification of the particular geometric arrangements under which their accuracy can be compared. These results expose each technique's weaknesses and point towards a new, integrated approach that linearly combines the estimates produced by the two algorithms.
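A minimal Python sketch of the nearest-neighbour idea, under simplifying assumptions: the CAD surfaces are represented as point clouds, the search is brute force, and all names are illustrative. A real implementation would sample the surfaces densely and use a spatial index (e.g. a k-d tree) for the 3D range searches.

```python
import math

def nearest_distance(p, cloud):
    """Brute-force nearest-neighbour distance from point p to a point cloud."""
    return min(math.dist(p, q) for q in cloud)

def estimate_thickness(midplane_nodes, top_surface, bottom_surface):
    """Per-node thickness estimate: distance from each midplane node to the
    nearest sample on the top surface plus the nearest on the bottom surface."""
    return [nearest_distance(p, top_surface) + nearest_distance(p, bottom_surface)
            for p in midplane_nodes]

# A flat 2 mm plate sampled coarsely: midplane at z = 0, surfaces at z = +/-1
mid = [(x, y, 0.0) for x in range(3) for y in range(3)]
top = [(x, y, 1.0) for x in range(3) for y in range(3)]
bot = [(x, y, -1.0) for x in range(3) for y in range(3)]
print(estimate_thickness(mid, top, bot))  # each entry is 2.0
```

The ray-tracing alternative would instead shoot a ray along each node's surface normal and measure the distance between the two surface intersections; the paper's point is that each approach fails in different geometric arrangements.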
Abstract:
Doctoral thesis in Biology presented to the Faculdade de Ciências da Universidade do Porto, 2015.
Abstract:
The ability to understand the actions of others and to imitate has been described as fundamental to human social cognition. This ability has recently been attributed to a neural system known as the Mirror Neuron System, which has been shown to be affected in mental disorders characterised by severe impairments of theory of mind and empathy, such as autism. In Down Syndrome, good social skills and intact praxic and imitation abilities coexist with difficulties in interpreting social situations and in recognising emotions, which leads us to question the activity of the Mirror Neuron System in this population. Oscillations of the mu rhythm (8-13 Hz) over the sensorimotor cortex during action observation are considered a reflection of mirror neuron activity, and it is established that in healthy people mu suppression occurs both when performing upper-limb movements and when observing them performed by others. In this study, suppression of mu rhythms was recorded electroencephalographically in 11 people with Down Syndrome (DS) and 20 people without DS under the following conditions: observation of a video of two moving balls, observation of a video of a repeated hand movement, and performance of hand movements. The baseline was recorded during observation of a static dot. We found that mu rhythm suppression occurs during the observation of others' actions in people with Down Syndrome in the same way as during the performance of their own movements, suggesting relative preservation of mirror neuron functioning and of the basic mechanisms of social cognition. These results are consistent with studies pointing to the integrity of imitation abilities in Down Syndrome.
We also found no significant differences in mu rhythm suppression between the Down Syndrome and Control groups across the conditions used in the investigation.
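Mu suppression is conventionally quantified as the log ratio of 8-13 Hz band power between a condition (observation or execution) and baseline, with negative values indicating suppression. The self-contained Python sketch below uses a naive DFT periodogram; the sampling rate and synthetic signals are illustrative assumptions, not the study's recordings.

```python
import math

def band_power(samples, fs, f_lo, f_hi):
    """Power in a frequency band via a naive DFT periodogram."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
            power += (re * re + im * im) / n
    return power

def mu_suppression(condition, baseline, fs=128):
    """Suppression index: log ratio of 8-13 Hz power, condition vs baseline.
    Negative values indicate mu suppression."""
    return math.log(band_power(condition, fs, 8, 13) /
                    band_power(baseline, fs, 8, 13))

fs = 128
t = [i / fs for i in range(fs)]
baseline = [math.sin(2 * math.pi * 10 * x) for x in t]         # strong 10 Hz rhythm
condition = [0.5 * math.sin(2 * math.pi * 10 * x) for x in t]  # attenuated during observation
print(mu_suppression(condition, baseline, fs))  # negative -> suppression
```

Halving the 10 Hz amplitude quarters the band power, so the index here is ln(0.25), i.e. about -1.39.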
Abstract:
This paper studies human DNA from the perspective of signal processing. Six wavelets are tested for analyzing the information content of the human DNA. By adopting the real Shannon wavelet, several fundamental properties of the code are revealed. A quantitative comparison of the chromosomes is developed, with visualization through multidimensional scaling and dendrograms.
Abstract:
This paper analyzes the DNA code of several species from the perspective of information content. To this end, several concepts and mathematical tools are selected to establish a quantitative method without a priori distortion of the alphabet represented by the sequence of DNA bases. The synergies of associating Gray code, histogram characterization and multidimensional scaling visualization lead to a collection of plots with a categorical representation of species and chromosomes.
Abstract:
In studies assessing the effect of a given exposure variable on a specific outcome of interest, confounding may create the mistaken impression that the exposure variable is producing the outcome of interest when, in fact, the observed effect is due to an existing confounder. However, quantitative techniques are rarely used to determine the potential influence of unmeasured confounders. Sensitivity analysis is a statistical technique that allows the impact of an unmeasured confounding variable on the association of interest to be quantitatively measured. The purpose of this study was to make two sensitivity analysis methods available in the literature, developed by Rosenbaum and by Greenland, feasible to apply using an electronic spreadsheet, so that researchers can more easily include this quantitative tool in the set of procedures commonly used in the result-validation stage.
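As an illustration of the Greenland-style external-adjustment approach (a sketch of the standard formula for a binary unmeasured confounder, not the authors' spreadsheet): the observed risk ratio is divided by a bias factor built from the assumed confounder prevalences among exposed and unexposed and the assumed confounder-disease risk ratio.

```python
def bias_factor(p1, p0, rr_cd):
    """Bias factor for a binary unmeasured confounder.
    p1, p0: assumed confounder prevalence among exposed / unexposed;
    rr_cd:  assumed confounder-disease risk ratio."""
    return (p1 * (rr_cd - 1) + 1) / (p0 * (rr_cd - 1) + 1)

def adjusted_rr(observed_rr, p1, p0, rr_cd):
    """Risk ratio corrected for the hypothesised unmeasured confounder."""
    return observed_rr / bias_factor(p1, p0, rr_cd)

# Observed RR of 2.0; confounder twice as common among the exposed
# (50% vs 25%) and itself tripling disease risk (RR = 3)
print(adjusted_rr(2.0, 0.5, 0.25, 3.0))  # 1.5
```

Recomputing the adjusted estimate over a grid of (p1, p0, rr_cd) values is exactly the kind of what-if exploration a spreadsheet makes convenient.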
Abstract:
LLF (Least Laxity First) scheduling, which assigns a higher priority to a task with smaller laxity, is known to be an optimal preemptive scheduling algorithm on a single-processor platform. However, little work has been done to illuminate its characteristics on multiprocessor platforms. In this paper, we identify the dynamics of laxity from the system's viewpoint and translate these dynamics into LLF multiprocessor schedulability analysis. More specifically, we first characterize laxity properties under LLF scheduling, focusing on the laxity dynamics associated with a deadline miss. These laxity dynamics describe a lower bound, which leads to the deadline miss, on the number of tasks with certain laxity values at certain time instants. This lower bound is significant because it represents an invariant for highly dynamic system parameters (laxity values). Since the laxity of a task depends on the amount of interference from higher-priority tasks, we can then derive a set of conditions to check whether a given task system can enter the laxity dynamics leading towards a deadline miss. In this way, to the best of the authors' knowledge, we propose the first LLF multiprocessor schedulability test based on LLF's own laxity properties. We also develop an improved schedulability test that exploits slack values. We mathematically prove that the proposed LLF tests dominate the state-of-the-art EDZL tests. We also present simulation results to evaluate, in a quantitative manner, the schedulability performance of both the original and the improved LLF tests.
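The laxity computation at the heart of LLF can be sketched in a few lines of Python (the task representation and values are illustrative):

```python
def laxity(task, t):
    """Laxity at time t: time remaining until the deadline minus
    remaining execution time. Zero laxity means the task must run now."""
    return task["deadline"] - t - task["remaining"]

def llf_pick(tasks, t, m):
    """LLF on m processors: run the m ready tasks with the smallest laxity."""
    ready = [tk for tk in tasks if tk["remaining"] > 0]
    ready.sort(key=lambda tk: laxity(tk, t))
    return ready[:m]

tasks = [
    {"name": "T1", "deadline": 10, "remaining": 6},
    {"name": "T2", "deadline": 8,  "remaining": 3},
    {"name": "T3", "deadline": 12, "remaining": 2},
]
# At t = 0 the laxities are 4, 5 and 10, so T1 and T2 win the two processors
print([tk["name"] for tk in llf_pick(tasks, 0, 2)])  # ['T1', 'T2']
```

Because a waiting task's laxity shrinks every tick while a running task's stays constant, priorities keep changing over time; the paper's analysis bounds how many tasks can simultaneously reach small laxity values before a miss becomes unavoidable.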
Abstract:
This paper studies the information content of the chromosomes of twenty-three species. Several statistics considering different numbers of bases for alphabet character encoding are derived. Based on the resulting histograms, word delimiters and character relative frequencies are identified. This knowledge allows moving along each chromosome while evaluating the flow of characters and words. The resulting flux of information is captured by means of Shannon entropy. The results are explored from the perspective of power-law relationships, allowing a quantitative evaluation of the DNA of the species.
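The "flow" of information along a chromosome can be sketched as a sliding-window Shannon entropy; the Python below is a minimal illustration, with window and step sizes that are assumptions rather than the paper's parameters.

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Shannon entropy (bits per symbol) of a character sequence."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def entropy_flow(chromosome, window=6, step=3):
    """Slide a window along the sequence and record the local entropy,
    capturing the flow of characters the abstract describes."""
    return [shannon_entropy(chromosome[i:i + window])
            for i in range(0, len(chromosome) - window + 1, step)]

print(shannon_entropy("ACGTACGT"))        # 2.0 bits: four equally likely bases
print(entropy_flow("AAAAAACGTACGTACGT"))  # low entropy first, then higher
```

Plotting these local entropies against position, or fitting them against window size, is the kind of quantity the power-law exploration would operate on.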
Abstract:
This paper studies the chromosome information of twenty-five species, namely mammals, fishes, birds, insects, nematodes, fungi, and one plant. A quantifying scheme inspired by the state-space representation of dynamical systems is formulated. Based on this algorithm, the information of each chromosome is converted into a bidimensional distribution. The plots are then analyzed and characterized by means of Shannon entropy. The large volume of information is integrated by averaging the lengths and entropy quantities for each species. The results can be easily visualized, revealing quantitative global genomic information.
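One possible state-space-style encoding (an illustrative sketch, not necessarily the paper's exact scheme) maps each pair of successive bases onto a cell of a 4x4 grid, giving a bidimensional distribution whose Shannon entropy characterizes the sequence.

```python
import math
from collections import Counter

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

def pair_distribution(seq):
    """Map each pair of successive bases to a cell of a 4x4 grid,
    yielding a bidimensional probability distribution over cells."""
    cells = Counter((BASES[a], BASES[b]) for a, b in zip(seq, seq[1:]))
    total = sum(cells.values())
    return {cell: c / total for cell, c in cells.items()}

def distribution_entropy(dist):
    """Shannon entropy (bits) of the 2D cell distribution."""
    return -sum(p * math.log2(p) for p in dist.values())

dist = pair_distribution("ACGTACGTACGT")
print(distribution_entropy(dist))
```

Averaging such per-chromosome entropies (and sequence lengths) per species is then a direct way to compress the genome-scale information into a single comparable plot.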
Abstract:
This study describes the change of the ultraviolet spectral bands from 0.1 to 5.0 nm slit width over the spectral range of 200–400 nm. The analysis of the spectral bands is carried out using the multidimensional scaling (MDS) approach to reach the latent spectral background. This approach indicates that a 0.1 nm slit width gives higher-order noise together with better spectral detail, whereas a 5.0 nm slit width gives higher peak amplitude and lower-order noise together with poorer spectral detail. Under these conditions, the main problem is to find the relationship between the spectral band properties and the slit width. For this aim, the MDS tool is used to recognize the hidden information in the ultraviolet spectra of sildenafil citrate, recorded with a Shimadzu UV–VIS 2550 double monochromator instrument. The proposed mathematical approach gives rich findings for the efficient use of the spectrophotometer in qualitative and quantitative studies.