Abstract:
This paper presents five different clustering methods for identifying typical load profiles of medium voltage (MV) electricity consumers. These methods are intended for use in a smart grid environment to extract useful knowledge about customers' behaviour. The knowledge obtained can support a decision tool, not only for utilities but also for consumers. Utilities can use load profiles to identify the aspects that cause system load peaks and to develop specific contracts with their customers. The framework presented throughout the paper consists of several steps, namely the data pre-processing phase, the application of clustering algorithms, and the evaluation of the quality of the partition, which is supported by cluster validity indices. The process ends with the analysis of the discovered knowledge. To validate the proposed framework, a case study with a real database of 208 MV consumers is used.
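The clustering-plus-validation step of such a framework can be sketched in a few lines. The abstract does not name the five algorithms or the validity indices used, so this minimal example assumes k-means as the clustering algorithm and the Davies-Bouldin index as the validity measure; both choices, and all variable names, are illustrative rather than the authors' implementation.

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain k-means: returns cluster labels and centroids.
    Each row of X is one consumer's (pre-processed) load profile."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assign each profile to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

def davies_bouldin(X, labels, centroids):
    """Davies-Bouldin validity index: lower values indicate a better partition."""
    k = len(centroids)
    # average within-cluster scatter
    s = np.array([np.linalg.norm(X[labels == j] - centroids[j], axis=1).mean()
                  for j in range(k)])
    db = 0.0
    for i in range(k):
        ratios = [(s[i] + s[j]) / np.linalg.norm(centroids[i] - centroids[j])
                  for j in range(k) if j != i]
        db += max(ratios)
    return db / k
```

In practice the index would be computed for a range of k values (and for each candidate algorithm), and the partition with the best index retained for the knowledge-analysis step.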
Abstract:
TLE in infancy has been the subject of varied research. Topographical and structural evidence coincides with the neuronal systems responsible for auditory processing of the highest specialization and complexity. Recent studies have shown the need for hemispheric asymmetry to optimize central auditory processing (CAP) and the acquisition and learning of a language system. A new functional research paradigm, such as event-related potentials (ERPs), is required to study mental processes in which cognitive-sensory information is analysed over very short periods of time (milliseconds). Thus, in this article, we hypothesize that TLE in infancy could be a good model for the topographic and functional study of CAP and its developmental process, contributing to a better understanding of the learning difficulties of children with this neurological disorder.
Abstract:
Coronary artery disease (CAD) is currently one of the most prevalent diseases in the world population, and calcium deposits in coronary arteries are a direct risk factor. These can be assessed with the calcium score (CS) application, available via a computed tomography (CT) scan, which gives an accurate indication of the development of the disease. However, the ionising radiation applied to patients is high. This study aimed to optimise the acquisition protocol in order to reduce the radiation dose, and to explain the flow of procedures used to quantify CAD. The main differences in the clinical results when automated or semi-automated post-processing is used will be shown, and the epidemiology, imaging, risk factors and prognosis of the disease described. The software steps and the values that allow the risk of developing CAD to be predicted will be presented. A 64-row multidetector CT scanner with dual source and two phantoms (pig hearts) were used to demonstrate the advantages and disadvantages of the Agatston method. The tube energy was balanced. Two measurements were obtained in each of the three experimental protocols (64, 128 and 256 mAs). Considerable changes appeared between the CS values relating to the protocol variation. The predefined standard protocol provided the lowest radiation dose (0.43 mGy). This study found that the variation in radiation dose between protocols, taking into consideration the dose control systems attached to the CT equipment and image quality, was not sufficient to justify changing the default protocol provided by the manufacturer.
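The Agatston method mentioned above scores each calcified lesion as its area multiplied by a density weight derived from the lesion's peak attenuation (1 for 130-199 HU, 2 for 200-299 HU, 3 for 300-399 HU, 4 for 400 HU and above), summed over all lesions and slices. A simplified single-slice sketch follows; the 1 mm² minimum-area filter and the hand-written connected-component labelling are illustrative choices, not the software described in the study.

```python
import numpy as np

def agatston_weight(max_hu):
    """Density weight from the peak attenuation of a lesion (Agatston scheme)."""
    if max_hu >= 400: return 4
    if max_hu >= 300: return 3
    if max_hu >= 200: return 2
    if max_hu >= 130: return 1
    return 0

def agatston_score(slice_hu, pixel_area_mm2, threshold=130, min_area_mm2=1.0):
    """Score one axial slice: sum over lesions of (area * density weight).
    `slice_hu` is a 2-D array of Hounsfield units; lesions are 4-connected
    components of pixels at or above the 130 HU threshold."""
    mask = slice_hu >= threshold
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    # simple flood-fill connected-component labelling
    for seed in zip(*np.nonzero(mask)):
        if labels[seed]:
            continue
        current += 1
        stack = [seed]
        labels[seed] = current
        while stack:
            r, c = stack.pop()
            for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                        and mask[rr, cc] and not labels[rr, cc]):
                    labels[rr, cc] = current
                    stack.append((rr, cc))
    score = 0.0
    for lesion in range(1, current + 1):
        pixels = labels == lesion
        area = pixels.sum() * pixel_area_mm2
        if area >= min_area_mm2:  # ignore sub-millimetre specks (noise)
            score += area * agatston_weight(slice_hu[pixels].max())
    return score
```

A full examination sums this per-slice score over all slices covering the heart, which is the value the automated and semi-automated post-processing workflows report.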
Abstract:
Beyond the classical statistical approaches (determination of basic statistics, regression analysis, ANOVA, etc.), a new set of applications of different statistical techniques has increasingly gained relevance in the analysis, processing and interpretation of data concerning the characteristics of forest soils, as can be seen in some recent publications in the context of Multivariate Statistics. These new methods require additional care that is not always taken, or even mentioned, in some approaches. In the particular case of geostatistical applications it is necessary, besides geo-referencing all the acquired data, to collect the samples in regular grids and in sufficient quantity so that the variograms can reflect the spatial distribution of soil properties in a representative manner. The great majority of Multivariate Statistics techniques (Principal Component Analysis, Correspondence Analysis, Cluster Analysis, etc.), although in most cases they do not require the assumption of a normal distribution, nevertheless need a proper and rigorous strategy for their use. In this work, some reflections on these methodologies are presented, in particular on the main constraints that often occur during the data collection process and on the various possibilities of linking these different techniques. Finally, illustrations of some particular applications of these statistical methods are also presented.
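As a concrete illustration of one of the techniques mentioned, the sketch below runs Principal Component Analysis on standardized variables, i.e. via the correlation matrix, which is one way of avoiding scale effects without assuming normality of the soil properties. The data layout (one row per sample, one column per measured property) and all names are hypothetical.

```python
import numpy as np

def pca(X):
    """PCA via the correlation matrix of the columns of X.
    Returns eigenvalues (variance carried by each component, summing to
    the number of variables) and the corresponding loading vectors."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)  # standardize each variable
    corr = np.cov(Z, rowvar=False)                    # correlation matrix
    eigval, eigvec = np.linalg.eigh(corr)
    order = eigval.argsort()[::-1]                    # sort by explained variance
    return eigval[order], eigvec[:, order]
```

With strongly correlated soil properties (e.g. two measures of organic matter content) the first eigenvalue dominates, and the loadings show which original variables drive that component; the same eigenvalue profile is often used to decide how many components to keep.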
Abstract:
Dissertation submitted to obtain the Master's Degree in Computer Engineering
Abstract:
The development and standardization of reliable methods for detecting Mycobacterium tuberculosis in clinical samples is an important goal in laboratories throughout the world. In this work, lung and spleen fragments from a patient who died with a diagnosis of miliary tuberculosis were used to evaluate the influence of the type of fixative, as well as of the fixation and paraffin-embedding protocols, on PCR performance in paraffin-embedded specimens. Tissue fragments were fixed for 4 h to 48 h, using either 10% non-buffered or 10% buffered formalin, and embedded in pure paraffin or paraffin mixed with beeswax. Specimens were submitted to PCR for amplification of the human beta-actin gene and, separately, for amplification of the insertion sequence IS6110, specific to the M. tuberculosis complex. Amplification of the beta-actin gene was positive in all samples. No amplicons were generated by PCR-IS6110 when lung tissue fragments were fixed in 10% non-buffered formalin and embedded in paraffin containing beeswax. In conclusion, combined inhibitory factors interfere with the detection of M. tuberculosis in stored material, and it is important to control these factors in order to implement molecular diagnosis in pathology laboratories.
Abstract:
Thesis submitted in fulfillment of the requirements for the Degree of Master in Biomedical Engineering
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
Dissertation submitted to obtain the Master's Degree in Mechanical Engineering
Abstract:
For years, the silk fibroin of the domestic silkworm, Bombyx mori, has been recognized as a valuable material and extensively used. In recent decades, new application fields have been emerging for this versatile material, and the final, specific applications of silk dictate the way it is processed in industry and research. This review focuses on describing various approaches to laboratory-scale silk downstream processing, which fall into several categories. A detailed description of workflow possibilities, from the naturally found material to a finally formulated product, is presented. Considerable attention is given to (bio-)chemical approaches to silk fibroin transformation, particularly to its enzyme-driven modifications. The current literature survey focuses exclusively on methods applied in research, not in industry.
Abstract:
Biosignal processing, identification of biological nonlinear and time-varying systems, recognition of electromyographic signals, pattern classification, fuzzy logic and neural network methods
Abstract:
In this work we report four different destructive and non-destructive methods for detecting picorna-like virus particles in triatomines. The methods are based on direct observation under a transmission electron microscope and consist of four ways of preparing samples of presumably infected material: processing dead or live insect parts, or dry or fresh insect feces. The methods can be used as analytical or preparative techniques, both for quantifying virus infection and for checking virus integrity. Here, the four methods are applied to detect Triatoma virus (TrV) particles in T. infestans colonies.
Abstract:
Oscillations have been increasingly recognized as a core property of neural responses, contributing to spontaneous, induced, and evoked activities within and between individual neurons and neural ensembles. They are considered a prominent mechanism for information processing within, and communication between, brain areas. More recently, it has been proposed that interactions between periodic components at different frequencies, known as cross-frequency couplings, may support the integration of neuronal oscillations at different temporal and spatial scales. The present study details methods based on an adaptive frequency tracking approach that improve the quantification and statistical analysis of oscillatory components and cross-frequency couplings. This approach allows for time-varying instantaneous frequency, which is particularly important when measuring phase interactions between components. We compared this adaptive approach to traditional band-pass filters in their measurement of phase-amplitude and phase-phase cross-frequency couplings. Evaluations were performed with synthetic signals and with EEG data recorded from healthy humans performing an illusory contour discrimination task. First, the synthetic signals, in conjunction with Monte Carlo simulations, highlighted two desirable features of the proposed algorithm versus classical filter-bank approaches: resilience to broad-band noise and to oscillatory interference. Second, the analyses with real EEG signals revealed statistically more robust effects (i.e. improved sensitivity) when using the adaptive frequency tracking framework, particularly when identifying phase-amplitude couplings. This was further confirmed after generating surrogate signals from the real EEG data. Adaptive frequency tracking appears to improve the measurement of cross-frequency couplings through precise extraction of neuronal oscillations.
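The adaptive frequency tracking algorithm itself is beyond a short sketch, but the phase-amplitude coupling measure it feeds can be illustrated with the classical alternative the study compares against: band-pass filtering, extraction of phase and amplitude from the analytic signal, and summarizing the coupling with the mean vector length. The FFT-domain filter, band edges and test signals below are illustrative and not the authors' implementation.

```python
import numpy as np

def analytic(x):
    """Analytic signal via the FFT (same idea as a Hilbert transformer)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    h[1:(n + 1) // 2] = 2
    if n % 2 == 0:
        h[n // 2] = 1
    return np.fft.ifft(X * h)

def bandpass(x, fs, lo, hi):
    """Crude FFT-domain band-pass filter (a stand-in for a filter bank,
    not an adaptive frequency tracker)."""
    freqs = np.fft.fftfreq(len(x), 1 / fs)
    X = np.fft.fft(x)
    X[(np.abs(freqs) < lo) | (np.abs(freqs) > hi)] = 0
    return np.fft.ifft(X).real

def pac_mvl(x, fs, phase_band, amp_band):
    """Mean-vector-length phase-amplitude coupling:
    |mean(amp_high * exp(i * phase_low))| normalized by the mean amplitude."""
    phase = np.angle(analytic(bandpass(x, fs, *phase_band)))
    amp = np.abs(analytic(bandpass(x, fs, *amp_band)))
    return np.abs(np.mean(amp * np.exp(1j * phase))) / amp.mean()
```

A gamma burst whose amplitude rides on the theta phase yields a large value, while the same two rhythms with a constant gamma envelope yield a value near zero; the adaptive approach replaces the fixed band-pass stage with components whose instantaneous frequency follows the data.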
Abstract:
Brain perfusion can be assessed by CT and MR. For CT, two major techniques are used. First, xenon CT is an equilibrium technique based on a freely diffusible tracer. The first pass of iodinated contrast injected intravenously is a second, more widely available method. Both methods have proven to be robust and quantitative, thanks to the linear relationship between contrast concentration and x-ray attenuation. For the CT methods, concerns regarding the x-ray doses delivered to patients need to be addressed. MR can also assess brain perfusion using the first pass of a gadolinium-based contrast agent injected intravenously. This method has to be considered semi-quantitative because of the non-linear relationship between contrast concentration and MR signal changes. Arterial spin labelling is another MR method that assesses brain perfusion without injection of contrast: the blood flowing in the carotids is magnetically labelled by an external radiofrequency pulse and observed during its first pass through the brain. Each of these various CT and MR techniques has advantages and limits that will be illustrated and summarized. Learning objectives: (1) to understand and compare the different techniques for brain perfusion imaging; (2) to learn about the methods of acquisition and post-processing of brain perfusion by first pass of contrast agent for CT and MR; (3) to learn about non-contrast MR methods (arterial spin labelling).
Abstract:
Human electrophysiological studies support a model whereby sensitivity to so-called illusory contour stimuli is first seen within the lateral occipital complex. A challenge to this model posits that the lateral occipital complex is a general site for crude region-based segmentation, based on findings of equivalent hemodynamic activations in the lateral occipital complex to illusory contour and so-called salient region stimuli, a stimulus class that lacks the classic bounding contours of illusory contours. Using high-density electrical mapping of visual evoked potentials, we show that early lateral occipital cortex activity is substantially stronger to illusory contour than to salient region stimuli, whereas later lateral occipital complex activity is stronger to salient region than to illusory contour stimuli. Our results suggest that equivalent hemodynamic activity to illusory contour and salient region stimuli probably reflects temporally integrated responses, a result of the poor temporal resolution of hemodynamic imaging. The temporal precision of visual evoked potentials is critical for establishing viable models of completion processes and visual scene analysis. We propose that crude spatial segmentation analyses, which are insensitive to illusory contours, occur first within dorsal visual regions, not the lateral occipital complex, and that initial illusory contour sensitivity is a function of the lateral occipital complex.