876 results for bigdata, data stream processing, dsp, apache storm, cyber security


Relevance: 30.00%

Abstract:

This work proposes a method based on preprocessing and data mining to identify harmonic current sources among residential consumers. The methodology can also be applied to identify linear and nonlinear loads. It should be emphasized that the entire database was obtained through laboratory tests, i.e., real data were acquired from residential loads. The residential system assembled in the laboratory was fed by a configurable power source, with the loads and power quality analyzers placed at its output (all measurements were stored on a microcomputer). The data were then submitted to preprocessing based on attribute selection techniques in order to reduce the complexity of load identification. A new database was generated retaining only the selected attributes, and Artificial Neural Networks were trained to perform the load identification. To validate the proposed methodology, the loads were fed both under ideal conditions (without harmonics) and with harmonic voltages within pre-established limits. These limits are in accordance with IEEE Std. 519-1992 and PRODIST (the energy distribution procedures adopted by Brazilian utilities). The results validate the proposed methodology and furnish a method that can serve as an alternative to conventional approaches.
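The attribute-selection stage described above can be sketched as a simple filter ranking: score each candidate attribute by how well it separates the load classes and keep the best ones before training a classifier. The Fisher-style score, feature names (`I_h3`, `I_h5`, `power_factor`), and sample values below are illustrative assumptions, not the paper's actual attributes or measurements.

```python
# Sketch of the attribute-selection stage: rank candidate attributes by a
# Fisher-style score (between-class vs. within-class variance) and keep
# the top-k, reducing the feature set before training a classifier.
# All feature names and sample values here are hypothetical.

from statistics import mean, pvariance

def fisher_score(values_by_class):
    """values_by_class: {class_label: [attribute values]}"""
    grand = mean(v for vals in values_by_class.values() for v in vals)
    between = sum(len(v) * (mean(v) - grand) ** 2 for v in values_by_class.values())
    within = sum(len(v) * pvariance(v) for v in values_by_class.values())
    return between / within if within else float("inf")

def select_attributes(samples, labels, names, k):
    """Keep the k attributes that best separate the load classes."""
    scores = []
    for j, name in enumerate(names):
        by_class = {}
        for row, lab in zip(samples, labels):
            by_class.setdefault(lab, []).append(row[j])
        scores.append((fisher_score(by_class), name))
    return [name for _, name in sorted(scores, reverse=True)[:k]]

# Hypothetical measurements: 3rd/5th harmonic current and power factor
names = ["I_h3", "I_h5", "power_factor"]
samples = [[0.02, 0.01, 0.99], [0.03, 0.01, 0.98],   # linear loads
           [0.35, 0.22, 0.61], [0.40, 0.25, 0.58]]   # nonlinear loads
labels = ["linear", "linear", "nonlinear", "nonlinear"]
print(select_attributes(samples, labels, names, k=2))
```

The reduced attribute set would then feed the neural-network training step; any filter criterion with the same shape (score per attribute, keep top-k) slots into this skeleton.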

Relevance: 30.00%

Abstract:

Electromagnetic suspension systems are inherently nonlinear and often face hardware limitations when digitally controlled. The main contributions of this paper are: the design of a nonlinear H(infinity) controller, including dynamic weighting functions, applied to a large-gap electromagnetic suspension system; and the presentation of a procedure to implement this controller on a fixed-point DSP, through a methodology able to translate a floating-point algorithm into a fixed-point algorithm by using l(infinity)-norm minimization of the error due to conversion. Experimental results are also presented, in which the performance of the nonlinear controller is evaluated specifically in the initial suspension phase. (C) 2009 Elsevier Ltd. All rights reserved.
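The core of the floating-to-fixed-point translation problem can be illustrated with a minimal sketch: represent each controller coefficient in a fixed-point (Q-format) word and measure the worst-case, l-infinity, conversion error, which is the quantity such a procedure seeks to keep small. The coefficient values below are hypothetical, not taken from the paper, and plain rounding stands in for the paper's optimization.

```python
# Sketch of float-to-fixed conversion: represent controller coefficients
# with 15 fractional bits (Q-format) and measure the worst-case
# (l-infinity) conversion error. Plain nearest-value rounding is used
# here as a stand-in; the paper minimizes this norm more carefully.
# Coefficient values are hypothetical.

def to_fixed(x, frac_bits=15):
    """Round to the nearest value representable with frac_bits fraction bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

coeffs = [0.9715, -1.84321, 0.87254]          # hypothetical controller gains
quantized = [to_fixed(c) for c in coeffs]
linf_error = max(abs(c - q) for c, q in zip(coeffs, quantized))
print(quantized, linf_error)                  # error is at most half an LSB
```

Nearest-value rounding already bounds the per-coefficient error at half of one least significant bit (1/2^16 here); the value of an optimization-based translation is in controlling how these small errors accumulate through the closed loop.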

Relevance: 30.00%

Abstract:

We hypothesized that the processing of auditory information by the perisylvian polymicrogyric cortex may differ from that of normal cortex. To characterize auditory processing in bilateral perisylvian syndrome, we examined ten patients with perisylvian polymicrogyria (Group I) and seven control children (Group II). Group I was composed of four children with bilateral perisylvian polymicrogyria and six children with bilateral posterior perisylvian polymicrogyria. The evaluation included neurological and neuroimaging investigation, intellectual quotient, and audiological assessment (audiometry and behavioral auditory tests). The results revealed a statistically significant difference between the groups in the behavioral auditory tests, such as the digits dichotic test, the nonverbal dichotic test (specifically in right attention), and the random gap detection/random gap detection expanded tests. Our data showed abnormalities in the auditory processing of children with perisylvian polymicrogyria, suggesting that the perisylvian polymicrogyric cortex is functionally abnormal. We also found a correlation between the severity of the auditory findings and the extent of the cortical abnormality. (C) 2009 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

Traditionally the basal ganglia have been implicated in motor behavior, as they are involved in both the execution of automatic actions and the modification of ongoing actions in novel contexts. With respect to cognition, however, the role of the basal ganglia has not been defined as explicitly. Relative to linguistic processes, contemporary theories of subcortical participation in language have endorsed a role for the globus pallidus internus (GPi) in the control of lexical-semantic operations. However, attempts to empirically validate these postulates have been largely limited to neuropsychological investigations of verbal fluency abilities subsequent to pallidotomy. We evaluated the impact of bilateral posteroventral pallidotomy (BPVP) on language function across a range of general and high-level linguistic abilities, and validated/extended working theories of pallidal participation in language. Comprehensive linguistic profiles were compiled up to 1 month before and 3 months after BPVP in 6 subjects with Parkinson's disease (PD). Commensurate linguistic profiles were also gathered over a 3-month period for a nonsurgical control cohort of 16 subjects with PD and a group of 16 non-neurologically impaired controls (NC). Nonparametric between-groups comparisons were conducted and reliable change indices calculated, relative to baseline/3-month follow-up difference scores. Group-wise statistical comparisons between the three groups failed to reveal significant postoperative changes in language performance. Case-by-case data analysis relative to clinically consequential change indices revealed reliable alterations in performance across several language variables as a consequence of BPVP. These findings lend support to models of subcortical participation in language, which promote a role for the GPi in lexical-semantic manipulation mechanisms. 
Concomitant improvements and decrements in postoperative performance were interpreted within the context of additive and subtractive postlesional effects. Relative to parkinsonian cohorts, clinically reliable versus statistically significant changes on a case-by-case basis may provide the most accurate method of characterizing the way in which pathophysiologically divergent basal ganglia linguistic circuits respond to BPVP.
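The case-by-case analysis above hinges on a reliable change index (RCI). A standard Jacobson-Truax formulation is sketched below, scaling the observed pre/post change by the standard error of a difference score; the study may use a variant, and every number here (scores, baseline SD, test-retest reliability) is hypothetical.

```python
# Reliable change index, Jacobson-Truax style: an observed change is
# "reliable" when it exceeds what measurement error alone would produce.
# All input values below are hypothetical, not the study's data.

from math import sqrt

def reliable_change_index(score_pre, score_post, sd_baseline, retest_r):
    """RCI = observed change / standard error of the difference score."""
    sem = sd_baseline * sqrt(1 - retest_r)   # standard error of measurement
    s_diff = sqrt(2 * sem ** 2)              # SE of a pre/post difference
    return (score_post - score_pre) / s_diff

rci = reliable_change_index(score_pre=42, score_post=35,
                            sd_baseline=6.0, retest_r=0.85)
print(round(rci, 2), abs(rci) > 1.96)        # beyond chance at p < .05?
```

With a less reliable instrument (smaller `retest_r`) the same raw change yields a smaller RCI, which is why clinically reliable and statistically significant group changes can diverge, as the abstract notes.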

Relevance: 30.00%

Abstract:

The use of computational fluid dynamics simulations for calibrating a flush air data system is described. In particular, the flush air data system of the HYFLEX hypersonic vehicle is used as a case study. The HYFLEX air data system consists of nine pressure ports located flush with the vehicle nose surface, connected to onboard pressure transducers. After appropriate processing, surface pressure measurements can be converted into useful air data parameters. The processing algorithm requires an accurate pressure model, which relates air data parameters to the measured pressures. In the past, such pressure models have been calibrated using combinations of flight data, ground-based experimental results, and numerical simulation. We perform a calibration of the HYFLEX flush air data system using computational fluid dynamics simulations exclusively. The simulations are used to build an empirical pressure model that accurately describes the HYFLEX nose pressure distribution over a range of flight conditions. We believe that computational fluid dynamics provides a quick and inexpensive way to calibrate the air data system and is applicable to a broad range of flight conditions. When tested with HYFLEX flight data, the calibrated system is found to work well. It predicts vehicle angle of attack and angle of sideslip to accuracy levels that generally satisfy flight control requirements. Dynamic pressure is predicted to within the resolution of the onboard inertial measurement unit. We find that wind-tunnel experiments and flight data are not necessary to accurately calibrate the HYFLEX flush air data system for hypersonic flight.
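The inversion step described above, from port pressures to air data parameters, can be illustrated with a toy model: assume pressures at ports of known surface angle follow a simple cos-squared (Newtonian-type) law, then search for the angle of attack that best reproduces the measurements. This is a schematic stand-in for the calibrated empirical model in the abstract; the port layout, flight state, and model form are all assumptions.

```python
# Toy illustration of flush air data processing: given pressures at ports
# with known surface angles, recover angle of attack by fitting a simple
# cos^2 (Newtonian-type) pressure model. Port angles and flight state are
# hypothetical; the real system uses a CFD-calibrated empirical model.

from math import cos, radians

def model(theta_deg, alpha_deg, q, p_inf):
    """Pressure at a port whose surface normal is theta_deg off the nose axis."""
    return p_inf + q * cos(radians(theta_deg - alpha_deg)) ** 2

port_angles = [-30.0, -15.0, 0.0, 15.0, 30.0]     # hypothetical port layout (deg)
true_alpha, q, p_inf = 8.0, 5000.0, 1200.0        # hypothetical flight state
measured = [model(t, true_alpha, q, p_inf) for t in port_angles]

# Grid-search the angle of attack that best reproduces the measurements.
best_alpha = min((a / 10 for a in range(-200, 201)),
                 key=lambda a: sum((model(t, a, q, p_inf) - p) ** 2
                                   for t, p in zip(port_angles, measured)))
print(best_alpha)    # recovers the true angle of attack, 8.0
```

A real flush air data algorithm solves simultaneously for angle of attack, sideslip, and dynamic pressure; the least-squares-over-a-pressure-model structure is the same.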

Relevance: 30.00%

Abstract:

Intelligence (IQ) can be seen as the efficiency of mental processes or cognition, as can basic information processing (IP) tasks like those used in our ongoing Memory, Attention and Problem Solving (MAPS) study. Measures of IQ and IP are correlated and both have a genetic component, so we are studying how the genetic variance in IQ is related to the genetic variance in IP. We measured intelligence with five subscales of the Multidimensional Aptitude Battery (MAB). The IP tasks included four variants of choice reaction time (CRT) and a visual inspection time (IT). The influence of genetic factors on the variances in each of the IQ, IP, and IT tasks was investigated in 250 identical and nonidentical twin pairs aged 16 years. For a subset of 50 pairs we have test–retest data that allow us to estimate the stability of the measures. MX was used for a multivariate genetic analysis that addresses whether the variance in IQ and IP measures is possibly mediated by common genetic factors. Analyses that show the modeled genetic and environmental influences on these measures of cognitive efficiency will be presented and their relevance to ideas on intelligence will be discussed.
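The classical-twin logic behind the study can be shown in miniature: genetic influence is inferred from how much more alike identical (MZ) twins are than non-identical (DZ) twins. Falconer's formula gives the quick back-of-the-envelope estimate; the study itself fits a fuller multivariate model in Mx. The correlations below are hypothetical.

```python
# Falconer's twin-study estimates: heritability from the gap between
# MZ and DZ twin correlations, shared environment from their overlap.
# The correlations here are hypothetical illustration values.

def falconer_h2(r_mz, r_dz):
    """Broad heritability estimate: h^2 = 2 * (r_MZ - r_DZ)."""
    return 2 * (r_mz - r_dz)

h2_iq = falconer_h2(r_mz=0.82, r_dz=0.49)   # hypothetical IQ twin correlations
shared_env = 2 * 0.49 - 0.82                # c^2 = 2*r_DZ - r_MZ
print(round(h2_iq, 2), round(shared_env, 2))
```

A multivariate model such as the one fitted in Mx generalizes this idea across several measures at once, asking whether the same latent genetic factor drives the covariance between IQ, CRT, and IT rather than estimating each trait in isolation.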

Relevance: 30.00%

Abstract:

The compound eyes of mantis shrimps, a group of tropical marine crustaceans, incorporate principles of serial and parallel processing of visual information that may be applicable to artificial imaging systems. Their eyes include numerous specializations for analysis of the spectral and polarizational properties of light, and include more photoreceptor classes for analysis of ultraviolet light, color, and polarization than occur in any other known visual system. This is possible because receptors in different regions of the eye are anatomically diverse and incorporate unusual structural features, such as spectral filters, not seen in other compound eyes. Unlike the eyes of most other animals, eyes of mantis shrimps must move to acquire some types of visual information and to integrate color and polarization with spatial vision. Information leaving the retina appears to be processed into numerous parallel data streams leading into the central nervous system, greatly reducing the analytical requirements at higher levels. Many of these unusual features of mantis shrimp vision may inspire new sensor designs for machine vision.

Relevance: 30.00%

Abstract:

With the advent of functional neuroimaging techniques, in particular functional magnetic resonance imaging (fMRI), we have gained greater insight into the neural correlates of visuospatial function. However, it may not always be easy to identify the cerebral regions most specifically associated with performance on a given task. One approach is to examine the quantitative relationships between regional activation and behavioral performance measures. In the present study, we investigated the functional neuroanatomy of two different visuospatial processing tasks, judgement of line orientation and mental rotation. Twenty-four normal participants were scanned with fMRI using blocked periodic designs for experimental task presentation. Accuracy and reaction time (RT) to each trial of both activation and baseline conditions in each experiment was recorded. Both experiments activated dorsal and ventral visual cortical areas as well as dorsolateral prefrontal cortex. More regionally specific associations with task performance were identified by estimating the association between (sinusoidal) power of functional response and mean RT to the activation condition; a permutation test based on spatial statistics was used for inference. There was significant behavioral-physiological association in right ventral extrastriate cortex for the line orientation task and in bilateral (predominantly right) superior parietal lobule for the mental rotation task. Comparable associations were not found between power of response and RT to the baseline conditions of the tasks. These data suggest that one region in a neurocognitive network may be most strongly associated with behavioral performance and this may be regarded as the computationally least efficient or rate-limiting node of the network.
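The behavioral-physiological association above is tested non-parametrically: correlate the functional response measure with mean RT across participants, then build a null distribution by permuting the pairing. The minimal (non-spatial) permutation test below conveys the idea; the study's actual test uses spatial statistics, and all data here are synthetic.

```python
# Minimal permutation test for a brain-behavior correlation: shuffle the
# pairing between response power and reaction time to build a null
# distribution for |r|. The data are synthetic; the study's actual test
# is a spatial-statistic variant of this idea.

import random

def pearson(a, b):
    n = len(a); ma = sum(a) / n; mb = sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a); vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

random.seed(1)
power = [random.gauss(0, 1) for _ in range(24)]          # 24 participants
rt = [2.0 * p + random.gauss(0, 0.5) for p in power]     # RT tracks power

observed = abs(pearson(power, rt))
null = []
for _ in range(999):
    shuffled = rt[:]
    random.shuffle(shuffled)
    null.append(abs(pearson(power, shuffled)))
p_value = (1 + sum(n >= observed for n in null)) / 1000   # add-one estimator
print(observed, p_value)
```

Because the null distribution is built from the data themselves, no parametric assumption about the correlation's sampling distribution is needed, which is why this family of tests suits small neuroimaging samples.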

Relevance: 30.00%

Abstract:

Antigen recognition by cytotoxic CD8 T cells is dependent upon a number of critical steps in MHC class I antigen processing including proteasomal cleavage, TAP transport into the endoplasmic reticulum, and MHC class I binding. Based on extensive experimental data relating to each of these steps there is now the capacity to model individual antigen processing steps with a high degree of accuracy. This paper demonstrates the potential to bring together models of individual antigen processing steps, for example proteasome cleavage, TAP transport, and MHC binding, to build highly informative models of functional pathways. In particular, we demonstrate how an artificial neural network model of TAP transport was used to mine a HLA-binding database so as to identify HLA-binding peptides transported by TAP. This integrated model of antigen processing provided the unique insight that HLA class I alleles apparently constitute two separate classes: those that are TAP-efficient for peptide loading (HLA-B27, -A3, and -A24) and those that are TAP-inefficient (HLA-A2, -B7, and -B8). Hence, using this integrated model we were able to generate novel hypotheses regarding antigen processing, and these hypotheses are now capable of being tested experimentally. This model confirms the feasibility of constructing a virtual immune system, whereby each additional step in antigen processing is incorporated into a single modular model. Accurate models of antigen processing have implications for the study of basic immunology as well as for the design of peptide-based vaccines and other immunotherapies. (C) 2004 Elsevier Inc. All rights reserved.
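The modular-pipeline idea in the abstract, where candidate peptides must pass each processing step in turn, can be sketched as chained filters. The scoring rules below are deliberately toy stand-ins (simple sequence motifs), not the paper's trained neural-network models, and the thresholds are assumptions.

```python
# Sketch of the modular antigen-processing pipeline: candidate peptides
# pass sequential filters (proteasomal cleavage, TAP transport, MHC
# binding). Each scoring rule below is a toy motif, standing in for the
# paper's trained models; thresholds are hypothetical.

def cleaved(peptide):                # toy proteasome rule: C-terminal L/F/Y
    return peptide[-1] in "LFY"

def tap_score(peptide):              # toy TAP affinity: hydrophobic C-terminus
    return 1.0 if peptide[-1] in "LFYVI" else 0.2

def mhc_binds(peptide):              # toy class I motif: 9-mer, position-2 L/M
    return len(peptide) == 9 and peptide[1] in "LM"

def presented(peptides, tap_threshold=0.5):
    """Keep only peptides surviving every processing step."""
    return [p for p in peptides
            if cleaved(p) and tap_score(p) >= tap_threshold and mhc_binds(p)]

candidates = ["SLYNTVATL", "GILGFVFTL", "AAAAAAAAK", "KLVALGINA"]
print(presented(candidates))
```

The value of the modular structure is that any single stage can be swapped for a better model (for example, replacing the toy TAP rule with the paper's neural network) without touching the rest of the pipeline.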

Relevance: 30.00%

Abstract:

The development of biomonitoring programs based on the macroinvertebrate community requires the understanding of species distribution patterns, as well as of the responses of the community to anthropogenic stressors. In this study, 49 metrics were tested as potential means of assessing the condition of 29 first- and second-order streams located in areas of differing types of land use in São Paulo State, Brazil. Of the sampled streams, 15 were in well-preserved regions in the Atlantic Forest, 5 were among sugarcane cultivations, 5 were in areas of pasture, and 4 were among eucalyptus plantations. The metrics were assessed against the following criteria: (1) predictable response to the impact of human activity; (2) highest taxonomic resolution; and (3) operational and theoretical simplicity. We found that 18 metrics were correlated with the environmental and spatial predictors used, and seven of these satisfied the selection criteria and are thus candidates for inclusion in a multimetric system to assess low-order streams in São Paulo State. These metrics are family richness; Ephemeroptera, Plecoptera and Trichoptera (EPT) richness; proportion of Megaloptera and Hirudinea; proportion of EPT; Shannon diversity index for genus; and adapted Biological Monitoring Working Party biotic index.
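Several of the candidate metrics named above can be computed directly from taxon abundance counts. The sketch below derives EPT richness, proportion of EPT, and the Shannon diversity index from a hypothetical sample; the taxa and counts are illustrative, not the study's data.

```python
# Deriving three of the candidate metrics from taxon abundance counts:
# EPT richness, proportion of EPT individuals, and Shannon diversity.
# The sample taxa and abundances below are hypothetical.

from math import log

EPT_ORDERS = {"Ephemeroptera", "Plecoptera", "Trichoptera"}

def metrics(counts):
    """counts: {(order, genus): abundance}"""
    total = sum(counts.values())
    ept = {g for (order, g), n in counts.items() if order in EPT_ORDERS and n}
    prop_ept = sum(n for (o, _), n in counts.items() if o in EPT_ORDERS) / total
    shannon = -sum((n / total) * log(n / total) for n in counts.values() if n)
    return {"EPT_richness": len(ept), "prop_EPT": prop_ept, "shannon": shannon}

sample = {("Ephemeroptera", "Baetis"): 12, ("Trichoptera", "Smicridea"): 6,
          ("Diptera", "Chironomus"): 30, ("Plecoptera", "Anacroneuria"): 2}
m = metrics(sample)
print(m["EPT_richness"], round(m["prop_EPT"], 2), round(m["shannon"], 3))
```

A multimetric index would then rescale each such metric against reference-condition streams and combine the scores, which is the step the selection criteria in the abstract prepare for.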

Relevance: 30.00%

Abstract:

The settling characteristics of cell debris and inclusion bodies prior to, and following, fractionation in a disc-stack centrifuge were measured using Cumulative Sedimentation Analysis (CSA) and Centrifugal Disc Photosedimentation (CDS). The impact of centrifuge feedrate and repeated homogenisation on both cell debris and inclusion body collection efficiency was investigated. Increasing the normalised centrifuge feedrate (Q/Sigma) from 1.32 x 10(-9) m s(-1) to 3.97 x 10(-9) m s(-1) leads to a 36% increase in inclusion body paste purity. Purity may also be improved by repeated homogenisation. Increasing the number of homogeniser passes results in smaller cell debris size whilst leaving inclusion body size unaltered. At a normalised centrifuge feedrate of 2.65 x 10(-9) m s(-1), increasing the number of homogeniser passes from two (2) to ten (10) improved overall inclusion body paste purity by 58%. Grade-efficiency curves for both the cell debris and inclusion bodies have also been generated in this study. The data are described using an equation developed by Mannweiler (1989) with parameters of k = 0.15-0.26 and n = 2.5-2.6 for inclusion bodies, and k = 0.12-0.14 and n = 2.0-2.2 for cell debris. This is the first accurate experimentally-determined grade-efficiency curve for cell debris. Previous studies have simply estimated debris grade-efficiency curves using an approximate debris size distribution and grade-efficiency curves determined with 'ideal particles' (e.g. spherical PVA particles). The findings of this study may be used to simulate and optimise the centrifugal fractionation of inclusion bodies from cell debris.
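A grade-efficiency curve gives the fraction of particles of each size that the centrifuge collects, obtained by comparing the size distribution entering in the feed with the one leaving in the supernatant: T(d) = 1 - (mass leaving at size d / mass entering at size d). The Mannweiler model fitted in the study is not reproduced here; the size-class masses below are hypothetical.

```python
# Computing a grade-efficiency curve from measured size distributions:
# T(d) = 1 - supernatant mass / feed mass, per size class. Size classes
# (in micrometres) and masses are hypothetical illustration values.

def grade_efficiency(feed, supernatant):
    """feed, supernatant: {size_um: mass in that size class}"""
    return {d: 1.0 - supernatant.get(d, 0.0) / m for d, m in feed.items() if m}

feed        = {0.2: 10.0, 0.4: 20.0, 0.6: 30.0, 0.8: 25.0}
supernatant = {0.2: 9.0, 0.4: 12.0, 0.6: 6.0, 0.8: 1.0}

T = grade_efficiency(feed, supernatant)
print({d: round(t, 2) for d, t in sorted(T.items())})
```

The expected shape appears in the toy numbers: larger particles settle faster and are collected more completely, which is the behavior the fitted k and n parameters summarize.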

Relevance: 30.00%

Abstract:

The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become the aim of interest of many studies in neuroscience. The complex neural network structure and its correlations with brain functions have played a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for the exploration and characterization of the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROI) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROI(s). In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method was to employ multiple eigen-time series in each ROI to avoid temporal information loss during identification of Granger causality. Such information loss is inherent in averaging (e.g., to yield a single "representative" time series per ROI). This, in turn, may lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping. 
By using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted by signal averaging or first principal components estimation from ROIs). The usefulness of the CGA approach in real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI. (c) 2010 Elsevier Inc. All rights reserved.
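The Granger idea underlying CGA can be shown in its simplest, single-lag, two-series form: x "Granger-causes" y if past values of x improve the prediction of y beyond y's own past, i.e., if adding the x lag to the regression reduces the residual sum of squares. The paper's method extends this to multiple eigen-time series per ROI via partial canonical correlation; the data below are synthetic.

```python
# Minimal single-lag Granger comparison: does adding x[t-1] to an
# autoregression of y reduce the residual sum of squares? Synthetic data
# in which y is driven by past x. CGA generalizes this to sets of
# eigen-time series per ROI.

import random

def solve(A, b):
    """Gauss-Jordan elimination for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * v for a, v in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def rss(X, y):
    """Residual sum of squares of an OLS fit y ~ X (normal equations)."""
    p = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(p)]
    beta = solve(XtX, Xty)
    return sum((yi - sum(b * xi for b, xi in zip(beta, r))) ** 2
               for r, yi in zip(X, y))

random.seed(0)
x = [random.gauss(0, 1) for _ in range(300)]
y = [0.0]
for t in range(1, 300):                      # y driven by past x plus noise
    y.append(0.4 * y[t - 1] + 0.8 * x[t - 1] + random.gauss(0, 0.1))

restricted = rss([[1.0, y[t - 1]] for t in range(1, 300)], y[1:])
full = rss([[1.0, y[t - 1], x[t - 1]] for t in range(1, 300)], y[1:])
print(full < restricted)                     # past x improves prediction of y
```

In practice the RSS reduction is converted to an F-statistic or, as in the paper, assessed against a bootstrap null distribution rather than eyeballed.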

Relevance: 30.00%

Abstract:

We assess the effects of chemical processing, ethylene oxide sterilization, and threading on the bone surface and mechanical properties of bovine undecalcified bone screws. In addition, we evaluate the possibility of manufacturing bone screws with predefined dimensions. Scanning electron microscopy images show that chemical processing and ethylene oxide treatment cause collagen fiber amalgamation on the bone surface. Processed screws hold higher ultimate loads under bending and torsion than the in natura bone group, with no change in pull-out strength between groups. Threading significantly reduces deformation and bone strength under torsion. Metrological data demonstrate the possibility of manufacturing bone screws with standardized dimensions.

Relevance: 30.00%

Abstract:

Two hazard risk assessment matrices for the ranking of occupational health risks are described. The qualitative matrix uses qualitative measures of probability and consequence to determine risk assessment codes for hazard-disease combinations. A walk-through survey of an underground metalliferous mine and concentrator is used to demonstrate how the qualitative matrix can be applied to determine priorities for the control of occupational health hazards. The semi-quantitative matrix uses attributable risk as a quantitative measure of probability and uses qualitative measures of consequence. A practical application of this matrix is the determination of occupational health priorities using existing epidemiological studies. Calculated attributable risks from epidemiological studies of hazard-disease combinations in mining and minerals processing are used as examples. These historic response data do not reflect the risks associated with current exposures. A method using current exposure data, known exposure-response relationships and the semi-quantitative matrix is proposed for more accurate and current risk rankings.
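The qualitative matrix described above maps each hazard-disease combination's probability and consequence ratings to a risk assessment code that sets control priorities. The sketch below shows the mechanism; the category labels, code thresholds, and example hazards are illustrative assumptions, not the matrix published in the paper.

```python
# Sketch of a qualitative hazard risk matrix: each hazard-disease pair
# gets a probability and a consequence rating, and the matrix maps the
# pair to a risk assessment code (1 = highest priority for control).
# Category labels, thresholds, and example hazards are illustrative.

PROBABILITY = ["rare", "unlikely", "possible", "likely", "almost certain"]
CONSEQUENCE = ["negligible", "minor", "moderate", "major", "catastrophic"]

def risk_code(probability, consequence):
    """Higher combined rank -> higher priority (smaller code number)."""
    score = PROBABILITY.index(probability) + CONSEQUENCE.index(consequence)
    return 1 if score >= 6 else 2 if score >= 4 else 3 if score >= 2 else 4

# Hypothetical walk-through findings: (hazard, disease, probability, consequence)
ranked = sorted(
    [("silica dust", "silicosis", "likely", "major"),
     ("noise", "hearing loss", "almost certain", "moderate"),
     ("vibration", "HAVS", "possible", "minor")],
    key=lambda h: risk_code(h[2], h[3]))
print([(h[0], risk_code(h[2], h[3])) for h in ranked])
```

The semi-quantitative variant in the paper replaces the ordinal probability rating with attributable risk from epidemiological studies, but the lookup-and-rank structure is the same.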