849 results for Surface-based analysis
Abstract:
We developed and validated a new method to create automated 3D parametric surface models of the lateral ventricles, designed for monitoring degenerative disease effects in clinical neuroscience studies and drug trials. First we used a set of parameterized surfaces to represent the ventricles in a manually labeled set of 9 subjects' MRIs (atlases). We fluidly registered each of these atlases and mesh models to a set of MRIs from 12 Alzheimer's disease (AD) patients and 14 matched healthy elderly subjects, and we averaged the resulting meshes for each of these images. Validation experiments on expert segmentations showed that (1) the Hausdorff labeling error rapidly decreased, and (2) the power to detect disease-related alterations monotonically improved as the number of atlases, N, was increased from 1 to 9. We then combined the segmentations with a radial mapping approach to localize ventricular shape differences in patients. In surface-based statistical maps, we detected more widespread and intense anatomical deficits as we increased the number of atlases, and we formulated a statistical stopping criterion to determine the optimal value of N. Anterior horn anomalies in Alzheimer's patients were only detected with the multi-atlas segmentation, which clearly outperformed the standard single-atlas approach.
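The Hausdorff labeling error used in the validation can be illustrated with a minimal sketch. This is a hypothetical point-set version of the metric (the paper's implementation operates on parametric surface meshes): the symmetric Hausdorff distance is the largest nearest-neighbour distance between two label surfaces, in either direction.

```python
import numpy as np

def hausdorff_distance(a, b):
    """Symmetric Hausdorff distance between two point sets of shape (n, 3) and (m, 3).

    Illustrative stand-in for a surface-labeling error metric: the worst-case
    distance from any point on one surface to the nearest point on the other.
    """
    # Pairwise Euclidean distances between every point in a and every point in b
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    # Directed distances: the farthest nearest-neighbour in each direction
    d_ab = d.min(axis=1).max()
    d_ba = d.min(axis=0).max()
    return max(d_ab, d_ba)

# Identical surfaces have zero error; a rigidly shifted copy has error equal to the shift
a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
b = a + np.array([0.0, 0.0, 2.0])
print(hausdorff_distance(a, a))  # 0.0
print(hausdorff_distance(a, b))  # 2.0
```

Averaging N independently registered atlas meshes reduces this error because registration mistakes made by individual atlases tend not to coincide, which is why the error decreases as N grows from 1 to 9.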
Abstract:
The aim of this paper is to assess the heritability of the cerebral cortex, based on measurements of grey matter (GM) thickness derived from structural MR images (sMRI). With data acquired from a large twin cohort (328 subjects), an automated method was used to estimate cortical thickness, and an EM-ICP surface registration algorithm was used to establish the correspondence of the cortex across the population. An ACE model was then employed to compute the heritability of cortical thickness. Heritable cortical thickness measures were found in various cortical regions, especially in the frontal and parietal lobes, including the bilateral postcentral gyri, superior occipital gyri, superior parietal gyri, precuneus, the orbital part of the right frontal gyrus, right medial superior frontal gyrus, right middle occipital gyrus, right paracentral lobule, left precentral gyrus, and left dorsolateral superior frontal gyrus.
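The twin-based logic behind the ACE model can be sketched with Falconer's classical approximation: heritability is roughly twice the difference between monozygotic and dizygotic twin correlations, h² = 2(r_MZ − r_DZ). This is a simplified stand-in for the full ACE variance-components model fitted in the study, and the thickness values below are synthetic, for illustration only.

```python
import numpy as np

def falconer_heritability(mz_pairs, dz_pairs):
    """Falconer's approximation h^2 = 2 * (r_MZ - r_DZ).

    A back-of-the-envelope stand-in for a full ACE variance-components model.
    Each row of mz_pairs / dz_pairs holds one twin pair's (twin1, twin2) values.
    """
    r_mz = np.corrcoef(mz_pairs[:, 0], mz_pairs[:, 1])[0, 1]
    r_dz = np.corrcoef(dz_pairs[:, 0], dz_pairs[:, 1])[0, 1]
    return 2.0 * (r_mz - r_dz)

# Synthetic cortical-thickness pairs (mm): MZ twins more alike than DZ twins
mz = np.array([[2.5, 2.5], [2.7, 2.7], [3.0, 3.0], [3.2, 3.3]])
dz = np.array([[2.5, 2.7], [2.7, 2.5], [3.0, 3.0], [3.2, 3.1]])
print(falconer_heritability(mz, dz))
```

The ACE model generalizes this by decomposing trait variance into additive genetic (A), common environment (C), and unique environment (E) components at every vertex of the registered cortical surface.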
Abstract:
A rapid and sensitive immuno-based screening method was developed to detect domoic acid (DA) present in extracts of shellfish species using a surface plasmon resonance-based optical biosensor. A rabbit polyclonal antibody raised against DA was mixed with standard or sample extracts and allowed to interact with DA immobilized onto a sensor chip surface. The characterization of the antibody strongly suggested high cross-reactivity with DA and important isomers of the toxin. The binding of this antibody to the sensor chip surface was inhibited in the presence of DA in either standard solutions or sample extracts. The DA chip surface proved to be highly stable, achieving approximately 800 analyses per chip without any loss of surface activity. A single analytical cycle (sample injection, chip regeneration, and system wash) took 10 min to complete. Sample analysis (scallops, mussels, cockles, oysters) was achieved by simple extraction with methanol. These extracts were then filtered and diluted before analysis. Detection limits in the ng/g range were achieved by the assay; however, the assay parameters chosen allowed the test to be performed most accurately at the European Union's official action limit for DA of 20 μg/g. At this concentration, intra- and interassay variations were measured for a range of shellfish species and ranged from 4.5 to 7.4% and 2.3 to 9.7%, respectively.
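Inhibition-format assays of this kind are typically quantified against a four-parameter logistic (4PL) standard curve: response falls as free toxin rises, and unknown concentrations are back-calculated by inverting the fitted curve. The sketch below uses hypothetical parameters (not the paper's calibration) with the inflection point placed at the 20 μg/g action limit, where such assays are tuned to be most accurate.

```python
import numpy as np

def four_pl(conc, a, b, c, d):
    """Four-parameter logistic response curve for an inhibition assay.

    d = response at zero analyte (maximal antibody binding), a = response at
    infinite analyte, c = inflection point (IC50), b = slope factor.
    Parameters here are illustrative assumptions, not fitted values.
    """
    return a + (d - a) / (1.0 + (conc / c) ** b)

def invert_four_pl(resp, a, b, c, d):
    """Back-calculate analyte concentration from a measured response."""
    return c * ((d - a) / (resp - a) - 1.0) ** (1.0 / b)

# Hypothetical calibration with IC50 at 20 (e.g. ug/g, the EU action limit)
params = dict(a=50.0, b=1.2, c=20.0, d=1200.0)
r = four_pl(20.0, **params)          # response at the action limit
print(invert_four_pl(r, **params))   # recovers 20.0
```

Because the curve is steepest near c, concentration estimates are most precise around the action limit, which matches the reported choice of assay parameters.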
Abstract:
Biosensors are used for a large number of applications within biotechnology, including the pharmaceutical industry and life sciences. Since the production of Biacore surface-plasmon resonance instruments in the early 1990s, there has been steadily growing use of this technology for the detection of food contaminants (e.g., veterinary drugs, mycotoxins, marine toxins, food dyes and processing contaminants). Other biosensing technologies (e.g., electrochemical and piezoelectric) have also been employed for the analysis of small-molecule contaminants. This review concentrates on recent advances made in the detection and quantification of antimicrobial compounds with different types of biosensors and on the emergence of multiplexing, which is highly desirable as it increases sample throughput at lower cost and in less time.
Abstract:
Timely detection of sudden changes in dynamics that adversely affect the performance of systems and the quality of products has great scientific relevance. This work focuses on the effective detection of dynamical changes in real-time signals from mechanical and biological systems using a fast and robust technique, permutation entropy (PE). The results are used to detect chatter onset in machine turning and to identify vocal disorders from speech signals.

Permutation entropy is a nonlinear complexity measure that can efficiently distinguish regular from complex behaviour in a signal and reveal changes in the dynamics of a process through sudden changes in its value. Here we propose the use of PE to detect dynamical changes in two nonlinear processes: turning, a mechanical system, and speech, a biological system.

The effectiveness of PE in detecting changes in the dynamics of the turning process is studied using time series generated from samples of audio and current signals. Experiments are carried out on a lathe for both a sudden increase and a continuous increase in depth of cut on mild steel workpieces, keeping the speed and feed rate constant. The results are applied to detect chatter onset in machining and are verified using frequency spectra of the signals and a nonlinear measure, the normalized coarse-grained information rate (NCIR).

PE analysis is also carried out to investigate the variation in surface texture caused by chatter on the machined workpiece. A statistical parameter from the optical grey-level intensity histogram of the laser speckle pattern, recorded using a charge-coupled device (CCD) camera, is used to generate the time series required for PE analysis. A standard optical roughness parameter is used to confirm the results.

The application of PE to identifying vocal disorders is studied using speech signals recorded with a microphone. Analysis is carried out on speech signals from subjects with different pathological conditions and from normal subjects, and the results are used to identify vocal disorders. The standard linear technique of FFT is used to substantiate the results.

The results of PE analysis in all three cases clearly indicate that this complexity measure is sensitive to changes in the regularity of a signal and can therefore be used to detect dynamical changes in real-world systems. This work establishes the application of the simple, inexpensive and fast PE algorithm for the benefit of advanced manufacturing processes as well as clinical diagnosis of vocal disorders.
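The PE computation itself is simple, which is what makes it attractive for on-line monitoring. A minimal sketch of the standard Bandt-Pompe procedure (assuming the usual ordinal-pattern definition; the paper's exact embedding parameters are not specified here): slide a window over the signal, record the rank ordering of each window, and take the normalized Shannon entropy of the pattern frequencies.

```python
from math import factorial, log

import numpy as np

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D signal (Bandt-Pompe method).

    Counts the relative frequency of ordinal patterns of length `order`
    (sampled `delay` steps apart) and returns the Shannon entropy of the
    pattern distribution, normalized to [0, 1].
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = x[i:i + order * delay:delay]
        pattern = tuple(np.argsort(window))  # rank ordering of the window
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values()), dtype=float) / n
    h = -np.sum(probs * np.log(probs))
    return h / log(factorial(order))  # normalize by the maximum entropy

rng = np.random.default_rng(0)
print(permutation_entropy(np.arange(500)))            # regular ramp: 0.0
print(permutation_entropy(rng.standard_normal(500)))  # white noise: close to 1
```

A regular signal concentrates on few ordinal patterns and scores near 0, while an irregular one spreads over all patterns and scores near 1; a sudden rise in PE therefore flags the transition to chatter or pathological voice dynamics.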
Abstract:
We have developed a model of the local field potential (LFP) based on the conservation of charge, the independence principle of ionic flows and the classical Hodgkin–Huxley (HH) type intracellular model of synaptic activity. Insights were gained through the simulation of the HH intracellular model on the nonlinear relationship between the balance of synaptic conductances and that of post-synaptic currents. The latter is dependent not only on the former, but also on the temporal lag between the excitatory and inhibitory conductances, as well as the strength of the afferent signal. The proposed LFP model provides a method for decomposing the LFP recordings near the soma of layer IV pyramidal neurons in the barrel cortex of anaesthetised rats into two highly correlated components with opposite polarity. The temporal dynamics and the proportional balance of the two components are comparable to the excitatory and inhibitory post-synaptic currents computed from the HH model. This suggests that the two components of the LFP reflect the underlying excitatory and inhibitory post-synaptic currents of the local neural population. We further used the model to decompose a sequence of evoked LFP responses under repetitive electrical stimulation (5 Hz) of the whisker pad. We found that as neural responses adapted, the excitatory and inhibitory components also adapted proportionately, while the temporal lag between the onsets of the two components increased during frequency adaptation. Our results demonstrate that the balance between neural excitation and inhibition can be investigated using extracellular recordings. Extension of the model to incorporate multiple compartments should allow more quantitative interpretations of surface electroencephalography (EEG) recordings into components reflecting the excitatory, inhibitory and passive ionic current flows generated by local neural populations.
Abstract:
About 90% of the anthropogenic increase in heat stored in the climate system is found in the oceans. It is therefore important to understand the details of ocean heat uptake. Here we present a detailed, process-based analysis of ocean heat uptake (OHU) processes in HiGEM1.2, an atmosphere-ocean general circulation model (AOGCM) with an eddy-permitting ocean component of 1/3 degree resolution. Similarly to various other models, HiGEM1.2 shows that the global heat budget is dominated by a downward advection of heat compensated by upward isopycnal diffusion. Only in the upper tropical ocean do we find the classical balance between downward diapycnal diffusion and upward advection of heat. The upward isopycnal diffusion of heat is located mostly in the Southern Ocean, which thus dominates the global heat budget. We compare the responses to a 4xCO2 forcing and an enhancement of the wind stress forcing in the Southern Ocean. This highlights the importance of regional processes for the global ocean heat uptake. These are mainly surface fluxes and convection in the high latitudes, and advection in the Southern Ocean mid-latitudes. Changes in diffusion are less important. In line with the CMIP5 models, HiGEM1.2 shows a band of strong OHU in the mid-latitude Southern Ocean in the 4xCO2 run, which is mostly advective. By contrast, in the high-latitude Southern Ocean regions it is the suppression of convection that leads to OHU. In the enhanced wind stress run, convection is strengthened at high Southern latitudes, leading to heat loss, while the magnitude of the OHU in the Southern mid-latitudes is very similar to the 4xCO2 results. Remarkably, there is only very small global OHU in the enhanced wind stress run; the wind stress forcing merely leads to a redistribution of heat. We relate the ocean changes at high southern latitudes to the effect of climate change on the Antarctic Circumpolar Current (ACC). It weakens in the 4xCO2 run and strengthens in the wind stress run.
The weakening is due to a narrowing of the ACC, caused by an expansion of the Weddell Gyre, and a flattening of the isopycnals, which are explained by a combination of the wind stress forcing and increased precipitation.
Abstract:
A simple protein-DNA interaction analysis has been developed using a high-affinity/high-specificity zinc finger protein. In essence, purified protein samples are immobilized directly onto the surface of microplate wells, and fluorescently labeled DNA is added in solution. After incubation and washing, bound DNA is detected in a standard microplate reader. The minimum sensitivity of the assay is approximately 0.2 nM DNA. Since the detection of bound DNA is noninvasive and the protein-DNA interaction is not disrupted during detection, iterative readings may be taken from the same well, after successive alterations in interaction conditions, if required. In this respect, the assay may therefore be considered real time and permits appropriate interaction conditions to be determined quantitatively. The assay format is ideally suited to investigate the interactions of purified unlabeled DNA binding proteins in a high-throughput format.
Abstract:
A simple protein-DNA interaction analysis has been developed using both a high-affinity/high-specificity zinc finger protein and a low-specificity zinc finger protein with nonspecific DNA binding capability. The latter protein is designed to mimic background binding by proteins generated in randomized or shuffled gene libraries. In essence, DNA is immobilized onto the surface of microplate wells via streptavidin capture, and green fluorescent protein (GFP)-labeled protein is added in solution as part of a crude cell lysate or protein mixture. After incubation and washing, bound protein is detected in a standard microplate reader. The minimum sensitivity of the assay is approximately 0.4 nM protein. The assay format is ideally suited to investigate the interactions of DNA binding proteins from within crude cell extracts and/or mixtures of proteins that may be encountered in protein libraries generated by codon randomization or gene shuffling.
Abstract:
Enterprise Application Integration (EAI) is a challenging area that is attracting growing attention from the software industry and the research community. A landscape of languages and techniques for EAI has emerged and is continuously being enriched with new proposals from different software vendors and coalitions. However, little or no effort has been dedicated to systematically evaluate and compare these languages and techniques. The work reported in this paper is a first step in this direction. It presents an in-depth analysis of a language, namely the Business Modeling Language, specifically developed for EAI. The framework used for this analysis is based on a number of workflow and communication patterns. This framework provides a basis for evaluating the advantages and drawbacks of EAI languages with respect to recurrent problems and situations.
Abstract:
Realistic estimates of short- and long-term (strategic) budgets for the maintenance and rehabilitation of road assets should consider the stochastic characteristics of asset conditions across road networks, so that the overall variability of road asset condition data is taken into account. Probability theory has been used to assess life-cycle costs for bridge infrastructure by Kong and Frangopol (2003), Zayed et al. (2002), Liu and Frangopol (2004), Noortwijk and Frangopol (2004), and Novick (1993). Salem et al. (2003) cited the importance of collecting and analysing existing data on total costs for all life-cycle phases of existing infrastructure, including bridges and roads, and of using realistic methods for calculating the probable useful life of these infrastructures. Zayed et al. (2002) reported conflicting results in life-cycle cost analysis using deterministic and stochastic methods. Frangopol et al. (2001) suggested that additional research was required to develop better life-cycle models and tools to quantify the risks and benefits associated with infrastructure. It is evident from the review of the literature that there is very limited information on methodologies that use the stochastic characteristics of asset condition data to assess budgets/costs for road maintenance and rehabilitation (Abaza 2002; Salem et al. 2003; Zhao et al. 2004). Given this gap in the research literature, this report describes and summarises the methodologies presented in each publication and suggests a methodology for the current research project, funded under the Cooperative Research Centre for Construction Innovation (CRC CI) project no. 2003-029-C.
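The stochastic budgeting idea can be sketched with a small Monte Carlo life-cycle cost model. Everything below is a hypothetical illustration (the deterioration distribution, trigger level, costs and discount rate are all assumed values, not from any of the cited studies): condition degrades by a random amount each year, rehabilitation is triggered below a threshold, and the output is a cost distribution from which budget percentiles can be read instead of a single deterministic figure.

```python
import numpy as np

def simulate_lcc(n_sims=10_000, horizon=30, seed=0):
    """Monte Carlo life-cycle cost for one road segment (hypothetical model).

    Condition (index 0-100) deteriorates by a gamma-distributed amount each
    year; when it falls below a trigger, rehabilitation restores it to 100
    at a fixed cost. Returns the distribution of discounted total cost.
    All parameter values below are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    rate = 0.04           # annual discount rate (assumed)
    rehab_cost = 100.0    # cost per rehabilitation, arbitrary units (assumed)
    trigger = 40.0        # condition index that triggers rehabilitation (assumed)
    costs = np.zeros(n_sims)
    condition = np.full(n_sims, 100.0)
    for year in range(1, horizon + 1):
        # Stochastic annual wear: gamma(2, 2), mean 4 condition points per year
        condition -= rng.gamma(shape=2.0, scale=2.0, size=n_sims)
        needs_rehab = condition < trigger
        costs += needs_rehab * rehab_cost / (1.0 + rate) ** year
        condition[needs_rehab] = 100.0
    return costs

costs = simulate_lcc()
print(np.percentile(costs, [50, 95]))  # median vs conservative budget level
```

Reading the budget off the 95th percentile rather than the mean is one way the stochastic treatment changes the answer relative to a deterministic calculation, which is the kind of discrepancy Zayed et al. (2002) reported.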
Abstract:
Data mining techniques extract repeated and useful patterns from a large data set that in turn are used to predict the outcome of future events. The main purpose of the research presented in this paper is to investigate data mining strategies and develop an efficient framework for multi-attribute project information analysis to predict the performance of construction projects. The research team first reviewed existing data mining algorithms, applied them to systematically analyze a large project data set collected through a survey, and finally proposed a data-mining-based decision support framework for project performance prediction. To evaluate the potential of the framework, a case study was conducted using data collected from 139 capital projects, analyzing the relationship between the use of information technology and project cost performance. The study results showed that the proposed framework has the potential to promote fast, easy-to-use, interpretable, and accurate project data analysis.
Abstract:
The aim of this study was to explore two of the mechanisms by which transformational leaders have a positive influence on followers. It examined the mediating role of followers' identification with the leader and with the group in the associations between different transformational leader behaviours and follower job satisfaction and supervisor-rated job performance. One hundred and seventy-nine healthcare employees and 44 supervisors participated in the study. Multilevel structural equation modelling partially supported the predicted model. Identification with the leader significantly mediated the positive associations of supportive leadership, intellectual stimulation, and personal recognition with job satisfaction and job performance. Leader identification also mediated the relationships between supportive leadership, intellectual stimulation, personal recognition, and group identification. However, group identification did not mediate the associations of vision leadership and inspirational communication with job satisfaction and job performance. The results highlight the role of individualized forms of leadership and leader identification in enhancing follower outcomes.