957 results for capabilities


Relevance: 10.00%

Publisher:

Abstract:

Neuronal oscillations are an important aspect of EEG recordings. These oscillations are thought to be involved in several cognitive mechanisms; for instance, oscillatory activity is considered a key component of the top-down control of perception. However, measuring this activity and its influence requires precise extraction of the frequency components, and this processing is not straightforward. In particular, difficulties in extracting oscillations arise from their time-varying characteristics; moreover, when phase information is needed, it is of the utmost importance to extract narrow-band signals. This paper presents a novel method using adaptive filters for tracking and extracting these time-varying oscillations. The scheme is designed to maximize the oscillatory behavior at the output of the adaptive filter, and it is therefore capable of tracking an oscillation and describing its temporal evolution even during low-amplitude time segments. Moreover, the method can be extended to track several oscillations simultaneously and to use multiple signals. These two extensions are particularly relevant for EEG data processing, where oscillations are active at the same time in different frequency bands and signals are recorded with multiple sensors. The presented tracking scheme is first tested on synthetic signals to highlight its capabilities, and then applied to data recorded during a visual shape discrimination experiment to assess its usefulness for EEG processing and for detecting functionally relevant changes. This method is an interesting additional processing step that provides information complementary to classical time-frequency analyses and improves the detection and analysis of cross-frequency couplings.
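
As a rough illustration of how such adaptive tracking can work, here is a minimal sketch only: it assumes a single-pole complex resonator and a smoothed phase-increment frequency update, not the authors' exact algorithm, and all names and parameter values are hypothetical.

```python
import numpy as np

def track_oscillation(x, fs, f_init, beta=0.05, mu=0.5):
    """Track one time-varying oscillation in a real-valued signal x.

    A complex single-pole resonator is kept centred on the current
    frequency estimate; the estimate is then updated from the phase
    advance of the narrow-band output between consecutive samples.
    """
    n = len(x)
    f_track = np.empty(n)              # instantaneous frequency estimates (Hz)
    y = np.zeros(n, dtype=complex)     # narrow-band (analytic-like) output
    f_hat = float(f_init)
    y_prev = 0j
    for k in range(n):
        w = 2.0 * np.pi * f_hat / fs
        # narrow band-pass around f_hat: bandwidth is set by beta
        y[k] = beta * x[k] + (1.0 - beta) * np.exp(1j * w) * y_prev
        if abs(y_prev) > 0.0:
            # frequency implied by the phase increment of the output
            dphi = np.angle(y[k] * np.conj(y_prev))
            f_inst = dphi * fs / (2.0 * np.pi)
            # smooth (adaptive) update of the tracked frequency
            f_hat = (1.0 - mu) * f_hat + mu * f_inst
        f_track[k] = f_hat
        y_prev = y[k]
    return y, f_track
```

In a scheme of this kind, the resonator bandwidth (beta) trades noise rejection against tracking speed, which is the same tension between narrow-band extraction and time-varying behavior that the abstract describes; tracking several oscillations would amount to running several such resonators in parallel on one or several signals.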

Relevance: 10.00%

Publisher:

Abstract:

Following a high wind event on January 24, 2006, at least five people claimed to have seen or felt the superstructure of the Saylorville Reservoir Bridge in central Iowa moving both vertically and laterally. Since that time, the Iowa Department of Transportation (DOT) has contracted with the Bridge Engineering Center at Iowa State University to design and install a monitoring system capable of providing notification of subsequent high wind events. In subsequent years, a similar system was installed on the Red Rock Reservoir Bridge to provide the same wind monitoring capabilities and notifications to the Iowa DOT. The objectives of the system development and implementation are to notify personnel when the wind speed reaches a predetermined threshold so that the bridge can be closed for the safety of the public, to correlate structural response with wind conditions, and to gather historical wind data at these structures for future assessments. This report describes the two monitoring systems, including their components, upgrades, functionality, and limitations, and presents results from one year of wind data collection at both bridges.

Relevance: 10.00%

Publisher:

Abstract:

Background: Transposable elements (TEs) constitute a substantial fraction of all eukaryotic genomes. They induce an important proportion of deleterious mutations by inserting into genes or gene regulatory regions. However, their mutational capabilities are not always adverse and can contribute to the genetic diversity and evolution of organisms. Knowledge of their distribution and activity in the genomes of populations under different environmental and demographic regimes is important for understanding their role in species evolution. In this work we study the chromosomal distribution of two TEs, gypsy and bilbo, in original and colonizing populations of Drosophila subobscura to reveal the putative effect of colonization on their insertion profile. Results: The chromosomal frequency distribution of the two TEs differs between one original and three colonizing populations of D. subobscura. Whereas the original population shows a low insertion frequency at most TE sites, colonizing populations have a mixture of high-frequency (≥ 10%) and low-frequency insertion sites for both TEs. Most highly occupied sites are coincident among colonizing populations, and some of them are correlated with chromosomal arrangements. Comparisons of TE copy number between the X chromosome and the autosomes show that gypsy occupancy seems to be controlled by negative selection, whereas that of bilbo does not. Conclusion: These results are consistent with TEs in colonizing populations of Drosophila subobscura having been subjected to a founder effect followed by genetic drift as a consequence of colonization. This would explain the high insertion frequencies of bilbo and gypsy at coincident sites in colonizing populations. High-occupancy sites would represent insertion events prior to colonization, whereas low-frequency sites would be insertions that occurred after colonization and/or copies from the original population whose frequency is decreasing in the colonizing populations. This work is a pioneering attempt to explain the chromosomal distribution of TEs in a colonizing species with high inversion polymorphism and to reveal the putative effect of arrangements on TE insertion profiles. In general, no associations between arrangements and TEs were found, except in a few cases where the association is very strong. Alternatively, founder and drift effects seem to play a leading role in the genomic distribution of TEs in colonizing populations.

Relevance: 10.00%

Publisher:

Abstract:

Background: Freshwater planarians are an attractive model for regeneration and stem cell research and have become a promising tool in the field of regenerative medicine. With the availability of a sequenced planarian genome, the recent application of modern genetic and high-throughput tools has resulted in revitalized interest in these animals, long known for their amazing regenerative capabilities, which enable them to regrow even a new head after decapitation. However, a detailed description of the planarian transcriptome is essential for future investigation into regenerative processes using planarians as a model system. Results: In order to complement and improve existing gene annotations, we used a 454 pyrosequencing approach to analyze the transcriptome of the planarian species Schmidtea mediterranea. Altogether, 598,435 454-sequencing reads, with an average length of 327 bp, were assembled together with the ~10,000 sequences of the S. mediterranea UniGene set using different similarity cutoffs. The assembly was then mapped onto the current genome data. Remarkably, our Smed454 dataset contains more than 3 million novel transcribed nucleotides sequenced for the first time. A descriptive analysis of planarian splice sites was conducted on those Smed454 contigs that mapped unambiguously to the current genome assembly. Sequence analysis allowed us to identify genes encoding putative proteins with defined structural properties, such as transmembrane domains. Moreover, we annotated the Smed454 dataset using Gene Ontology, and identified putative homologues of several gene families that may play a key role during regeneration, such as neurotransmitter and hormone receptors, homeobox-containing genes, and genes related to eye function. Conclusions: We report the first planarian transcript dataset, Smed454, as an open resource tool that can be accessed via a web interface. Smed454 contains significant novel sequence information about most expressed genes of S. mediterranea. Analysis of the annotated data promises to contribute to the identification of gene families poorly characterized at a functional level. The Smed454 transcriptome data will assist in the molecular characterization of S. mediterranea as a model organism, which will be useful to a broad scientific community.

Relevance: 10.00%

Publisher:

Abstract:

BACKGROUND: Analysis of DNA sequence polymorphisms can provide valuable information on the evolutionary forces shaping nucleotide variation and gives insight into the functional significance of genomic regions. Ongoing genome projects will radically improve our capabilities to detect specific genomic regions shaped by natural selection. Currently available methods and software, however, are unsatisfactory for such genome-wide analyses. RESULTS: We have developed methods for the analysis of DNA sequence polymorphisms at the genome-wide scale. These methods, which have been tested on coalescent-simulated and actual data files from mouse and human, have been implemented in the VariScan software package version 2.0. Additionally, we have incorporated a graphical user interface. The main features of this software are: i) exhaustive population-genetic analyses, including those based on coalescent theory; ii) analyses adapted to the shallow data generated by high-throughput genome projects; iii) use of genome annotations to conduct comprehensive analyses separately for different functional regions; iv) identification of relevant genomic regions by sliding-window and wavelet-multiresolution approaches; v) visualization of the results integrated with current genome annotations in commonly available genome browsers. CONCLUSION: VariScan is a powerful and flexible software suite for the analysis of DNA polymorphisms. The current version implements new algorithms, methods, and capabilities, providing an important tool for an exhaustive exploratory analysis of genome-wide DNA polymorphism data.
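
To make the sliding-window idea concrete, the sketch below computes per-window nucleotide diversity over a small alignment. It is deliberately naive and is not VariScan's implementation; the function name, window size, and step are illustrative choices.

```python
from itertools import combinations

def sliding_window_pi(alignment, window=100, step=25):
    """Nucleotide diversity (pi) per site in sliding windows.

    `alignment` is a list of equal-length sequences; gap characters
    ('-') are excluded from pairwise comparisons.
    """
    length = len(alignment[0])
    values = []
    for start in range(0, length - window + 1, step):
        diffs, sites = 0, 0
        for seq_a, seq_b in combinations(alignment, 2):
            for a, b in zip(seq_a[start:start + window], seq_b[start:start + window]):
                if a != '-' and b != '-':
                    sites += 1
                    if a != b:
                        diffs += 1
        values.append(diffs / sites if sites else float('nan'))
    return values

# Example: pi over a toy 10-bp alignment in a single window
print(sliding_window_pi(["ACGTACGTAC", "ACGTACGAAC", "ACGAACGTAC"], window=10, step=10))
```

Windows whose diversity deviates strongly from the genome-wide background are the kind of regions that sliding-window or wavelet approaches flag as candidates for selection.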

Relevance: 10.00%

Publisher:

Abstract:

In this work, we present the cultural evolution that has made it possible to overcome many problems derived from the limitations of the human body. These limitations have been addressed by a "cyborization" process that began in early anthropogenesis. Originally, it was envisioned as a way to deal with diseases, accidents, or body malfunctions. Nowadays, augmentations improve common human capabilities; one of the most notable is the increase of brain efficiency through connections with a computer. A basic social question also addressed is which people will, and should, have access to these augmentations. Advanced humanoid robots (with a human external aspect, artificial intelligence, and even emotions) already exist, and consequently a number of questions arise. For instance, will robots be considered living organisms? Could they be considered persons? Will we confer human status on robots? These questions are discussed. Our conclusions are that advanced humanoid robots display some behaviors that may be considered life-like, yet different from the life associated with living organisms, and that to some extent they could be considered person-like, but not human.

Relevance: 10.00%

Publisher:

Abstract:

Calcium magnesium acetate (CMA) has been identified by Bjorksten Research Laboratories as an environmentally harmless alternative to sodium or calcium chloride for deicing highways. Their study found CMA to be noncorrosive to steel, aluminum, and zinc, with little or no anticipated environmental impact. When used, it degrades into elements found in abundance in nature. Its deicing capabilities were found to be similar to those of sodium chloride. The neutralized CMA they produced did cause scaling of PC concrete, but they did not expect mildly alkaline CMA to have this effect. In the initial investigation of CMA at the Iowa DOT laboratory, it was found that CMA produced from hydrated lime and acetic acid was a light, fluffy material. It was recognized that a deicer in this form would be difficult to distribute effectively on highways without considerable wind loss. A process was developed to produce CMA in the presence of sand to increase particle weight. In this report the product of this process, which consists of sand particles coated with CMA, is referred to as "CMA deicer". The mixture of salts, calcium magnesium acetate, is referred to as "CMA". The major problems with CMA for deicing are: (1) it is not commercially available, (2) it is expensive with present production methods, and (3) very little is known about how it performs on highways under actual deicing conditions. In view of the potential benefits this material offers, it is highly desirable to find solutions or answers to these problems. This study provides information to advance that effort.

Relevance: 10.00%

Publisher:

Abstract:

The primary objective of this research was to demonstrate the benefits of NDT technologies for effectively detecting and characterizing deterioration in bridge decks. In particular, the objectives were to demonstrate the capabilities of ground-penetrating radar (GPR) and impact echo (IE), and to evaluate and describe the condition of nine bridge decks proposed by the Iowa DOT. The first part of the report provides a detailed review of the most important deterioration processes in concrete decks, followed by a discussion of the five NDT technologies utilized in this project. In addition to the GPR and IE methods, three other technologies were utilized: half-cell (HC) potential, electrical resistivity (ER), and the ultrasonic surface waves (USW) method. The review includes a description of the principles of operation, field implementation, data analysis, and interpretation; information regarding their advantages and limitations in bridge deck evaluation and condition monitoring is also provided. The second part of the report provides descriptions and bridge deck evaluation results for the nine bridges. The results of the NDT surveys are described in terms of condition assessment maps and are compared with observations obtained from recovered cores or subsequent bridge deck rehabilitation. Results from this study confirm that the technologies used can provide detailed and accurate information about a certain type of deterioration, electrochemical environment, or defect. However, they also show that, at this stage, a comprehensive condition assessment of bridge decks can be achieved only through the complementary use of multiple technologies. Recommendations are provided for the optimum implementation of NDT technologies for the condition assessment and monitoring of bridge decks.

Relevance: 10.00%

Publisher:

Abstract:

Organisations in Multi-Agent Systems (MAS) have proven to be successful in regulating agent societies. Nevertheless, changes in agents' behaviour or in the dynamics of the environment may lead to poor fulfilment of the system's purposes, so the entire organisation needs to be adapted. In this paper we focus on endowing the organisation with adaptation capabilities, instead of expecting agents to be capable of adapting the organisation by themselves. We regard this organisational adaptation as an assisting service provided by what we call the Assistance Layer. Our generic Two Level Assisted MAS Architecture (2-LAMA) incorporates such a layer. We empirically evaluate this approach by means of an agent-based simulator we have developed for the P2P sharing network domain. This simulator implements the 2-LAMA architecture and supports comparisons between different adaptation methods, as well as with the standard BitTorrent protocol. In particular, we present two alternatives for performing norm adaptation and one method for adapting agents' relationships. The results show improved performance and demonstrate that the cost of introducing an additional layer in charge of the system's adaptation is lower than its benefits.

Relevance: 10.00%

Publisher:

Abstract:

After birth, the body shifts from glucose as its primary energy substrate to milk-derived fats, with sugars from lactose taking a secondary place. At weaning, glucose recovers its primacy and the role of dietary fat decreases. In spite of the temporary human adaptation to a high-fat (and sugar and protein) diet during lactation, the ability to thrive on this type of diet is lost irreversibly after weaning. We cannot revert to the metabolic setting of the lactating period because of the different proportions of brain/muscle metabolism in the total energy budget, lower thermogenesis needs and capabilities, and the absence of significant growth in adults. A key reason for the change was the limited availability of foods with high energy content at weaning and during the whole adult life of our ancestors, whose physiological adaptations remain practically unchanged in our present-day bodies. Humans have evolved to survive on relatively poor diets interspersed with bouts of scarcity and abundance. Today, diets in many societies are largely made up of choice foods, responding to our deeply ingrained desire for fats, protein, sugars, salt, etc. Consequently, our diets are not well adjusted to our physiological needs/adaptations but mainly to our tastes (another adaptation to periodic scarcity), and are thus rich in energy, roughly comparable to milk. However, most adult humans cannot process the food ingested in excess because our cortex-derived craving overrides the mechanisms controlling appetite. This occurs not because we lack the biochemical mechanisms to use this energy, but because we are unprepared for excess and wholly adapted to survive scarcity. The thrifty mechanisms compound the effects of excess nutrients and damage the control of energy metabolism, leading to a pathological state. As a consequence, an overflow of energy is generated and the disease of plenty develops.

Relevance: 10.00%

Publisher:

Abstract:

This paper introduces a nonlinear measure of dependence between random variables in the context of remote sensing data analysis. The Hilbert-Schmidt Independence Criterion (HSIC) is a kernel method for evaluating statistical dependence. HSIC is based on computing the Hilbert-Schmidt norm of the cross-covariance operator of mapped samples in the corresponding Hilbert spaces. The HSIC empirical estimator is very easy to compute and has good theoretical and practical properties. We exploit the capabilities of HSIC to explain nonlinear dependences in two remote sensing problems: temperature estimation and chlorophyll concentration prediction from spectra. Results show that, when the relationship between random variables is nonlinear or when few data are available, the HSIC criterion outperforms other standard methods, such as the linear correlation or mutual information.
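
For reference, the biased empirical HSIC estimator reduces to a trace of centred kernel matrices, HSIC = (1/n^2) tr(K H L H). The sketch below is a minimal illustration with Gaussian kernels; the function names and bandwidths are chosen here for the example and are not taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, sigma):
    """Pairwise Gaussian (RBF) kernel matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(X, Y, sigma_x=1.0, sigma_y=1.0):
    """Biased empirical HSIC: (1/n^2) * trace(K H L H)."""
    n = X.shape[0]
    K = gaussian_kernel(X, sigma_x)
    L = gaussian_kernel(Y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    return float(np.trace(K @ H @ L @ H)) / n ** 2

# Toy usage: a purely nonlinear (quadratic) dependence, invisible to linear
# correlation, still yields a clearly non-zero HSIC value.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 1))
y = x ** 2 + 0.1 * rng.normal(size=(200, 1))
print(hsic(x, y))
```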

Relevance: 10.00%

Publisher:

Abstract:

Background: Network reconstructions at the cell level are a major development in Systems Biology. However, we are far from fully exploiting their potential. Often, the growing complexity of the systems under study outstrips experimental capabilities, or increasingly sophisticated protocols are underutilized, serving merely to refine the confidence levels of already established interactions. For metabolic networks, the currently employed confidence scoring system rates reactions discretely according to nested categories of experimental evidence or model-based likelihood. Results: Here we propose a complementary network-based scoring system that exploits the statistical regularities of a metabolic network as a bipartite graph. As an illustration, we apply it to the metabolism of Escherichia coli. The model is adjusted to the observations to derive connection probabilities between individual metabolite-reaction pairs and, after validation, to assess the reliability of each reaction in probabilistic terms. This network-based scoring system uncovers very specific reactions that could be functionally or evolutionarily important, identifies prominent experimental targets, and enables further confirmation of modeling results. Conclusions: We foresee a wide range of potential applications at different sub-cellular or supra-cellular levels of biological interaction, given the natural bipartivity of many biological networks.
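
The flavour of such a network-based score can be conveyed with a toy degree-based null model for the bipartite metabolite-reaction graph. This is an assumed simplification for illustration only, not the probabilistic model fitted in the paper, and the function name is hypothetical.

```python
import numpy as np

def connection_probabilities(B):
    """Toy connection probabilities for a bipartite incidence matrix B.

    B[i, j] = 1 if metabolite i participates in reaction j. Under a
    simple degree-based null model, the expected chance of a link is
    proportional to the product of the two node degrees.
    """
    k = B.sum(axis=1)          # metabolite degrees
    d = B.sum(axis=0)          # reaction degrees
    m = B.sum()                # total number of links
    return np.clip(np.outer(k, d) / m, 0.0, 1.0)

# Links observed despite a low expected probability stand out as candidates
# for functionally specific reactions or prominent experimental targets.
```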

Relevance: 10.00%

Publisher:

Abstract:

Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance, and in the last decades computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested, and ranked, providing a comprehensive review of the characteristics of the available software. The number of drugs handled varies widely, and 8 programs offer the user the ability to add their own drug models. 10 programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender, and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g., in terms of storage or report generation) or less user-friendly. Conclusion: Whereas two integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be considered with respect to the individual needs of hospitals or clinicians. Interest in computer tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity, and report generation.
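
To make the Bayesian a posteriori adjustment concrete, the sketch below estimates an individual clearance by maximizing a posterior that combines a log-normal population prior with observed concentrations under a one-compartment IV bolus model. It uses SciPy, the model and error structure are simplified assumptions, and it does not reproduce the algorithm of any of the benchmarked packages; all names and values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def map_clearance(c_obs, t_obs, dose, v, cl_pop, omega_cl, sigma):
    """MAP (a posteriori) estimate of an individual clearance CL.

    One-compartment IV bolus model C(t) = dose/V * exp(-CL/V * t), with
    additive residual error (sigma) and a log-normal prior on CL
    (population mean cl_pop, inter-individual variability omega_cl).
    """
    c_obs = np.asarray(c_obs, dtype=float)
    t_obs = np.asarray(t_obs, dtype=float)

    def neg_log_posterior(log_cl):
        cl = np.exp(log_cl)
        c_pred = dose / v * np.exp(-cl / v * t_obs)
        loglik = np.sum((c_obs - c_pred) ** 2) / (2.0 * sigma ** 2)
        logprior = (log_cl - np.log(cl_pop)) ** 2 / (2.0 * omega_cl ** 2)
        return loglik + logprior

    res = minimize_scalar(neg_log_posterior,
                          bounds=(np.log(cl_pop) - 3.0, np.log(cl_pop) + 3.0),
                          method="bounded")
    return float(np.exp(res.x))

# Example: one measured level pulls the estimate away from the population mean
print(map_clearance(c_obs=[4.2], t_obs=[12.0], dose=1000.0, v=50.0,
                    cl_pop=5.0, omega_cl=0.3, sigma=0.5))
```

The a priori suggestion mentioned in the abstract corresponds to using the prior (population) parameters alone, before any concentration has been measured.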

Relevance: 10.00%

Publisher:

Abstract:

Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested, and ranked, providing a comprehensive review of the characteristics of the available software. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available for some programs. Nevertheless, 8 programs offer the ability to add new drug models based on population PK data. 10 computer tools incorporate Bayesian computation to predict dosage regimens (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Whereas two software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be considered with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Although interest in TDM tools is growing and effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capability, and automated report generation.