993 results for adaptive technologies
Abstract:
Huazhong Univ Sci & Technol, Natl Tech Univ Ukraine, Huazhong Normal Univ, Harbin Inst Technol, IEEE Ukraine Sect, I&M/CI Joint Chapter
Abstract:
We report a novel label-free method for investigating the adaptive recognition of small molecules by nucleic acid aptamers using capillary electrophoresis. Cocaine and argininamide were chosen as model molecules, and the two corresponding DNA aptamers were used. These single-stranded DNAs folded into specific secondary structures that were mainly responsible for binding the target molecules with high affinity and specificity. For molecular recognition, the nucleic acid structures then underwent additional conformational changes while keeping the target molecules stabilized by intermolecular hydrogen bonds. The intrinsic chemical and physical properties of the target molecules enabled them to act as indicators for adaptive binding, so no labeling or modification of the aptamers or target molecules was required. This label-free method for aptamer-based molecular recognition was also successfully applied to biological fluids, indicating that the approach is a promising tool for bioanalysis.
Abstract:
Patterned self-adaptive PS/P2VP mixed polymer brushes were prepared by a "grafting-to" approach combined with microcontact printing (μCP). The properties of the patterned surface were investigated by lateral force microscopy (LFM), XPS, and water condensation figures. In the domains with grafted P2VP, the PS/P2VP mixed brushes demonstrated reversible switching behavior upon exposure to solvents selective for the different components. The chemical composition of the top layer, as well as the surface wettability, can be tuned owing to perpendicular phase segregation in the mixed brushes. In the domains without grafted P2VP, by contrast, the grafted PS showed no switching capability. The development and erasure of the pattern are reversible under different solvent treatments.
Abstract:
Fractured reservoirs are an important type of oil and gas reservoir and account for a growing share of worldwide oil and gas production. Technologies for exploring fractured reservoirs are therefore attracting considerable attention. Accurately predicting the orientation and intensity of fracture development remains difficult. Addressing this problem, this paper presents a systematic study of seismic data processing and P-wave attribute fracture detection based on the ZX buried-hill structure, with good results. The paper first simulates the propagation of P-waves in weakly anisotropic media containing vertically aligned cracks and analyzes how P-wave attributes such as travel time, amplitude, and AVO gradient vary with observation azimuth, quantitatively describing the sensitivity of these attributes to the anisotropy of the fractured medium. To study this sensitivity further, P-wave propagation through anisotropic media of different types and intensities is also simulated, and the azimuthal variation of the attributes in each medium is summarized. These results provide reliable references for predicting the orientation, extent, and size of real, complicated fractured media from azimuthal P-wave attribute responses. A number of seismic data processing methods are used to preserve and recover the attributes applied to fracture detection, guaranteeing their accuracy and thereby improving the accuracy of fracture detection.
During seismic data processing, a three-dimensional F-Kx-Ky cone filter was adopted to attenuate ground roll and multiples, enhancing the signal-to-noise ratio of the pre-stack data; geometrical spreading compensation, surface-consistent amplitude compensation, and residual amplitude compensation were applied together to recover amplitudes; common-azimuth processing effectively preserved the azimuthal characteristics of the P-wave attributes; and bent-ray, adaptive-aperture pre-stack time migration ensured the best image in each azimuth. Applying these processing methods guaranteed the accuracy of the attributes and thus improved the accuracy of fracture detection. After comparison and analysis of a variety of attributes, the relative wave impedance (relative amplitude) attribute was selected to invert for the orientation of the fractured medium, and the attenuation gradient and the frequency corresponding to 85% of the energy were selected to invert for fracture intensity; the fracture distribution characteristics of the Lower Paleozoic and Precambrian in the ZX buried hills were then obtained. The results agree well with the fault system and well information in this area.
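The azimuthal amplitude behavior used here for orientation inversion is commonly modeled by a two-term relation R(φ) ≈ A + B·cos(2(φ − φ_sym)). This is a standard AVOA-style formulation, not necessarily the paper's exact method; the sketch below uses synthetic data and illustrative names to show how a fracture symmetry axis could be recovered from azimuthal amplitudes:

```python
import math

def fit_fracture_orientation(azimuths_deg, amplitudes):
    """Fit R(phi) = A + B*cos(2*(phi - phi_sym)) by projecting onto
    cos(2*phi) and sin(2*phi); assumes azimuths evenly cover 0-360 deg."""
    n = len(azimuths_deg)
    a2 = [math.radians(2 * phi) for phi in azimuths_deg]
    A = sum(amplitudes) / n
    # Orthogonality over a full period gives these projection formulas.
    C = 2.0 * sum(r * math.cos(t) for r, t in zip(amplitudes, a2)) / n
    S = 2.0 * sum(r * math.sin(t) for r, t in zip(amplitudes, a2)) / n
    B = math.hypot(C, S)                             # anisotropy magnitude
    phi_sym = math.degrees(math.atan2(S, C)) / 2.0   # symmetry-axis azimuth
    return A, B, phi_sym % 180.0

# Synthetic test: symmetry axis at 40 degrees, no noise.
az = list(range(0, 360, 10))
amps = [1.0 + 0.3 * math.cos(2 * math.radians(phi - 40.0)) for phi in az]
A, B, phi = fit_fracture_orientation(az, amps)
```

With noise-free, evenly sampled azimuths the fit recovers the model parameters essentially exactly; with field data, a least-squares fit over irregular azimuths would replace the simple projections.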
Abstract:
Rowland, J.J. and Taylor, J. (2002). Adaptive denoising in spectral analysis by genetic programming. Proc. IEEE Congress on Evolutionary Computation (part of WCCI), May 2002. pp 133-138. ISBN 0-7803-7281-6
Abstract:
Walker, J. and Wilson, M.S., 'Lifelong Evolution for Adaptive Robots', Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), October 2002, pp 984-989
Abstract:
Norris, G. & Wilson, P., 'Crime Prevention and New Technologies: The Special Case of CCTV', In: Issues in Australian Crime and Criminal Justice, Lexis-Nexis, pp.409-418, 2005. RAE2008
Abstract:
Ioan Fazey, John A. Fazey, Joern Fischer, Kate Sherren, John Warren, Reed F. Noss, Stephen R. Dovers (2007) Adaptive capacity and learning to learn as leverage for social-ecological resilience. Frontiers in Ecology and the Environment 5(7), 375-380. RAE2008
Abstract:
Book reviews and reports
Abstract:
A review of selected epidemiologic studies will improve understanding of the potential health effects of waste management and will provide important information for future work. Several studies showed significant associations between various waste management methods and potential impacts on human health. In other studies the associations were found to be inconsistent or equivocal, and more specific epidemiological studies must be performed to assess the consequences for human health and to determine direct toxicological effects, thus ensuring that waste management poses minimal risk to health. Epidemiological studies should be interpreted with an open mind, taking into account factors such as the social status and migration of the populations studied.
Abstract:
BACKGROUND: Recent advances in genome sequencing suggest a remarkable conservation in gene content of mammalian organisms. The similarity in gene repertoire present in different organisms has increased interest in studying regulatory mechanisms of gene expression aimed at elucidating the differences in phenotypes. In particular, a proximal promoter region contains a large number of regulatory elements that control the expression of its downstream gene. Although many studies have focused on identification of these elements, a broader picture of the complexity of transcriptional regulation of different biological processes has not been addressed in mammals. Regulatory complexity may strongly correlate with gene function, as different evolutionary forces must act on the regulatory systems under different biological conditions. We investigate this hypothesis by comparing the conservation of promoters upstream of genes classified in different functional categories. RESULTS: By conducting a rank correlation analysis between functional annotation and upstream sequence alignment scores obtained by human-mouse and human-dog comparison, we found significantly greater conservation of the upstream sequence of genes involved in development, cell communication, neural functions and signaling processes than of those involved in more basic processes shared with unicellular organisms, such as metabolism and ribosomal function. This observation persists after controlling for G+C content. Considering conservation as a functional signature, we hypothesize a higher density of cis-regulatory elements upstream of genes participating in complex and adaptive processes. CONCLUSION: We identified a class of functions that are associated with either high or low promoter conservation in mammals.
We detected a significant tendency for complex and adaptive processes to be associated with higher promoter conservation, despite the fact that they emerged relatively recently during evolution. We describe and contrast several hypotheses that provide deeper insight into how transcriptional complexity might have emerged during evolution.
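The rank correlation analysis between functional annotation and alignment scores can be illustrated with a Spearman correlation computed from scratch. The data below are invented for demonstration, and the function names are not from the paper:

```python
import math

def ranks(values):
    """Average ranks (1-based); tied values share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                      # extend over the tie group
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the two rank vectors."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = math.sqrt(sum((a - mx) ** 2 for a in rx) *
                    sum((b - my) ** 2 for b in ry))
    return num / den

# Hypothetical data: 1 = "complex/adaptive" category, 0 = "basic" category,
# paired with upstream-alignment conservation scores per gene.
category = [1, 1, 1, 0, 0, 0]
conservation = [0.82, 0.74, 0.69, 0.55, 0.48, 0.60]
rho = spearman(category, conservation)
```

A positive rho here mirrors the paper's finding that the complex/adaptive category ranks with higher conservation scores; the real analysis, of course, runs over genome-wide annotations rather than six invented genes.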
Abstract:
Current research on Internet-based distributed systems emphasizes the scalability of overlay topologies for efficient search and retrieval of data items, as well as routing amongst peers. However, most existing approaches fail to address the transport of data across these logical networks in accordance with quality of service (QoS) constraints. Consequently, this paper investigates the use of scalable overlay topologies for routing real-time media streams between publishers and potentially many thousands of subscribers. Specifically, we analyze the costs of using k-ary n-cubes for QoS-constrained routing. Given a number of nodes in a distributed system, we calculate the optimal k-ary n-cube structure for minimizing the average distance between any pair of nodes. Using this structure, we describe a greedy algorithm that selects paths between nodes in accordance with the real-time delays along physical links. We show this method improves the routing latencies by as much as 67%, compared to approaches that do not consider physical link costs. We are in the process of developing a method for adaptive node placement in the overlay topology, based upon the locations of publishers, subscribers, physical link costs and per-subscriber QoS constraints. One such method for repositioning nodes in logical space is discussed, to improve the likelihood of meeting service requirements on data routed between publishers and subscribers. Future work will evaluate the benefits of such techniques more thoroughly.
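The optimal-structure calculation can be sketched under two simplifying assumptions: unit link costs, and tolerance for unused slots when k^n exceeds the node count. The sketch uses the standard average-distance formula for a bidirectional ring of k nodes (k/4 for even k, (k²-1)/(4k) for odd k) summed over n dimensions; it is not claimed to be the paper's exact procedure:

```python
def avg_ring_distance(k):
    """Mean wraparound distance between two uniform random nodes on a k-ring."""
    return k / 4.0 if k % 2 == 0 else (k * k - 1) / (4.0 * k)

def best_cube(num_nodes, max_n=16):
    """Among k-ary n-cubes with k**n >= num_nodes, return the (k, n, d)
    minimizing average inter-node distance d = n * avg_ring_distance(k)."""
    best = None
    for n in range(1, max_n + 1):
        k = 2
        while k ** n < num_nodes:   # smallest radix whose cube holds all nodes
            k += 1
        d = n * avg_ring_distance(k)
        if best is None or d < best[2]:
            best = (k, n, d)
    return best

k, n, d = best_cube(64)
```

For 64 nodes this enumeration favors a 3-ary 4-cube (81 slots, average distance 8/3) over both the binary 6-cube and the 8-ary 2-cube, which illustrates why the choice of (k, n) is worth optimizing rather than fixed in advance.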
Abstract:
Overlay networks have emerged as a powerful and highly flexible method for delivering content. We study how to optimize throughput of large, multipoint transfers across richly connected overlay networks, focusing on the question of what to put in each transmitted packet. We first make the case for transmitting encoded content in this scenario, arguing for the digital fountain approach which enables end-hosts to efficiently restitute the original content of size n from a subset of any n symbols from a large universe of encoded symbols. Such an approach affords reliability and a substantial degree of application-level flexibility, as it seamlessly tolerates packet loss, connection migration, and parallel transfers. However, since the sets of symbols acquired by peers are likely to overlap substantially, care must be taken to enable them to collaborate effectively. We provide a collection of useful algorithmic tools for efficient estimation, summarization, and approximate reconciliation of sets of symbols between pairs of collaborating peers, all of which keep messaging complexity and computation to a minimum. Through simulations and experiments on a prototype implementation, we demonstrate the performance benefits of our informed content delivery mechanisms and how they complement existing overlay network architectures.
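Efficient estimation of the overlap between two peers' symbol sets is often done with min-wise sketches, one of the tools in the family this abstract describes. The following Python sketch is illustrative only; the names and parameters are assumptions, not taken from the paper:

```python
import hashlib

def minwise_sketch(symbols, num_hashes=128):
    """One minimum per salted hash function: a small, fixed-size set summary."""
    sketch = []
    for salt in range(num_hashes):
        m = min(
            int.from_bytes(
                hashlib.sha256(f"{salt}:{s}".encode()).digest()[:8], "big")
            for s in symbols)
        sketch.append(m)
    return sketch

def estimate_jaccard(sk_a, sk_b):
    """P[min hashes agree] equals the Jaccard similarity |A∩B| / |A∪B|."""
    return sum(a == b for a, b in zip(sk_a, sk_b)) / len(sk_a)

# Two peers whose acquired symbol sets overlap; true Jaccard = 100/300.
a = {f"sym{i}" for i in range(200)}
b = {f"sym{i}" for i in range(100, 300)}
est = estimate_jaccard(minwise_sketch(a), minwise_sketch(b))
```

Exchanging 128 small hash minima instead of the full symbol sets keeps the messaging cost constant regardless of content size, which is the point of using such summaries before deciding what to reconcile.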