997 results for adaptive markers
Abstract:
It has been suggested that endothelial apoptosis is a primary lesion in the pathogenesis of thrombotic thrombocytopenic purpura (TTP). We tested this hypothesis by examining the phenotypic signatures of endothelial microparticles (EMP) in TTP patients. In addition, the effect of TTP plasma on microvascular endothelial cells (MVEC) in culture was further delineated. EMP released by endothelial cells (EC) express markers of the parent EC; EMP released during activation carry predominantly CD54 and CD62E, while those released during apoptosis carry predominantly CD31 and CD105. We investigated EMP release in vitro and in TTP patients. Following incubation of MVEC with TTP plasma, EMP and EC were analysed by flow cytometry for the expression of CD31, CD51, CD54, CD62E, CD105, CD106 and von Willebrand factor (VWF) antigen. EMP were also analysed in 12 TTP patients. In both EC and EMP, CD62E and CD54 expression was increased 3- to 10-fold and 8- to 10-fold, respectively. However, CD31 and CD105 were reduced 40-60% in EC but increased twofold in EMP. VWF expression was found in 55 +/- 15% of CD62E(+) EMP. Markers of apoptosis were negative. In TTP patients, CD62E(+) and CD31(+)/CD42b(-) EMP were markedly elevated, and preceded and correlated well with a rise in platelet counts and a fall in lactate dehydrogenase. CD62E(+) EMP (60 +/- 20%) co-expressed VWF and CD62E. The ratio of CD31(+)/CD42b(-) to CD62E(+) EMP exhibited a pattern consistent with activation. In conclusion, our studies indicate endothelial activation in TTP. EMP that co-express VWF and CD62E could play a role in the pathogenesis of TTP.
Abstract:
As an important part of the petroleum exploration areas of western China, the northern part of the Qaidam basin is very promising for petroleum discovery. However, many obstacles remain in understanding the process of petroleum formation and in evaluating oil and gas potential, because of the complexity of the geological evolution of the study area. Based upon petroleum system theory, the process of petroleum formation is analyzed and the oil and gas potential of the different petroleum systems is evaluated by means of a modeling approach. The geological background for the formation of the petroleum systems and their constituent elements are described in detail. The thickness of eroded strata is estimated by means of vitrinite reflectance modeling, compaction parameter calculation and thickness extrapolation. The burial histories are reconstructed using a transient compaction model that combines forward and reverse modeling. The geo-historical evolution consists of four stages: sedimentation at different rates over different areas and slow subsidence during the Jurassic; uplift and erosion during the Cretaceous; fast subsidence during the early and middle Tertiary; and alternating subsidence and uplift during the late Tertiary and Quaternary. The thermal gradients in the study area range from 2.0 °C/100 m to 2.6 °C/100 m, and the average heat flow is 50.6 mW/m^2. From the vitrinite reflectance and apatite fission track data, a new approach based upon Adaptive Genetic Algorithms for thermal history reconstruction is presented and used to estimate the paleo-heat flow. The modeling results show that the heat flow decreased and the basin cooled from the Jurassic to the present. Oil generation from kerogens, gas generation from kerogens and gas cracked from oil are modeled by kinetic models, with kinetic parameters calculated from laboratory experimental data.
The evolution of source rock maturation is modeled by means of the Easy %Ro method. With the reconstruction of geo-histories, thermal histories and hydrocarbon generation, the oil and gas generation intensities of the lower and middle Jurassic source rocks at different times are calculated. The results suggest that the source rocks became mature during the time of Xiaganchaigou sedimentation. The oil and gas generation centers of the lower Jurassic source rocks are located in the Yikeyawuru sag, the Kunteyi sag and the Eboliang area. The generation centers of the middle Jurassic source rocks are located in the Saishenteng faulted sag and the Yuka faulted sag. On the evidence of biomarkers and carbonate isotopes, the oil and gas in the Lenghusihao, Lenghuwuhao, Nanbaxian and Mahai oilfields are from lower Jurassic source rocks, and the oil and gas in Yuka are from middle Jurassic source rocks. Based upon the modeling results, the distribution of source rocks and the occurrence of oil and gas, there should be two petroleum systems in the study area. The key moments of these two petroleum systems, J_1-R(!) and J_2-J_3, are at the stages of Xiaganchaigou-Shangyoushashan sedimentation and Xiayoushashan-Shizigou sedimentation. With the kinetic models for oil generated from kerogen, gas generated from kerogen and oil cracked to gas, the amounts of oil and gas generated at different times in the two petroleum systems are calculated. The cumulative amounts of oil generated from kerogen, gas generated from kerogen and gas cracked from oil are 409.78 × 10^8 t, 360518.40 × 10^8 m^3 and 186.50 × 10^8 t in J_1-R(!). The amounts of oil and gas generated for accumulation are 223.28 × 10^8 t and 606692.99 × 10^8 m^3 in J_1-R(!). The cumulative amounts of oil generated from kerogen, gas generated from kerogen and gas cracked from oil are 29.05 × 10^8 t, 23025.29 × 10^8 m^3 and 14.42 × 10^8 t in J_2-J_3 (!). The amounts of oil and gas generated for accumulation are 14.63 × 10^8 t and 42055.44 × 10^8 m^3 in J_2-J_3 (!).
The total oil and gas potential is 9.52 × 10^8 t and 1946.25 × 10^8 m^3.
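The thermal history reconstruction described above fits paleo-heat flow to vitrinite reflectance and apatite fission track data with an Adaptive Genetic Algorithm. The following toy sketch illustrates that idea only: the forward model `forward_ro`, the parameter ranges and the GA settings are all invented for illustration and are not the thesis's Easy %Ro kinetics.

```python
import random

def forward_ro(heat_flow_history):
    # Toy maturity proxy (invented): cumulative thermal exposure
    # mapped to a pseudo-%Ro value. A real model would use kinetics.
    exposure = sum(heat_flow_history)
    return 0.2 + 0.01 * exposure

def fit_heat_flow(observed_ro, steps=5, pop=40, gens=60, seed=1):
    """Evolve a heat-flow history (mW/m^2 per time step) whose modeled
    reflectance matches the observed value."""
    rng = random.Random(seed)
    def fitness(h):
        return -abs(forward_ro(h) - observed_ro)
    population = [[rng.uniform(30, 80) for _ in range(steps)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop // 2]  # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, steps)     # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:            # clamped Gaussian mutation
                i = rng.randrange(steps)
                child[i] = min(80, max(30, child[i] + rng.gauss(0, 5)))
            children.append(child)
        population = parents + children
    return max(population, key=fitness)
```

In the real workflow the fitness would compare modeled and measured %Ro at many depths; the one-value misfit here just keeps the sketch short.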
Abstract:
AIM: Fourteen urinary nucleosides, primary degradation products of tRNA, were evaluated to assess their potential as biological markers in patients with colorectal cancer.
Abstract:
Background: Chronic hepatitis C (CHC) has emerged as a leading cause of cirrhosis in the U.S. and across the world. To understand the role of apoptotic pathways in hepatitis C virus (HCV) infection, we studied the mRNA and protein expression patterns of apoptosis-related genes in peripheral blood mononuclear cells (PBMC) obtained from patients with HCV infection. Methods: The present study included 50 subjects whose plasma samples were positive for HCV but negative for human immunodeficiency virus (HIV) and hepatitis B virus (HBV). These cases were divided into four groups according to METAVIR, a scoring system used to interpret a liver biopsy according to the degree of inflammation and fibrosis. mRNA expression of the studied genes was analyzed by reverse transcription quantitative polymerase chain reaction (RT-qPCR), and protein levels were analyzed by ELISA. HCV genotyping was also performed. Results: HCV infection increased mRNA expression and protein synthesis of caspase 8 in group 1 by 3-fold and 4-fold, respectively (p < 0.05). In group 4, HCV infection increased mRNA expression and protein synthesis of caspase 9 by 2-fold and 1.5-fold, respectively (p < 0.05). Caspase 3 mRNA expression and protein synthesis were also augmented by HCV infection, in group 1 by 4-fold and 5-fold and in group 4 by 6-fold and 7-fold, respectively (p < 0.05). Conclusions: HCV induces alterations at both the mRNA and protein levels in apoptosis markers involved in the extrinsic and intrinsic pathways.
Abstract:
Ethnopharmacological relevance: A common plant used to treat several gastric disorders is Buddleja scordioides Kunth, commonly known as salvilla. Aim of the study: To detect inflammatory markers in order to evaluate the gastroprotective potential of salvilla infusions, as this could have a beneficial impact on the population exposed to gastric ulcers and colitis. Materials and methods: Infusions were prepared with B. scordioides (1% w/w), lyophilized and stored. Total phenolic content and GC-MS analyses were performed. Wistar rats were divided into five groups: a negative vehicle control, an indomethacin group, and three experimental groups, named preventive, curative, and suppressive. All rats were sacrificed under deep ether anesthesia 6 h after the last oral administration of indomethacin/infusion. The rat stomachs were promptly excised, weighed, and chilled in ice-cold 0.9% NaCl. Histological analysis, nitrite quantification and immunodetection assays were performed. Results: B. scordioides infusions markedly reduced the visible hemorrhagic lesions induced by indomethacin in rat stomachs, and also showed down-regulation of COX-2, IL-8 and TNFα and up-regulation of COX-1, with moderate down-regulation of NF-kB and a lower amount of nitrites. However, this behavior depended on the treatment: down-regulation of COX-2, TNFα and IL-8 was greatest in the curative treatment; down-regulation of NF-kB was greater in the preventive treatment; and up-regulation of COX-1 was greater in the suppressive and preventive treatments. Conclusion: The anti-inflammatory potential of B. scordioides infusions could be related to the presence of polyphenols such as quercetin in the infusion and to how the infusion is consumed.
Abstract:
Rowland, J.J. and Taylor, J. (2002). Adaptive denoising in spectral analysis by genetic programming. Proc. IEEE Congress on Evolutionary Computation (part of WCCI), May 2002. pp 133-138. ISBN 0-7803-7281-6
Abstract:
Walker, J. and Wilson, M.S., 'Lifelong Evolution for Adaptive Robots', Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), October 2002, pp. 984-989
Abstract:
Ioan Fazey, John A. Fazey, Joern Fischer, Kate Sherren, John Warren, Reed F. Noss, Stephen R. Dovers (2007) Adaptive capacity and learning to learn as leverage for social-ecological resilience. Frontiers in Ecology and the Environment 5(7), 375-380. RAE2008
Abstract:
Real-time adaptive music is now well-established as a popular medium, largely through its use in video game soundtracks. Commercial packages, such as fmod, make freely available the underlying technical methods for use in educational contexts, making adaptive music technologies accessible to students. Writing adaptive music, however, presents a significant learning challenge, not least because it requires a different mode of thought, and tutor and learner may have few mutual points of connection in discovering and understanding the musical drivers, relationships and structures in these works. This article discusses the creation of ‘BitBox!’, a gestural music interface designed to deconstruct and explain the component elements of adaptive composition through interactive play. The interface was displayed at the Dare Protoplay games exposition in Dundee in August 2014. The initial proof-of-concept study proved successful, suggesting possible refinements in design and a broader range of applications.
Abstract:
BACKGROUND: Recent advances in genome sequencing suggest a remarkable conservation in gene content of mammalian organisms. The similarity in gene repertoire present in different organisms has increased interest in studying regulatory mechanisms of gene expression aimed at elucidating the differences in phenotypes. In particular, a proximal promoter region contains a large number of regulatory elements that control the expression of its downstream gene. Although many studies have focused on identification of these elements, a broader picture of the complexity of transcriptional regulation of different biological processes has not been addressed in mammals. The regulatory complexity may strongly correlate with gene function, as different evolutionary forces must act on the regulatory systems under different biological conditions. We investigate this hypothesis by comparing the conservation of promoters upstream of genes classified in different functional categories. RESULTS: By conducting a rank correlation analysis between functional annotation and upstream sequence alignment scores obtained by human-mouse and human-dog comparison, we found a significantly greater conservation of the upstream sequence of genes involved in development, cell communication, neural functions and signaling processes than those involved in more basic processes shared with unicellular organisms such as metabolism and ribosomal function. This observation persists after controlling for G+C content. Considering conservation as a functional signature, we hypothesize a higher density of cis-regulatory elements upstream of genes participating in complex and adaptive processes. CONCLUSION: We identified a class of functions that are associated with either high or low promoter conservation in mammals.
We detected a significant tendency for complex and adaptive processes to be associated with higher promoter conservation, despite the fact that they emerged relatively recently during evolution. We describe and contrast several hypotheses that provide deeper insight into how transcriptional complexity might have emerged during evolution.
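The rank correlation analysis described above can be illustrated with a minimal Spearman implementation (the function names are my own, and ties are not handled, which a real genome-scale analysis would need):

```python
def rank(values):
    """Assign rank 1..n by ascending value (no tie correction)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    for r, i in enumerate(order):
        ranks[i] = float(r + 1)
    return ranks

def spearman(x, y):
    """Spearman rho: Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

In the study's setting, `x` would be a per-gene functional-category score and `y` the upstream alignment score; a rho near +1 or -1 indicates the monotone association the authors test for.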
Abstract:
Current research on Internet-based distributed systems emphasizes the scalability of overlay topologies for efficient search and retrieval of data items, as well as routing amongst peers. However, most existing approaches fail to address the transport of data across these logical networks in accordance with quality of service (QoS) constraints. Consequently, this paper investigates the use of scalable overlay topologies for routing real-time media streams between publishers and potentially many thousands of subscribers. Specifically, we analyze the costs of using k-ary n-cubes for QoS-constrained routing. Given a number of nodes in a distributed system, we calculate the optimal k-ary n-cube structure for minimizing the average distance between any pair of nodes. Using this structure, we describe a greedy algorithm that selects paths between nodes in accordance with the real-time delays along physical links. We show this method improves the routing latencies by as much as 67%, compared to approaches that do not consider physical link costs. We are in the process of developing a method for adaptive node placement in the overlay topology, based upon the locations of publishers, subscribers, physical link costs and per-subscriber QoS constraints. One such method for repositioning nodes in logical space is discussed, to improve the likelihood of meeting service requirements on data routed between publishers and subscribers. Future work will evaluate the benefits of such techniques more thoroughly.
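The structure-selection step described above can be sketched as a search over (k, n) pairs, using the common approximation that the average hop distance in a k-ary n-cube torus is about n·k/4. The approximation, the search bounds and the function name are assumptions for illustration, not the paper's exact formulation.

```python
def best_kary_ncube(num_nodes, max_dims=20):
    """Pick (k, n) with k**n >= num_nodes that minimizes the
    approximate average hop distance n * k / 4."""
    best = None
    for n in range(1, max_dims + 1):
        k = 2
        while k ** n < num_nodes:  # smallest radix fitting all nodes
            k += 1
        avg_dist = n * k / 4.0     # approximate mean hop count
        if best is None or avg_dist < best[0]:
            best = (avg_dist, k, n)
    return best[1], best[2]
```

For example, with 4096 nodes this search prefers a 4-ary 6-cube over both the binary 12-cube and flatter, larger-radix shapes, since they tie or lose on the n·k/4 measure.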
Abstract:
Overlay networks have emerged as a powerful and highly flexible method for delivering content. We study how to optimize throughput of large, multipoint transfers across richly connected overlay networks, focusing on the question of what to put in each transmitted packet. We first make the case for transmitting encoded content in this scenario, arguing for the digital fountain approach which enables end-hosts to efficiently restitute the original content of size n from a subset of any n symbols from a large universe of encoded symbols. Such an approach affords reliability and a substantial degree of application-level flexibility, as it seamlessly tolerates packet loss, connection migration, and parallel transfers. However, since the sets of symbols acquired by peers are likely to overlap substantially, care must be taken to enable them to collaborate effectively. We provide a collection of useful algorithmic tools for efficient estimation, summarization, and approximate reconciliation of sets of symbols between pairs of collaborating peers, all of which keep messaging complexity and computation to a minimum. Through simulations and experiments on a prototype implementation, we demonstrate the performance benefits of our informed content delivery mechanisms and how they complement existing overlay network architectures.
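One way to realize the efficient overlap estimation this abstract describes is min-wise (MinHash) sampling: each peer exchanges a short signature instead of its full symbol set. This sketch is an illustrative stand-in, not necessarily the authors' exact estimator.

```python
import random

def minhash_signature(symbols, num_hashes=128, seed=0):
    """Compact signature of a symbol set: one min-hash per salted
    hash function. Both peers must use the same seed."""
    rng = random.Random(seed)
    salts = [rng.getrandbits(64) for _ in range(num_hashes)]
    return [min(hash((salt, s)) for s in symbols) for salt in salts]

def estimated_jaccard(sig_a, sig_b):
    """Fraction of matching slots estimates |A ∩ B| / |A ∪ B|."""
    matches = sum(1 for x, y in zip(sig_a, sig_b) if x == y)
    return matches / len(sig_a)
```

Two peers holding largely overlapping sets of encoded symbols can compare 128-entry signatures, estimate their Jaccard similarity, and decide whether a full reconciliation is worth the messaging cost.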
Abstract:
Attributing a dollar value to a keyword is an essential part of running any profitable search engine advertising campaign. When an advertiser has complete control over the interaction with and monetization of each user arriving on a given keyword, the value of that term can be accurately tracked. However, in many instances, the advertiser may monetize arrivals indirectly through one or more third parties. In such cases, it is typical for the third party to provide only coarse-grained reporting: rather than report each monetization event, users are aggregated into larger channels and the third party reports aggregate information such as total daily revenue for each channel. Examples of third parties that use channels include Amazon and Google AdSense. In such scenarios, the number of channels is generally much smaller than the number of keywords whose value per click (VPC) we wish to learn. However, the advertiser has flexibility as to how to assign keywords to channels over time. We introduce the channelization problem: how do we adaptively assign keywords to channels over the course of multiple days to quickly obtain accurate VPC estimates of all keywords? We relate this problem to classical results in weighing design, devise new adaptive algorithms for this problem, and quantify the performance of these algorithms experimentally. Our results demonstrate that adaptive weighing designs that exploit statistics of term frequency, variability in VPCs across keywords, and flexible channel assignments over time provide the best estimators of keyword VPCs.
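A minimal illustration of the weighing-design idea, with all names and data invented: vary the keyword-to-channel assignment across days, observe only per-channel aggregate revenue, and solve the resulting linear system for per-keyword VPC (here with exact known clicks and a tiny stdlib least-squares solver).

```python
def gaussian_solve(A, b):
    """Solve A x = b (square A) by elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = s / M[r][r]
    return x

def estimate_vpc(assignments, clicks, revenues):
    """assignments[d][k]: channel of keyword k on day d;
    clicks[d][k]: clicks on keyword k on day d;
    revenues[d][ch]: aggregate revenue of channel ch on day d."""
    K = len(clicks[0])
    rows, rhs = [], []
    for d, day in enumerate(assignments):
        by_channel = {}
        for k, ch in enumerate(day):
            by_channel.setdefault(ch, [0.0] * K)[k] = clicks[d][k]
        for ch, row in by_channel.items():
            rows.append(row)           # one equation per (day, channel)
            rhs.append(revenues[d][ch])
    # least squares via normal equations: (R^T R) x = R^T y
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(K)] for i in range(K)]
    Atb = [sum(r[i] * y for r, y in zip(rows, rhs)) for i in range(K)]
    return gaussian_solve(AtA, Atb)
```

With more keywords than channels, a single day's assignment is under-determined; rotating assignments over several days is what makes the system solvable, which is the core of the channelization problem.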
Abstract:
An increasing number of applications, such as distributed interactive simulation, live auctions, distributed games and collaborative systems, require the network to provide a reliable multicast service. This service enables one sender to reliably transmit data to multiple receivers. Reliability is traditionally achieved by having receivers send negative acknowledgments (NACKs) to request from the sender the retransmission of lost (or missing) data packets. However, this Automatic Repeat reQuest (ARQ) approach results in the well-known NACK implosion problem at the sender. Many reliable multicast protocols have been recently proposed to reduce NACK implosion, but the message overhead due to NACK requests remains significant. Another approach, based on Forward Error Correction (FEC), requires the sender to encode additional redundant information so that a receiver can independently recover from losses. However, due to the lack of feedback from receivers, it is impossible for the sender to determine how much redundancy is needed. In this paper, we propose a new reliable multicast protocol, called ARM for Adaptive Reliable Multicast. Our protocol integrates ARQ and FEC techniques. The objectives of ARM are (1) to reduce the message overhead due to NACK requests, (2) to reduce the amount of data transmission, and (3) to reduce the time it takes for all receivers to receive the data intact (without loss). During data transmission, the sender periodically informs the receivers of the number of packets that are yet to be transmitted. Based on this information, each receiver predicts whether this amount is enough to recover its losses. Only if it is not enough does the receiver request the sender to encode additional redundant packets. Using ns simulations, we show the superiority of our hybrid ARQ-FEC protocol over the well-known Scalable Reliable Multicast (SRM) protocol.
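The receiver-side prediction described above can be sketched as follows, under the usual erasure-coding assumption that any k distinct packets suffice to decode a k-packet block; the loss-rate discounting is my own simplification, not ARM's exact rule.

```python
def needs_nack(block_size_k, received, announced_remaining, observed_loss_rate):
    """Predict whether the sender's announced remaining packets will
    let this receiver decode the block; NACK for redundancy only if not."""
    # Discount the announced transmissions by the locally observed loss rate.
    expected_arrivals = announced_remaining * (1.0 - observed_loss_rate)
    return received + expected_arrivals < block_size_k
```

Because each receiver makes this prediction locally, only receivers whose losses exceed the scheduled redundancy generate feedback, which is how the hybrid scheme suppresses NACK implosion.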
Abstract:
SomeCast is a novel paradigm for the reliable multicast of real-time data to a large set of receivers over the Internet. SomeCast is receiver-initiated and thus scalable in the number of receivers, the diverse characteristics of paths between senders and receivers (e.g. maximum bandwidth and round-trip-time), and the dynamic conditions of such paths (e.g. congestion-induced delays and losses). SomeCast enables receivers to dynamically adjust the rate at which they receive multicast information to enable the satisfaction of real-time QoS constraints (e.g. rate, deadlines, or jitter). This is done by enabling a receiver to join SOME number of concurrent multiCAST sessions, whereby each session delivers a portion of an encoding of the real-time data. By adjusting the number of such sessions dynamically, client-specific QoS constraints can be met independently. The SomeCast paradigm can be thought of as a generalization of the AnyCast (e.g. Dynamic Server Selection) and ManyCast (e.g. Digital Fountain) paradigms, which have been proposed in the literature to address issues of scalability of UniCast and MultiCast environments, respectively. In this paper we overview the SomeCast paradigm, describe an instance of a SomeCast protocol, and present simulation results that quantify the significant advantages gained from adopting such a protocol for the reliable multicast of data to a diverse set of receivers subject to real-time QoS constraints.
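The receiver-driven rate adjustment described above can be sketched as a greedy session-selection rule; the function name and the fastest-first policy are illustrative assumptions, not the paper's protocol.

```python
def sessions_to_join(target_rate_kbps, per_session_rates_kbps):
    """Join concurrent multicast sessions (fastest first) until the
    aggregate delivery rate meets the receiver's QoS target.
    Returns the rates of the chosen sessions; if the target cannot be
    met, all sessions are joined."""
    chosen, total = [], 0.0
    for rate in sorted(per_session_rates_kbps, reverse=True):
        if total >= target_rate_kbps:
            break
        chosen.append(rate)
        total += rate
    return chosen
```

A receiver would re-run this selection as path conditions change, dropping or adding sessions so that its own rate, deadline, or jitter constraint keeps being met independently of other receivers.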