944 results for Very long path length
Abstract:
All organisms studied to date possess (circadian) internal clocks that generate an endogenous period length of approximately 24 hours. An internal clock can be synchronized with the environment via zeitgebers and enables the organism to anticipate rhythmic environmental changes. In addition to a central pacemaker that controls the physiology and behavior of the organism, there are also peripheral clocks in different organs that control the timing of these organs' specific functions. In this thesis, central and peripheral pacemaker neurons of insects were physiologically characterized and compared. The neurons of the accessory medulla (AME) of Rhyparobia maderae served as a model system for central pacemakers, while olfactory receptor neurons (ORNs) of Manduca sexta served as a model system for peripheral pacemakers. The central pacemaker neurons were examined in extracellular recordings from the isolated AME (network level) and in patch-clamp experiments on primary AME cell cultures (single-cell level). At the network level, two characteristic activity patterns emerged: regular activity and alternation between high and low activity (oscillations). It was shown that glutamate is a neurotransmitter of the widespread inhibitory synapses of the AME and that excitatory synapses also occur to a small extent. The neuropeptide pigment-dispersing factor (PDF), which is expressed by only a few AME neurons and is an important coupling factor in the circadian system, caused inhibitions, activations, or oscillations. The effects were transient or long-lasting and were probably mediated by the second messenger cAMP. A presumed target molecule of cAMP was the exchange protein directly activated by cAMP (EPAC). At the single-cell level, it was shown that most AME neurons were depolarized and therefore did not fire. The analysis of current-voltage relationships and pharmacological experiments revealed the presence of different ion channels (Ca2+, Cl-, K+, and Na+ channels as well as non-specific cation channels). Strong Ca2+ currents (ICa) activating at high voltages could play an important role in Ca2+-dependent neurotransmitter release, oscillations, and action potentials. PDF inhibited different currents (ICa, IK, and INa) and activated non-specific cation currents (Ih). It was assumed that simultaneous PDF-dependent hyper- and depolarizations cause rhythmic membrane potential oscillations. This mechanism could play a role in PDF-dependent synchronization. The analysis of peripheral pacemaker neurons focused on the characterization of the olfactory coreceptor of M. sexta (MsexORCO). In other insects, ORCO is required for the membrane insertion of olfactory receptors (ORs). ORCO forms complexes with the ORs, which function as ion channels in heterologous expression systems and mediate odor responses. It was hypothesized that in pheromone-sensitive ORNs in vivo, MsexORCO functions not as part of an ionotropic receptor but as a pacemaker channel that generates subthreshold membrane potential oscillations. MsexORCO was coexpressed with putative pheromone receptors in human embryonic kidney (HEK 293) cells. Immunocytochemistry and Ca2+ imaging experiments showed very low expression rates.
Nevertheless, it was possible to show that MsexORCO is probably a spontaneously active, Ca2+-permeable ion channel that is activated by the ORCO agonist VUAA1 and by cyclic nucleotides. In addition, the experiments indicated that MsexOR-1 is apparently the bombykal receptor. A further characterization of MsexORCO in primary M. sexta ORN cell cultures could not be completed because the ORNs did not respond significantly to ORCO agonists or antagonists.
Abstract:
As the number of processors in distributed-memory multiprocessors grows, efficiently supporting a shared-memory programming model becomes difficult. We have designed the Protocol for Hierarchical Directories (PHD) to allow shared-memory support for systems containing massive numbers of processors. PHD eliminates bandwidth problems by using a scalable network, decreases hot-spots by not relying on a single point to distribute blocks, and uses a scalable amount of space for its directories. PHD provides a shared-memory model by synthesizing a global shared memory from the local memories of processors. PHD supports sequentially consistent read, write, and test-and-set operations. This thesis also introduces a method of describing locality for hierarchical protocols and employs this method in the derivation of an abstract model of the protocol behavior. An embedded model, based on the work of Johnson [ISCA19], describes the protocol behavior when mapped to a k-ary n-cube. The thesis uses these two models to study the average height in the hierarchy that operations reach, the longest path messages travel, the number of messages that operations generate, the inter-transaction issue time, and the protocol overhead for different locality parameters, degrees of multithreading, and machine sizes. We determine that multithreading is only useful for approximately two to four threads; any additional interleaving does not decrease the overall latency. For small machines and high-locality applications, this limitation is due mainly to the length of the running threads. For large machines with medium to low locality, this limitation is due mainly to the protocol overhead being too large. Our study using the embedded model shows that in situations where the run length between references to shared memory is at least an order of magnitude longer than the time to process a single state transition in the protocol, applications exhibit good performance. If separate controllers for processing protocol requests are included, the protocol scales to 32k-processor machines as long as the application exhibits hierarchical locality: at least 22% of the global references must be able to be satisfied locally; at most 35% of the global references are allowed to reach the top level of the hierarchy.
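To make the hierarchical lookup concrete, here is a minimal sketch (Python; the names DirectoryNode and read are illustrative assumptions, not PHD's actual data structures): a read request climbs the directory tree until some level can supply the block, and the number of levels climbed corresponds to the "height" metric studied above.

# Illustrative sketch of a hierarchical-directory read (not PHD itself):
# a request climbs the tree until a directory level knows the block,
# and the height it reaches is reported alongside the data.

class DirectoryNode:
    def __init__(self, parent=None):
        self.parent = parent   # next level up in the hierarchy (None = top level)
        self.blocks = {}       # block address -> data known at or below this node

    def read(self, addr, height=0):
        """Return (data, height), where height is how far the request climbed."""
        if addr in self.blocks:                      # satisfied at this level
            return self.blocks[addr], height
        if self.parent is None:                      # top level acts as home memory
            return self.blocks.setdefault(addr, 0), height
        return self.parent.read(addr, height + 1)    # escalate one level

# Example: a block cached at the root is found after climbing one level.
root = DirectoryNode()
leaf = DirectoryNode(parent=root)
root.blocks[0x40] = 42
print(leaf.read(0x40))   # -> (42, 1)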
Abstract:
This paper provides recent evidence about the benefits of attending preschool on future performance. A non-parametric matching procedure is used over two outcomes: math and verbal scores on a national mandatory test (Saber11) in Colombia. It is found that students who had the chance of attending preschool obtain higher scores in math (6.7%) and verbal (5.4%) than those who did not. A considerable fraction of these gaps comes from the upper quintiles of student performance, suggesting that preschool matters when it is done at high-quality institutions. When we include the number of years at preschool, the gap rises up to 12% in verbal and 17% in math.
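As a rough illustration of the matching idea only (a sketch with hypothetical toy data; the paper's covariates, matching metric and sample are different), each treated student, i.e. one who attended preschool, is paired with the closest untreated student in covariate space and the outcome gaps are averaged:

# Minimal nearest-neighbour matching sketch (illustrative, not the paper's procedure).
# treated / untreated: lists of (covariate value, test score).

def matched_gap(treated, untreated):
    """Average outcome gap between treated units and their nearest untreated match."""
    gaps = []
    for x_t, y_t in treated:
        x_c, y_c = min(untreated, key=lambda u: abs(u[0] - x_t))  # closest control
        gaps.append(y_t - y_c)
    return sum(gaps) / len(gaps)

# Hypothetical toy data: (socioeconomic index, math score)
treated   = [(1.0, 52.0), (2.0, 55.0), (3.5, 60.0)]
untreated = [(0.9, 49.0), (2.1, 51.0), (3.4, 57.0)]
print(matched_gap(treated, untreated))   # estimated score gap for preschool attendees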
Abstract:
The characteristics of service independence and flexibility of ATM networks make the control problems of such networks very critical. One of the main challenges in ATM networks is to design traffic control mechanisms that enable both economically efficient use of the network resources and the desired quality of service for higher-layer applications. Window flow control mechanisms of traditional packet-switched networks are not well suited to real-time services at the speeds envisaged for future networks. In this work, the utilisation of the Probability of Congestion (PC) as a bandwidth decision parameter is presented. The validity of using PC is compared with QoS parameters in buffer-less environments, where only the cell loss ratio (CLR) parameter is relevant. The convolution algorithm is a good solution for connection admission control (CAC) in ATM networks with small buffers. If the source characteristics are known, the actual CLR can be estimated very well. Furthermore, this estimation is always conservative, allowing the network performance guarantees to be retained. Several experiments have been carried out and investigated to explain the deviation between the proposed method and the simulation. Time parameters for burst length and different buffer sizes have been considered. Experiments to confine the limits of the burst length with respect to the buffer size conclude that a minimum buffer size is necessary to achieve adequate cell contention. Note that propagation delay is a limit that cannot be dismissed for long-distance and interactive communications, so small buffers must be used in order to minimise delay. Under these premises, the convolution approach is the most accurate method used in bandwidth allocation. This method gives enough accuracy in both homogeneous and heterogeneous networks. However, the convolution approach has a considerable computational cost and a high number of accumulated calculations. To overcome these drawbacks, a new method of evaluation is analysed: the Enhanced Convolution Approach (ECA). In ECA, traffic is grouped into classes of identical parameters. By using the multinomial distribution function instead of the formula-based convolution, a partial state corresponding to each class of traffic is obtained. Finally, the global state probabilities are evaluated by multi-convolution of the partial results. This method avoids accumulated calculations and saves storage requirements, especially in complex scenarios. Sorting is the dominant factor for the formula-based convolution, whereas cost evaluation is the dominant factor for the enhanced convolution. A set of cut-off mechanisms is introduced to reduce the complexity of the ECA evaluation. The ECA also computes the CLR for each j-class of traffic (CLRj); an expression for evaluating CLRj is also presented. We can conclude that, by combining the ECA method with cut-off mechanisms, utilisation of ECA in real-time CAC environments as a single-level scheme is always possible.
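To illustrate the class-based convolution idea only (a sketch under assumed parameters; a simple binomial on/off model per homogeneous class stands in for the thesis's multinomial formulation), the per-class distributions of demanded bandwidth can be multi-convolved into a global state distribution, from which a congestion probability is read off as the probability mass above the link capacity:

# Sketch of an ECA-style computation: per-class bandwidth distributions are
# built first, then multi-convolved into a global state distribution from
# which a congestion probability is obtained. All parameters are illustrative.

from math import comb

def class_distribution(n_sources, p_active, peak_rate):
    """P(demand = k*peak_rate) for one homogeneous class (on/off binomial model)."""
    dist = {}
    for k in range(n_sources + 1):
        prob = comb(n_sources, k) * p_active**k * (1 - p_active)**(n_sources - k)
        dist[k * peak_rate] = dist.get(k * peak_rate, 0.0) + prob
    return dist

def convolve(dist_a, dist_b):
    """Convolution of two independent bandwidth distributions."""
    out = {}
    for bw_a, p_a in dist_a.items():
        for bw_b, p_b in dist_b.items():
            out[bw_a + bw_b] = out.get(bw_a + bw_b, 0.0) + p_a * p_b
    return out

def congestion_probability(classes, capacity):
    """Multi-convolve all class distributions, then sum the mass above capacity."""
    total = {0.0: 1.0}
    for cls in classes:
        total = convolve(total, class_distribution(*cls))
    return sum(p for bw, p in total.items() if bw > capacity)

# Two hypothetical traffic classes: (number of sources, activity probability, peak rate)
classes = [(20, 0.3, 2.0), (5, 0.5, 10.0)]
print(congestion_probability(classes, capacity=45.0))

Grouping identical sources into a class keeps each partial distribution small, which is where the savings in accumulated calculations and storage come from.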
Abstract:
A case of long-range transport of a biomass burning plume from Alaska to Europe is analyzed using a Lagrangian approach. This plume was sampled several times in the free troposphere over North America, the North Atlantic and Europe by three different aircraft during the IGAC Lagrangian 2K4 experiment, which was part of the ICARTT/ITOP measurement intensive in summer 2004. Measurements in the plume showed enhanced values of CO, VOCs and NOy, mainly in the form of PAN. Observed O3 levels increased by 17 ppbv over 5 days. A photochemical trajectory model, CiTTyCAT, was used to examine processes responsible for the chemical evolution of the plume. The model was initialized with upwind data and compared with downwind measurements. The influence of high aerosol loading on photolysis rates in the plume was investigated using in situ aerosol measurements in the plume and lidar retrievals of optical depth as input into a photolysis code (Fast-J) run in the model. Significant impacts on photochemistry are found, with a decrease of 18% in O3 production and 24% in O3 destruction over 5 days when aerosols are included. The plume is found to be chemically active, with large O3 increases attributed primarily to PAN decomposition during descent of the plume toward Europe. The predicted O3 changes are very dependent on temperature changes during transport and also on water vapor levels in the lower troposphere, which can lead to O3 destruction. Simulation of mixing/dilution was necessary to reproduce observed pollutant levels in the plume. Mixing was simulated using background concentrations from measurements in air masses in close proximity to the plume, and mixing timescales (averaging 6.25 days) were derived from CO changes. Observed and simulated O3/CO correlations in the plume were also compared in order to evaluate the photochemistry in the model. Observed slopes change from negative to positive over 5 days. This change, which can be attributed largely to photochemistry, is well reproduced by multiple model runs, even if slope values are slightly underestimated, suggesting a small underestimation in modeled photochemical O3 production. The possible impact of this biomass burning plume on O3 levels in the European boundary layer was also examined by running the model for a further 5 days and comparing with data collected at surface sites, such as Jungfraujoch, which showed small O3 increases and elevated CO levels. The model predicts significant changes in O3 over the entire 10-day period due to photochemistry, but the signal is largely lost because of the effects of dilution. However, measurements in several other biomass burning plumes over Europe show that the O3 impact of Alaskan fires can potentially be significant over Europe.
Abstract:
Long-term monitoring of forest soils as part of a pan-European network to detect environmental change depends on an accurate determination of the mean of the soil properties at each monitoring event. Forest soil is known to be very variable spatially, however. A study was undertaken to explore and quantify this variability at three forest monitoring plots in Britain. Detailed soil sampling was carried out, and the data from the chemical analyses were analysed by classical statistics and geostatistics. An analysis of variance showed that there were no consistent effects from the sample sites in relation to the position of the trees. The variogram analysis showed that there was spatial dependence at each site for several variables and some varied in an apparently periodic way. An optimal sampling analysis based on the multivariate variogram for each site suggested that a bulked sample from 36 cores would reduce error to an acceptable level. Future sampling should be designed so that it neither targets nor avoids trees and disturbed ground. This can be achieved best by using a stratified random sampling design.
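For reference, the empirical semivariogram underlying such an analysis is the standard estimator (textbook form, not reproduced from the study):

\[
\hat{\gamma}(h) \;=\; \frac{1}{2\,|N(h)|} \sum_{(i,j)\in N(h)} \bigl(z(x_i) - z(x_j)\bigr)^2 ,
\]

where N(h) is the set of sample pairs separated by lag h; spatial dependence appears as \hat{\gamma}(h) rising with h towards a sill, and the fitted variogram model is what feeds the optimal sampling calculation.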
Abstract:
Even though we have recognized many short-term benefits of agile methods, we still know very little about their long-term effects. In this panel, we discuss the long-term perspective of agile methods. The panelists are either industrial or academic representatives. They will discuss problems and benefits related to long-term lifecycle system management in agile projects. Ideally, the panel's outcome will provide ideas for future research.
Abstract:
Long-term effects of elevated atmospheric CO2 on the biosphere have been a focus of research for the last few decades. In this experiment, undisturbed soil monoliths of loess grassland were exposed to an elevated CO2 environment (twice the ambient CO2 level) for a period of six years with the aid of the open top chamber method. A control without a chamber or CO2 elevation was applied as well. The elevated CO2 level had very little impact on the soil food web. It influenced neither root and microbial biomass nor microbial and nematode community structure. The only significant response was that the density of the bacterial-feeder genus Heterocephalobus increased in the chamber with elevated CO2 concentration. Application of the open top chambers initiated more changes in nematodes than the elevated CO2 level did. The open top chamber (OTC) method decreased nematode density (total as well as plant feeders) to less than half of the original level. A negative effect was found at the genus level for the fungal feeder Aphelenchoides and the plant feeders Helicotylenchus and Paratylenchus. It is very likely that the significantly lower belowground root biomass, and in part its decreased quality reflected by the increased C/N ratio, are the main factors responsible for the lower density of plant-feeder nematodes in the chamber plots. According to diversity profiles and the MI and MI(2-15) parameters, nematode communities in the open top chambers (at both ambient and elevated CO2 levels) seem to be more structured than those under normal circumstances six years after the start of the experiment.
Abstract:
The suitability of cryopreservation for the secure, long-term storage of the rare and endangered species Cosmos atrosanguineus was investigated. Using encapsulation/dehydration of shoot tips in alginate strips, survival rates of up to 100 % and shoot regeneration of up to 35 % were achieved. Light and electron microscopy studies indicated that cellular damage to some regions of the shoot tip during the freeze/thaw procedure was high, although cell survival in and around the meristematic region allowed shoot tip regeneration. The genetic fingerprinting technique, amplified fragment length polymorphisms (AFLPs), showed that no detectable genetic variation was present between material of C. atrosanguineus at the time of initiation into tissue culture and that which had been cryopreserved, stored in liquid nitrogen for 12 months and regenerated. Weaned plantlets that were grown under glasshouse conditions exhibited no morphological variation from non-frozen controls. (C) 2003 Annals of Botany Company.
Abstract:
We used the PCR to study the presence of two plant pathogens in archived wheat samples from a long-term experiment started in 1843. The data were used to construct a unique 160-yr time-series of the abundance of Phaeosphaeria nodorum and Mycosphaerella graminicola, two important pathogens of wheat. During the period since 1970, the relative abundance of DNA of these two pathogens in the samples has reflected the relative importance of the two wheat diseases they cause in U.K. disease surveys. Unexpectedly, changes in the ratio of the pathogens over the 160-yr period were very strongly correlated with changes in atmospheric pollution, as measured by SO2 emissions. This finding suggests that long-term, economically important, changes in pathogen populations can be influenced by anthropogenically induced environmental changes.
Abstract:
It has long been suggested that the overall shape of the antigen combining site (ACS) of antibodies is correlated with the nature of the antigen. For example, deep pockets are characteristic of antibodies that bind haptens, grooves indicate peptide binders, while antibodies that bind to proteins have relatively flat combining sites. In 1996, MacCallum, Martin and Thornton used a fractal shape descriptor and showed a strong correlation of the shape of the binding region with the general nature of the antigen. However, the shape of the ACS is determined primarily by the lengths of the six complementarity-determining regions (CDRs). Here, we make a direct correlation between the lengths of the CDRs and the nature of the antigen. In addition, we show significant differences in the residue composition of the CDRs of antibodies that bind to different antigen classes. As well as helping us to understand the process of antigen recognition, autoimmune disease and cross-reactivity, these results are of direct application in the design of antibody phage libraries and the modification of affinity. (C) 2003 Elsevier Science Ltd. All rights reserved.
Abstract:
Fully connected cubic networks (FCCNs) are a class of newly proposed hierarchical interconnection networks for multicomputer systems, which enjoy the strengths of constant node degree and good expandability. Shortest path routing in FCCNs is an open problem. In this paper, we present an oblivious routing algorithm for the n-level FCCN with N = 8^n nodes, and prove that this algorithm creates a shortest path from the source to the destination. At the cost of an O(N)-parallel-step off-line preprocessing phase and a list of size N stored at each node, the proposed algorithm is carried out at each related node in O(n) time. In some cases the proposed algorithm is superior to the one proposed by Chang and Wang in terms of the length of the routing path. This justifies the utility of our routing strategy. (C) 2006 Elsevier Inc. All rights reserved.
Abstract:
With the wide acceptance of the long-chain (LC) n-3 PUFA EPA and DHA as important nutrients playing a role in the amelioration of certain diseases, efforts to understand factors affecting intakes of these fatty acids along with potential strategies to increase them are vital. Widespread aversion to oil-rich fish, the richest natural source of EPA and DHA, highlights both the highly suboptimal current intakes in males and females across all age-groups and the critical need for an alternative supply of EPA and DHA. Poultry meat is a popular and versatile food eaten in large quantities relative to other meats and is open to increased LC n-3 PUFA content through manipulation of the chicken's diet to modify fatty acid deposition and therefore lipid composition of the edible tissues. It is therefore seen as a favourable prototype food for increasing human dietary supply of LC n-3 PUFA. Enrichment of chicken breast and leg tissue is well established using fish oil or fishmeal, but concerns about sustainability have led to recent consideration of algal biomass as an alternative source of LC n-3 PUFA. Further advances have also been made in the quality of the resulting meat, including achieving acceptable flavour and storage properties as well as understanding the impact of cooking on the retention of fatty acids. Based on these considerations it may be concluded that EPA- and DHA-enriched poultry meat has a very positive potential future in the food chain.
Abstract:
This work investigates the optimum decision delay and tap-length of the finite-length decision feedback equalizer. First we show that, if the feedback filter (FBF) length Nb is equal to or larger than the channel memory v and the decision delay Δ is smaller than the feedforward filter (FFF) length Nf, then only the first Δ+1 elements of the FFF can be nonzero. Based on this result we prove that the maximum effective FBF length is equal to the channel memory v, and if Nb ≥ v and Nf is long enough, the optimum decision delay that minimizes the MMSE is Nf-1.
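For context, a standard finite-length MMSE-DFE formulation (common textbook notation, not reproduced from the paper) makes the roles of Nf, Nb and Δ explicit:

\[
\hat{x}[n-\Delta] \;=\; \sum_{i=0}^{N_f-1} f_i\, y[n-i] \;-\; \sum_{j=1}^{N_b} b_j\, \tilde{x}[n-\Delta-j],
\]

where y[n] is the output of a channel with memory v, f_i are the feedforward taps, b_j are the feedback taps acting on past decisions \tilde{x}, and the taps are chosen to minimize E\{|x[n-\Delta]-\hat{x}[n-\Delta]|^2\}. In this notation, the results above state that if N_b \ge v and \Delta < N_f, only the first \Delta+1 feedforward taps f_0, ..., f_\Delta can be nonzero, and for sufficiently long N_f the MMSE-optimal decision delay is \Delta = N_f - 1.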