979 results for codes over rings
Abstract:
Seventy male lambs over 10 weeks of age were castrated using Burdizzo, rubber rings, or surgery to assess the acute and long-term effects of castration. All castrations were performed under local anaesthesia. The surgically castrated lambs were additionally sedated with xylazine, and the sedation was reversed with tolazoline. The frequency of abnormal postures and immediate behavioural responses indicated that the surgically castrated lambs were the most distressed; the responses of lambs castrated using Burdizzo or rubber rings did not differ markedly from those of the control group. Between 1.5 and 9 h after castration, signs of pain and distress were at a lower level in lambs anaesthetised with bupivacaine than in those treated with lidocaine. Owing to its markedly faster wound healing and fewer signs of long-term pain, Burdizzo castration appeared preferable to the rubber ring technique. It was concluded that local anaesthesia with bupivacaine, followed by the Burdizzo method, is the preferable technique for the castration of lambs older than 10 weeks of age.
Abstract:
In a first step to obtain a proxy record of past climatic events (including the El Niño-Southern Oscillation) in the normally aseasonal tropical environment of Sabah, a radial segment from a recently fallen dipterocarp (Shorea superba) was radiocarbon dated and subjected to carbon isotope analysis. The high-precision radiocarbon results fell into the ambiguous modern plateau, where several calibrated dates can exist for each sample. Dating was achieved by wiggle matching using a Bayesian approach to calibration. Using the defined growth characteristics of Shorea superba, probability density distributions were calculated and improbable dates rejected. It was found that the tree most likely started growing around AD 1660-1685. A total of 173 apparent growth increments were measured and, therefore, it could be determined that the tree formed one ring approximately every two years. Stable carbon isotope values were obtained from resin-extracted wholewood from each ring. Carbon cycling is evident in the 'juvenile effect', resulting from the assimilation of respired carbon dioxide and lower light levels below the canopy, and in the 'anthropogenic effect' caused by increased industrial activity in the late-nineteenth and twentieth centuries. This study demonstrates that palaeoenvironmental information can be obtained from trees growing in aseasonal environments, where climatic conditions prevent the formation of well-defined annual rings.
Abstract:
Glacier fluctuations are a key indicator of changing climate. Their reconstruction beyond historical times unravels glacier variability and its forcing factors on long time scales, which can considerably improve our understanding of the climate–glacier relationship. Here, we present a 2250-year-long reconstruction of particle-mass accumulation rates recorded in the lacustrine sediments of Lake Trüebsee (Central Swiss Alps) that are directly related to glacier extent, thus reflecting a continuous record of fluctuations of the upstream-located Titlis Glacier. Mass accumulation rate values show strong centennial to multi-centennial fluctuations and reveal 12 well-pronounced periods of enhanced values corresponding to times of maximum extent of the neighboring Lower Grindelwald Glacier. This result supports previous studies of proglacial lake sediments that documented high mass accumulation rate values during glacier advances. The strong variability in the Lake Trüebsee mass accumulation rate record thus represents a highly sensitive paleoclimatic archive, which mirrors rapid and pronounced feedbacks of Titlis Glacier to climatic changes over the past 2250 years. The comparison of our data with independent paleo-temperature reconstructions from tree rings suggests that variations in mean summer temperature were the primary driving factor of fluctuations of Titlis Glacier. Also, advances of Titlis Glacier occurred during the grand solar minima (Dalton, Maunder, Spörer, Wolf) of the last millennium. This relation of glacier extent with summer temperature reveals strong evidence that the mass balance of this Alpine glacier is primarily controlled by the intensity of glacier melting during summer.
Abstract:
In equatorial regions, where tree rings are less distinct or even absent, the response of forests to high-frequency climate variability is poorly understood. We measured stable carbon and oxygen isotopes in anatomically distinct, annual growth rings of four Pericopsis elata trees from a plantation in the Congo Basin, to assess their sensitivity to recorded changes in precipitation over the last 50 y. Our results suggest that oxygen isotopes have high common signal strength (EPS = 0.74) and respond to multi-annual precipitation variability at the regional scale, with low δ18O values (28–29‰) during wetter conditions (1960–1970). Conversely, δ13C values are mostly related to growth variation, which in a light-demanding species is driven by competition for light. Differences in δ13C values between fast- and slow-growing trees (c. 2‰) result in low common signal strength (EPS = 0.37) and are driven by micro-site conditions rather than by climate. This study highlights the potential for understanding the causes of growth variation in P. elata, as well as past hydroclimatic changes, in a climatically complex region characterized by a bimodal distribution in precipitation.
Abstract:
The development of a global instability analysis code coupling a time-stepping approach, as applied to the solution of BiGlobal and TriGlobal instability analysis [1, 2], with finite-volume-based spatial discretization, as used in standard aerodynamics codes, is presented. The key advantage of the time-stepping method over matrix-formulation approaches is that the former avoids the computer-storage issues associated with the latter methodology. To date, both approaches have been used successfully to analyse instability in complex geometries, although their relative advantages have never been quantified. The ultimate goal of the present work is to address this issue in the context of spatial discretization schemes typically used in industry. The time-stepping approach of Chiba [3] has been implemented in conjunction with two direct numerical simulation algorithms, one based on the high-order methods typically used in this context and another based on low-order methods representative of those in common use in industry. The two codes have been validated against solutions of the BiGlobal eigenvalue problem (EVP), and it has been shown that small errors in the base flow do not significantly affect the results. As a result, a three-dimensional compressible unsteady second-order code for global linear stability analysis has been successfully developed, based on finite-volume spatial discretization and the time-stepping method, with the ability to study complex geometries by means of unstructured and hybrid meshes.
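The storage advantage of time-stepping over matrix formation described in this abstract can be illustrated with a minimal matrix-free sketch (an illustrative toy, not the authors' code): the leading eigenvalue of a linear operator is estimated purely from repeated applications of a "time-step" routine, so the operator matrix is never assembled or stored. The diagonal operator below is a stand-in assumption for a time-step propagator.

```python
import numpy as np

def leading_eigenvalue(apply_op, n, iters=500, seed=0):
    """Power iteration using only the action of the operator (matrix-free),
    as in time-stepping stability analysis: no Jacobian is ever stored."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    lam = 0.0
    for _ in range(iters):
        w = apply_op(v)        # one "time step": action of the operator on v
        lam = v @ w            # Rayleigh-quotient estimate of the eigenvalue
        v = w / np.linalg.norm(w)
    return lam

# Hypothetical stand-in operator: a small diagonal one-step map.
A = np.diag([0.9, 0.5, 0.1])
lam = leading_eigenvalue(lambda v: A @ v, 3)
print(round(lam, 6))           # dominant eigenvalue: 0.9
```

For a real stability code, `apply_op` would be a call into the flow solver advancing a perturbation over one period; the point of the sketch is that only vectors of size n are held in memory.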
Abstract:
Within the last century, interest in wind-induced loads on civil engineering structures has become more and more important, the reason being that the development of new construction techniques and materials has allowed engineers and architects to design new structures far from the traditional concepts, and in many cases wind actions on these singular structures are not included in the existing codes of practice. In this paper, the wind-induced static loads on bridges constructed by the double-cantilever method during erection stages are considered. The aerodynamic load on a double-cantilever bridge under a yawing-angled wind produces a yawing (torsional) moment on the bridge deck, which can lead to undesirable rotation of the deck about the supporting pier. The effects of the wind yaw angle and the length of the deck are analysed. The wind action caused by the presence of sliding concrete forms at the ends of the deck is also studied.
Abstract:
Multielectrode recording techniques were used to record ensemble activity from 10 to 16 simultaneously active CA1 and CA3 neurons in the rat hippocampus during performance of a spatial delayed-nonmatch-to-sample task. Extracted sources of variance were used to assess the nature of two different types of errors that accounted for 30% of total trials. The two types of errors included ensemble “miscodes” of sample phase information and errors associated with delay-dependent corruption or disappearance of sample information at the time of the nonmatch response. Statistical assessment of trial sequences and associated “strength” of hippocampal ensemble codes revealed that miscoded error trials always followed delay-dependent error trials in which encoding was “weak,” indicating that the two types of errors were “linked.” It was determined that the occurrence of weakly encoded, delay-dependent error trials initiated an ensemble encoding “strategy” that increased the chances of being correct on the next trial and avoided the occurrence of further delay-dependent errors. Unexpectedly, the strategy involved “strongly” encoding response position information from the prior (delay-dependent) error trial and carrying it forward to the sample phase of the next trial. This produced a miscode type error on trials in which the “carried over” information obliterated encoding of the sample phase response on the next trial. Application of this strategy, irrespective of outcome, was sufficient to reorient the animal to the proper between trial sequence of response contingencies (nonmatch-to-sample) and boost performance to 73% correct on subsequent trials. The capacity for ensemble analyses of strength of information encoding combined with statistical assessment of trial sequences therefore provided unique insight into the “dynamic” nature of the role hippocampus plays in delay type memory tasks.
Abstract:
Other editions published under title: 1895: Forms and precedents; 1918: Form book; 1933: Forms.
Abstract:
Allegory is not obsolete, as Samuel Coleridge and Johann Wolfgang von Goethe claimed. It is alive and well, having transformed from a restrictive concept into a flexible one that can be shaped to meet the needs of the author or reader. The most efficient way to demonstrate this is through a case study of a suitable work that allows us to perceive this plasticity. This essay uses J.R.R. Tolkien's The Lord of the Rings as a multi-perspective case study of the concept of allegory; the size and complexity of the narrative make it a suitable choice. My aim is to illustrate the plasticity of allegory as a concept and to illuminate some of the possibilities and pitfalls of allegory and allegoresis. To determine whether The Lord of the Rings can be treated as an allegory, it is examined from three different perspectives: as a purely writerly process, as a middle ground between writer and reader, and as a purely readerly process. The Lord of the Rings is then compared with a series of concepts from allegorical theory, such as Plato's classical "The Ring of Gyges", William Langland's classic The Vision of William Concerning Piers the Plowman, and contemporary allegories of racism and homoeroticism, to demonstrate just how adaptable the concept is. The position of this essay is that the concept of allegory has changed over time since its conception and become more malleable. This poses certain dangers: allegory has become an all-round tool with few limitations, one that has lost its early rigid form and now favours an almost anything-goes approach.
Abstract:
We review recent theoretical progress on the statistical mechanics of error correcting codes, focusing on low-density parity-check (LDPC) codes in general, and on Gallager and MacKay-Neal codes in particular. By exploiting the relation between LDPC codes and Ising spin systems with multispin interactions, one can carry out a statistical mechanics based analysis that determines the practical and theoretical limitations of various code constructions, corresponding to dynamical and thermodynamical transitions, respectively, as well as the behaviour of error-exponents averaged over the corresponding code ensemble as a function of channel noise. We also contrast the results obtained using methods of statistical mechanics with those derived in the information theory literature, and show how these methods can be generalized to include other channel types and related communication problems.
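The LDPC–Ising correspondence this review exploits can be made concrete in a small sketch (a generic toy, not a construction from the text): each bit x ∈ {0, 1} maps to a spin σ = (−1)^x, and each parity check becomes a multispin interaction contributing −∏σ to the energy, so a configuration satisfying every check sits in the ground state. The parity-check matrix below is an assumption for illustration.

```python
import numpy as np

# Toy parity-check matrix H (rows = checks, cols = bits); an assumption.
H = np.array([[1, 1, 0, 1],
              [0, 1, 1, 1]])

def ising_energy(bits):
    """Energy under the multispin Hamiltonian E = -sum over checks of
    prod_{i in check} sigma_i, with sigma_i = (-1)^bit_i."""
    sigma = 1 - 2 * bits                      # bit 0 -> spin +1, bit 1 -> -1
    return -sum(np.prod(sigma[row == 1]) for row in H)

codeword = np.array([1, 0, 1, 1])             # satisfies both checks mod 2
assert np.all(H @ codeword % 2 == 0)
print(ising_energy(codeword))                 # ground state: energy -2
print(ising_energy(np.array([1, 1, 1, 1])))   # both checks violated: +2
```

Channel noise then acts as a random field biasing the spins, and decoding becomes a search for low-energy configurations, which is what makes the statistical-mechanics toolbox applicable.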
Abstract:
We obtain phase diagrams of regular and irregular finite-connectivity spin glasses. Contact is first established between properties of the phase diagram and the performance of low-density parity check (LDPC) codes within the replica symmetric (RS) ansatz. We then study the location of the dynamical and critical transition points of these systems within the one-step replica symmetry breaking (RSB) theory, extending similar calculations that have been performed in the past for the Bethe spin-glass problem. We observe that the location of the dynamical transition line does change within the RSB theory, in comparison with the results obtained in the RS case. For LDPC decoding of messages transmitted over the binary erasure channel we find, at zero temperature and rate R = 1/4, an RS critical transition point at pc ≈ 0.67, while the critical RSB transition point is located at pc ≈ 0.7450 ± 0.0050, to be compared with the corresponding Shannon bound 1 - R. For the binary symmetric channel we show that the low-temperature reentrant behavior of the dynamical transition line, observed within the RS ansatz, changes its location when the RSB ansatz is employed; the dynamical transition point occurs at higher values of the channel noise. Possible practical implications to improve the performance of the state-of-the-art error correcting codes are discussed. © 2006 The American Physical Society.
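For orientation, the Shannon bound quoted in this abstract is a one-line computation (numbers copied from the abstract itself): over the binary erasure channel, a code of rate R can at best tolerate an erasure probability of 1 − R.

```python
R = 1 / 4                 # code rate from the abstract
shannon_bound = 1 - R     # maximum tolerable erasure probability over the BEC
p_rs = 0.67               # RS critical transition point (from the abstract)
p_rsb = 0.7450            # RSB critical transition point (from the abstract)
print(shannon_bound)      # 0.75
# The RSB estimate sits closer to the Shannon bound than the RS one does:
assert p_rs < p_rsb < shannon_bound
```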
Abstract:
Typical properties of sparse random matrices over finite (Galois) fields are studied, in the limit of large matrices, using techniques from the physics of disordered systems. For the case of a finite field GF(q) with prime order q, we present results for the average kernel dimension, average dimension of the eigenvector spaces and the distribution of the eigenvalues. The number of matrices for a given distribution of entries is also calculated for the general case. The significance of these results to error-correcting codes and random graphs is also discussed.
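The kernel dimension studied here can be sampled directly in a toy setting (restricted to GF(2), i.e. q = 2, with dense Gaussian elimination rather than anything optimized; the matrix size and entry density are assumptions for illustration):

```python
import numpy as np

def gf2_rank(M):
    """Rank of a binary matrix over GF(2) by Gaussian elimination."""
    M = M.copy()
    rank = 0
    rows, cols = M.shape
    for c in range(cols):
        pivot = next((r for r in range(rank, rows) if M[r, c]), None)
        if pivot is None:
            continue                      # no pivot in this column
        M[[rank, pivot]] = M[[pivot, rank]]
        for r in range(rows):
            if r != rank and M[r, c]:
                M[r] ^= M[rank]           # eliminate over GF(2)
        rank += 1
    return rank

rng = np.random.default_rng(1)
n = 30
# Sparse random square matrices: each entry is 1 with probability 3/n.
ranks = [gf2_rank((rng.random((n, n)) < 3 / n).astype(np.uint8))
         for _ in range(50)]
avg_kernel_dim = n - np.mean(ranks)       # kernel dim = n - rank
print(avg_kernel_dim)                      # sampled average kernel dimension
```

The paper computes such averages analytically in the large-matrix limit; the sketch just shows what quantity is being averaged.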
Abstract:
There is a growing demand for data transmission over digital networks involving mobile terminals. An important class of data required for transmission to mobile terminals is image information such as street maps, floor plans and identikit images. This sort of transmission is of particular interest to service organisations such as the police force, fire brigade, medical services and others. Such transmissions cannot be carried out directly over mobile terminals because of the limited capacity of mobile channels and the transmission errors caused by multipath (Rayleigh) fading. In this research, the transmission of line diagram images such as floor plans and street maps over digital networks involving mobile terminals, at rates of 2400 bits/s and 4800 bits/s, has been studied. A low bit-rate source encoding technique using geometric codes is found to be suitable for representing line diagram images. In geometric encoding, the amount of data required to represent or store a line diagram image is proportional to the image detail; thus a simple line diagram image requires only a small amount of data. To study the effect of transmission errors due to mobile channels on the transmitted images, error sources (error files) representing mobile channels under different conditions have been produced using channel modelling techniques. Satisfactory models of the mobile channel have been obtained when compared with field test measurements. Subjective performance tests have been carried out to evaluate the quality and usefulness of the received line diagram images under various mobile channel conditions, and the effect of mobile transmission errors on the quality of the received images has been determined. To improve the quality of the received images under various mobile channel conditions, forward error correcting (FEC) codes with interleaving and automatic repeat request (ARQ) schemes have been proposed.
The performance of the error control codes has been evaluated under various mobile channel conditions. It has been shown that a FEC code with interleaving can be used effectively to improve the quality of the received images under normal and severe mobile channel conditions. Under normal channel conditions, similar results have been obtained when using ARQ schemes. However, under severe mobile channel conditions, the FEC code with interleaving shows better performance.
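Why interleaving helps against the burst errors of a fading channel can be shown with a minimal sketch (a generic block interleaver, not the thesis's specific scheme): symbols are written into a matrix by rows and read out by columns, so a contiguous burst on the channel is spread across many codewords after deinterleaving.

```python
def interleave(symbols, rows, cols):
    """Block interleaver: write row-by-row, read column-by-column."""
    assert len(symbols) == rows * cols
    matrix = [symbols[r * cols:(r + 1) * cols] for r in range(rows)]
    return [matrix[r][c] for c in range(cols) for r in range(rows)]

def deinterleave(symbols, rows, cols):
    """Inverse operation: swap the roles of rows and columns."""
    return interleave(symbols, cols, rows)

data = list(range(12))
tx = interleave(data, 3, 4)
# A 3-symbol burst hits the channel (None marks a corrupted symbol)...
rx = tx.copy()
for i in (4, 5, 6):
    rx[i] = None
recovered = deinterleave(rx, 3, 4)
# ...but after deinterleaving the corrupted symbols land in different
# 4-symbol blocks, so an FEC code correcting one error per block fixes all.
print(recovered)    # [0, 1, None, 3, 4, None, 6, 7, 8, None, 10, 11]
```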
Abstract:
In this thesis we use statistical physics techniques to study the typical performance of four families of error-correcting codes based on very sparse linear transformations: Sourlas codes, Gallager codes, MacKay-Neal codes and Kanter-Saad codes. We map the decoding problem onto an Ising spin system with many-spin interactions. We then employ the replica method to calculate averages over the quenched disorder represented by the code constructions, the arbitrary messages and the random noise vectors. We find, as the noise level increases, a phase transition between successful decoding and failure phases. This phase transition coincides with upper bounds derived in the information theory literature in most of the cases. We connect the practical decoding algorithm known as probability propagation with the task of finding local minima of the related Bethe free-energy. We show that the practical decoding thresholds correspond to noise levels where suboptimal minima of the free-energy emerge. Simulations of practical decoding scenarios using probability propagation agree with theoretical predictions of the replica symmetric theory. The typical performance predicted by the thermodynamic phase transitions is shown to be attainable in computation times that grow exponentially with the system size. We use the insights obtained to design a method to calculate the performance and optimise parameters of the high performance codes proposed by Kanter and Saad.
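On the erasure channel, probability propagation (belief propagation) reduces to a simple "peeling" rule that a small sketch can capture (a toy code chosen for illustration, not one of the ensembles studied in the thesis): any parity check touching exactly one erased bit determines that bit as the XOR of the check's known bits.

```python
import numpy as np

# Toy parity-check matrix (rows = checks, cols = bits); an assumption.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 0, 0, 1, 1]])

def peel_decode(received):
    """Erasure decoding by peeling: a check with exactly one erased bit
    fixes that bit so the check's parity comes out even."""
    bits = dict(enumerate(received))          # bit value, or None if erased
    progress = True
    while progress:
        progress = False
        for row in H:
            idx = np.flatnonzero(row)
            unknown = [i for i in idx if bits[i] is None]
            if len(unknown) == 1:
                known = [bits[i] for i in idx if bits[i] is not None]
                bits[unknown[0]] = sum(known) % 2
                progress = True
    return [bits[i] for i in range(len(received))]

# All-zero codeword with bits 1 and 4 erased: both are recovered.
print(peel_decode([0, None, 0, 0, None, 0]))  # [0, 0, 0, 0, 0, 0]
```

Decoding fails when no check has a single erasure left; those stopping configurations are the algorithmic counterpart of the suboptimal free-energy minima discussed above.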
Abstract:
Focal points: ICD-10 codings and spontaneous yellow card reports for warfarin toxicity were compared retrospectively over a one-year period. Eighteen cases of ICD-10-coded warfarin toxicity were identified from a total of 55,811 coded episodes. More than three times as many ADRs to warfarin were found by screening ICD-10 codes as were reported spontaneously using the yellow card scheme. Valuable information is being lost to regulatory authorities; as recognised reporters to the yellow card scheme, pharmacists are well placed to report these ADRs, enhancing their role in the safe and appropriate prescribing of warfarin.