953 results for stages of anesthesia
Abstract:
In closed-die forging the flash geometry should be such as to ensure that the cavity is completely filled just as the two dies come into contact at the parting plane. If metal is caused to extrude through the flash gap as the dies approach the point of contact — a practice generally resorted to as a means of ensuring complete filling — the dies are unnecessarily stressed in a high-stress regime (the flash being quite thin and possibly cooled by then), which reduces die life and unnecessarily increases the energy requirement of the operation. It is therefore necessary to determine carefully the dimensions of the flash land and the flash thickness — the two parameters, apart from friction at the land, which control the lateral flow. The dimensions should be such that the flow into the longitudinal cavity is controlled throughout the operation, ensuring complete filling just as the dies touch at the parting plane. The design of the flash must be related to the shape and size of the forging cavity, as the control of flow has to be exercised throughout the operation: this is possible if the mechanics of how the lateral extrusion into the flash takes place is understood for specific cavity shapes and sizes. The work reported here is part of an ongoing programme investigating flow in closed-die forging. A simple closed shape (no longitudinal flow), which may correspond to the last stages of a real forging operation, is analysed using the stress equilibrium approach. Metal from the cavity (flange) flows into the flash by shearing in the cavity in one of the three modes considered here; for a given cavity, the mode with the least energy requirement is assumed to be the most realistic. On this basis a map has been developed which, given the depth and width of the cavity as well as the flash thickness, tells the designer the most likely mode (of the three considered) in which metal in the cavity will shear and then flow into the flash gap.
The results of a limited set of experiments, reported herein, validate this method of selecting the optimum mode of flow into the flash gap.
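The least-energy selection rule described above can be sketched as a small routine. The three energy expressions below are hypothetical placeholders, not the published stress-equilibrium results (which the abstract does not give); only the selection logic itself is taken from the text.

```python
# Sketch of the least-energy mode selection described above. The three
# energy expressions are hypothetical placeholders, NOT the published
# stress-equilibrium results.

def select_shear_mode(cavity_depth, cavity_width, flash_thickness):
    """Return the shear mode (1, 2 or 3) with the lowest energy estimate."""
    energies = {
        1: cavity_depth * cavity_width,            # placeholder: shallow-cavity shear
        2: cavity_depth ** 2 / flash_thickness,    # placeholder: deep shear band
        3: cavity_width * flash_thickness * 10.0,  # placeholder: surface shear
    }
    return min(energies, key=energies.get)

# The designer's "map" is this selection evaluated over a grid of cavity
# depths and widths for each flash thickness of interest.
mode = select_shear_mode(cavity_depth=20.0, cavity_width=50.0, flash_thickness=2.0)
```

With real energy expressions substituted in, evaluating this over a grid of cavity geometries reproduces the kind of designer's map the abstract describes.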
Abstract:
The hypothesis that contaminant plants growing amongst chickpea serve as Helicoverpa sinks by diverting oviposition pressure away from the main crop was tested under field conditions. Gain (recruitment) and loss (presumed mortality) of juvenile stages of Helicoverpa spp. on contaminant faba bean and wheat plants growing in chickpea plots were quantified on a daily basis over a 12-d period. The possibility of post-eclosion movement of larvae from the contaminants to the surrounding chickpea crop was examined. Estimated total loss of the census population varied from 80 to 84% across plots and rows. The loss of brown eggs (40–47%) contributed most to the overall loss estimate, followed by the loss of white eggs (27–35%) and larvae (6–9%). The cumulative numbers of individuals entering the white egg, brown egg and larval stages over the census period ranged from 15 to 58, from 10 to 48 and from 1 to 6 per metre of row, respectively. The corresponding estimates of mean stage-specific loss, expressed as a percentage of individuals entering the stage, ranged from 52 to 57% for white eggs, from 87 to 108% for brown eggs and from 71 to 87% for first-instar larvae. Mean larval density on chickpea plants in close proximity to the contaminant plants did not exceed the baseline larval density on chickpea further away from the contaminants across rows and plots. The results support the hypothesis that contaminant plants in chickpea plots serve as Helicoverpa sinks by diverting egg pressure from the main crop and elevating mortality of juvenile stages. Deliberate contamination of chickpea crops with other plant species merits further investigation as a cultural pest management strategy for Helicoverpa spp.
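The stage-specific loss figures quoted above are losses expressed as a percentage of recruitment into each stage. A minimal sketch of one simplified way such bookkeeping can be done (the counts are made up for illustration, not the field data, and the study's census method is more involved):

```python
def stage_loss_percent(entered, advanced):
    """Loss within a stage, as a percentage of individuals entering it.

    entered  - cumulative number recruited into the stage
    advanced - number observed entering the next stage
    Losses above 100% can arise when recruitment into the stage is
    underestimated, as with the brown-egg figures reported above.
    """
    return 100.0 * (entered - advanced) / entered

# Hypothetical counts per metre of row (illustrative, not the field data):
white_eggs, brown_eggs, larvae = 40, 18, 2
white_loss = stage_loss_percent(white_eggs, brown_eggs)  # 55.0
brown_loss = stage_loss_percent(brown_eggs, larvae)
```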
Abstract:
The application of attenuated vaccines for the prevention of chicken coccidiosis has increased exponentially in recent years. In Eimeria infections, protective immunity is thought to rely on a strong cell mediated response with antibodies supposedly playing a minor role. However, under certain conditions antibodies seem to be significant in protection. Furthermore, antibodies could be useful for monitoring natural exposure of flocks to Eimeria spp. and for monitoring the infectivity of live vaccines. Our objective was to investigate the chicken antibody response to the different parasite lifecycle stages following infection with an attenuated strain of Eimeria tenella. Western blotting analysis of parasite antigens prepared from the lining of caeca infected with the attenuated strain of E. tenella revealed two dominant antigens of 32 and 34 kDa, apparently associated with trophozoites and merozoites that were present at high concentrations between 84 and 132 h post-infection. When cryosections of caeca infected with E. tenella were probed with IgY purified from immune birds the most intense reaction was observed with the asexual stages. Western blotting analysis of proteins of purified sporozoites and third generation merozoites and absorption of stage-specific antibodies from sera suggested that a large proportion of antigens is shared by the two stages. The time-courses of the antibody response to sporozoite and merozoite antigens were similar but varied depending on the inoculation regime and the degree of oocyst recirculation.
Abstract:
Time to first root in cuttings varies under different environmental conditions and understanding these differences is critical for optimizing propagation of commercial forestry species. Temperature environment (15, 25, 30 or 35 ± 2 °C) had no effect on the cellular stages of root formation of the Slash × Caribbean Pine hybrid over 16 weeks, as determined by histology. Initially callus cells formed in the cortex, then tracheids developed and formed primordia leading to external roots. However, the speed of development followed a growth curve, with the fastest development occurring at 25 °C and the slowest at 15 °C, with rooting percentages at week 12 of 80% and 0% respectively. Cutting survival was good in the three cooler temperature regimes (>80%) but was reduced to 59% at 35 °C. Root formation appeared to be dependent on the initiation of tracheids, because all un-rooted cuttings had callus tissue but no tracheids, irrespective of temperature treatment and clone.
Abstract:
Vegetative propagation programs internationally are affected by the significant decline in rooting success as trees mature. This study compared the cellular stages of root formation in stem cuttings from 15-week-old (juvenile) and 9-y-old (mature) stock plants of the Slash × Caribbean pine hybrid (Pinus elliottii var. elliottii × P. caribaea var. hondurensis). The cellular stages of root formation were the same in both juvenile and mature cuttings, beginning with cell divisions of the vascular cambium forming callus tissue. Within the callus, tracheids differentiated and elongated to form root primordia. Roots in juvenile cuttings developed faster than those in mature cuttings, and the juvenile cuttings had a much higher rooting percentage at the end of the study (92% and 26% respectively). Cuttings of the two juvenile genotypes had more primary roots (5.5 and 3.3) than the three mature genotypes (0.96, 0.18 and 0.07). The roots of juvenile cuttings were more evenly distributed around the basal circumference when compared with those on cuttings from the mature genotypes. Further work is needed to improve understanding of the physiological changes that occur with maturation so that the rooting success and the speed of development in cuttings from mature stock plants can be optimised, hence improving genetic gain.
Abstract:
Until August 2007, Australia was one of only three countries internationally recognised to be free of equine influenza (EI). This report documents the diagnosis of the first cases of EI in Australian horses and summarises the investigations that took place over the next 5 days. During that time, a multifocal outbreak was identified across eastern New South Wales and south-eastern Queensland. The use of an influenza type A pan-reactive real-time reverse transcription polymerase chain reaction allowed rapid confirmation of suspect cases of EI.
Abstract:
Fumigation of stored grain with phosphine (PH3) is used widely to control the lesser grain borer Rhyzopertha dominica. However, development of high-level resistance to phosphine in this species threatens control. Effective resistance management relies on knowledge of the expression of resistance in relation to dosage at all life stages. Therefore, we determined the mode of inheritance of phosphine resistance and the strength of the resistance phenotype at each developmental stage. We achieved this by comparing mortality and developmental delay between a strongly resistant strain (R-strain), a susceptible strain (S-strain) and their F1 progenies. Resistance was a maternally inherited, semi-dominant trait in the egg stage but was inherited as an autosomal, incompletely recessive trait in larvae and pupae. The rank order of developmental tolerance in both the sensitive and resistant strains was eggs > pupae > larvae. Comparison of published values for the response of adult R. dominica relative to our results from immature stages reveals that the adult stage of the S-strain is more sensitive to phosphine than are larvae. This situation is reversed in the R-strain, as the adult stage is much more resistant to phosphine than even the most tolerant immature stage. Phosphine resistance factors at LC50 were 400× for eggs, 87× for larvae and 181× for pupae with respect to reference susceptible strain (S-strain) adults, indicating that tolerance conferred by a particular immature stage neither strongly nor reliably interacts with the genetic resistance element. Developmental delay relative to unfumigated control insects was observed in 93% of resistant pupae, 86% of resistant larvae and 41% of resistant eggs. Both the increased delay in development and the toxicity response to phosphine exposure were incompletely recessive. We show that resistance to phosphine has pleiotropic effects and that the expression of these effects varies with genotype and throughout the life history of the insect.
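The resistance factors quoted at LC50 are simple ratios against the susceptible-adult baseline. A minimal sketch of that calculation; the LC50 values below are hypothetical placeholders chosen only so the ratios reproduce the stated factors, not the study's measured concentrations:

```python
def resistance_factor(lc50_test, lc50_reference):
    """Fold-resistance at LC50 relative to a susceptible reference."""
    return lc50_test / lc50_reference

# Hypothetical LC50 values (arbitrary units), chosen only so the ratios
# reproduce the stated factors of 400x (eggs), 87x (larvae), 181x (pupae):
lc50_s_adult = 0.01
factors = {
    "egg":   resistance_factor(4.00, lc50_s_adult),
    "larva": resistance_factor(0.87, lc50_s_adult),
    "pupa":  resistance_factor(1.81, lc50_s_adult),
}
```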
Abstract:
BACKGROUND: Our aim was to ascertain the potential of sulfuryl fluoride (SF) as an alternative fumigant to manage phosphine-resistant pests. We tested the susceptibility of all life stages of the red flour beetle, Tribolium castaneum (Herbst), to SF and assessed the presence of cross-resistance to this fumigant in phosphine-resistant strains of this species. RESULTS: Analysis of dose–response data indicated that the egg was the stage most tolerant to SF under a 48 h exposure period. At LC50, eggs were 29 times more tolerant than the other immature stages and adults, and required a relatively high concentration of 48.2 mg L⁻¹ for complete mortality. No significant differences in tolerance to SF were observed among the three larval instars, pupae and adults, and all of these stages were controlled at a low concentration of 1.32 mg L⁻¹. Phosphine-resistant strains did not show cross-resistance to SF. CONCLUSION: Our research concluded that the current maximum registered rate of SF, 1500 g h m⁻³, is adequate to control all the post-embryonic life stages of T. castaneum over a 48 h fumigation period, but it will fail to achieve complete mortality of eggs, indicating the risk of some egg survival under this short exposure period. As there is no cross-resistance to SF in phosphine-resistant insects, SF can play a key role in managing phosphine resistance in stored-grain insect pests.
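Assuming the registered rate of 1500 g h m⁻³ is a concentration × time (CT) product, as its units suggest, the conclusion about eggs reduces to a comparison of CT values, since 1 mg L⁻¹ equals 1 g m⁻³. A sketch of that arithmetic:

```python
def ct_product(concentration_mg_per_l, hours):
    """Concentration x time product in g h m^-3 (1 mg/L equals 1 g/m^3)."""
    return concentration_mg_per_l * hours

MAX_REGISTERED_CT = 1500.0  # g h m^-3, maximum registered SF rate

# CT needed for complete mortality over a 48 h exposure (from the text):
ct_eggs  = ct_product(48.2, 48)   # eggs: exceeds the registered rate
ct_other = ct_product(1.32, 48)   # other stages: well within the rate

eggs_controlled   = ct_eggs <= MAX_REGISTERED_CT   # False: eggs may survive
others_controlled = ct_other <= MAX_REGISTERED_CT  # True
```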
Abstract:
Chronic kidney disease (CKD) is increasing globally; in Saudi Arabia the dialysis population grows by approximately 8% annually. CKD is associated with a high symptom burden. Previous studies have largely reported on the prevalence of symptoms only in the haemodialysis population. This study examined symptom burden across disease stages and treatment groups in advanced CKD, and its correlation with demographic and clinical factors. Using a cross-sectional design, a convenience sample of 436 patients with CKD was recruited from three hospitals in Saudi Arabia. The CKD Symptom Burden Index (CKD-SBI) was used to measure 32 CKD symptoms. Demographic and clinical data were also collected. Of the sample, 75.5% were receiving dialysis (haemodialysis, n = 287; peritoneal dialysis, n = 42) and 24.5% were non-dialysis (CKD stage 4, n = 69; CKD stage 5, n = 38). The average number of symptoms reported was 13.01 ± 7.67. Fatigue and pain were common and burdensome across all symptom dimensions. Approximately one-third of participants experienced sexual symptoms. Dialysis patients reported greater symptom burden, especially patients on haemodialysis. Haemodialysis treatment, older age and being female were independently associated with greater total symptom burden. In conclusion, symptom burden is high in the advanced stages of CKD, particularly among those receiving dialysis. Although fatigue, pain and sexual dysfunction are key contributors to symptom burden in CKD, these symptoms are often under-recognised and warrant routine assessment. The CKD-SBI offers a valuable tool for assessing symptom burden, enabling the commencement of timely and appropriate interventions.
Abstract:
Several hypnosis monitoring systems based on the processed electroencephalogram (EEG) have been developed for use during general anesthesia. The assessment of the analgesic component (antinociception) of general anesthesia is an emerging field of research. This study investigated the interaction of hypnosis and antinociception, the association of several physiological variables with the degree of intraoperative nociception, and aspects of EEG Bispectral Index Scale (BIS) monitoring during general anesthesia. In addition, EEG features and heart rate (HR) responses during desflurane and sevoflurane anesthesia were compared. A propofol bolus of 0.7 mg/kg was more effective than an alfentanil bolus of 0.5 mg in preventing the recurrence of movement responses during uterine dilatation and curettage (D&C) after a propofol-alfentanil induction combined with nitrous oxide (N2O). HR and several HR variability-, frontal electromyography (fEMG)-, pulse plethysmography (PPG)-, and EEG-derived variables were associated with surgery-induced movement responses. Movers were discriminated from non-movers mostly by the post-stimulus values per se, or by these values normalized with respect to the pre-stimulus values. In logistic regression analysis, the best classification performance was achieved with the combination of normalized fEMG power and HR during D&C (overall accuracy 81%, sensitivity 53%, specificity 95%), and with the combination of normalized fEMG-related response entropy, electrocardiography (ECG) R-to-R interval (RRI), and PPG dicrotic notch amplitude during sevoflurane anesthesia (overall accuracy 96%, sensitivity 90%, specificity 100%). ECG electrode impedances after alcohol swab skin pretreatment alone were higher than the impedances of designated EEG electrodes. The BIS values registered with ECG electrodes were higher than those registered simultaneously with EEG electrodes.
No significant difference in the time to home-readiness after isoflurane-N2O or sevoflurane-N2O anesthesia was found, when the administration of the volatile agent was guided by BIS monitoring. All other early and intermediate recovery parameters were also similar. Transient epileptiform EEG activity was detected in eight of 15 sevoflurane patients during a rapid increase in the inspired volatile concentration, and in none of the 16 desflurane patients. The observed transient EEG changes did not adversely affect the recovery of the patients. Following the rapid increase in the inhaled desflurane concentration, HR increased transiently, reaching its maximum in two minutes. In the sevoflurane group, the increase was slower and more subtle. In conclusion, desflurane may be a safer volatile agent than sevoflurane in patients with a lowered seizure threshold. The tachycardia induced by a rapid increase in the inspired desflurane concentration may present a risk for patients with heart disease. Designated EEG electrodes may be superior to ECG electrodes in EEG BIS monitoring. When the administration of isoflurane or sevoflurane is adjusted to maintain BIS values at 50-60 in healthy ambulatory surgery patients, the speed and quality of recovery are similar after both isoflurane-N2O and sevoflurane-N2O anesthesia. When anesthesia is maintained by the inhalation of N2O and bolus doses of propofol and alfentanil in healthy unparalyzed patients, movement responses may be best avoided by ensuring a relatively deep hypnotic level with propofol. HR/RRI, fEMG, and PPG dicrotic notch amplitude are potential indicators of nociception during anesthesia, but their performance needs to be validated in future studies. Combining information from different sources may improve the discrimination of the level of nociception.
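The overall accuracy, sensitivity and specificity figures quoted above follow directly from mover / non-mover confusion counts. A minimal sketch of the definitions; the counts below are hypothetical, chosen only to be consistent with the sevoflurane figures, and are not the study's actual data:

```python
def classification_metrics(tp, fn, tn, fp):
    """Accuracy, sensitivity and specificity from confusion-matrix counts.

    'Positive' here means a mover (surgery-induced movement response).
    """
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # movers correctly flagged
    specificity = tn / (tn + fp)   # non-movers correctly flagged
    return accuracy, sensitivity, specificity

# One hypothetical set of counts consistent with overall accuracy 96%,
# sensitivity 90%, specificity 100% (not the study's actual counts):
acc, sens, spec = classification_metrics(tp=9, fn=1, tn=15, fp=0)
```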
Abstract:
The adequacy of anesthesia has been studied since the introduction of balanced general anesthesia. Commercial monitors based on electroencephalographic (EEG) signal analysis have been available for monitoring the hypnotic component of anesthesia from the beginning of the 1990s. Monitors measuring the depth of anesthesia assess the cortical function of the brain, and have gained acceptance during surgical anesthesia with most of the anesthetic agents used. However, due to frequent artifacts, they are considered unsuitable for monitoring consciousness in intensive care patients. The assessment of analgesia is one of the cornerstones of general anesthesia. Prolonged surgical stress may lead to increased morbidity and delayed postoperative recovery. However, no validated monitoring method is currently available for evaluating analgesia during general anesthesia. Awareness during anesthesia is caused by an inadequate level of hypnosis. This rare but severe complication of general anesthesia may lead to marked emotional stress and possibly posttraumatic stress disorder. In the present series of studies, the incidence of awareness and recall during outpatient anesthesia was evaluated and compared with that in inpatient anesthesia. A total of 1500 outpatients and 2343 inpatients underwent a structured interview. Clear intraoperative recollections were rare, with an incidence of 0.07% in outpatients and 0.13% in inpatients. No significant differences emerged between outpatients and inpatients. However, significantly smaller doses of sevoflurane had been administered to outpatients with awareness than to those without recollections (p<0.05). EEG artifacts in 16 brain-dead organ donors were evaluated during organ harvest surgery in a prospective, open, nonselective study. The source of the frontotemporal biosignals in brain-dead subjects was studied, and the resistance of the bispectral index (BIS) and Entropy to the signal artifacts was compared.
The hypothesis was that in brain-dead subjects, most of the biosignals recorded from the forehead would consist of artifacts. The original EEG was recorded, and State Entropy (SE), Response Entropy (RE), and BIS were calculated and monitored during solid organ harvest. SE differed from zero (inactive EEG) in 28%, RE in 29%, and BIS in 68% of the total recording time (p<0.0001 for all). The median values during the operation were SE 0.0, RE 0.0, and BIS 3.0. In four of the 16 organ donors, EEG was not inactive, and unphysiologically distributed, nonreactive rhythmic theta activity was present in the original EEG signal. After the results from subjects with persistent residual EEG activity were excluded, SE, RE, and BIS differed from zero in 17%, 18%, and 62% of the recorded time, respectively (p<0.0001 for all). Due to various artifacts, the highest readings in all indices were recorded without neuromuscular blockade. The main sources of artifacts were electrocauterization, electromyography (EMG), 50-Hz artifact, handling of the donor, ballistocardiography, and electrocardiography. In a prospective, randomized study of 26 patients, the ability of the Surgical Stress Index (SSI) to differentiate patients with two clinically different analgesic levels during shoulder surgery was evaluated. SSI values were lower in patients with an interscalene brachial plexus block than in patients without an additional plexus block. In all patients, anesthesia was maintained with desflurane, the concentration of which was targeted to maintain SE at 50. Increased blood pressure or heart rate (HR), movement, and coughing were considered signs of intraoperative nociception and treated with alfentanil. Photoplethysmographic waveforms were collected from the arm contralateral to the operated side, and SSI was calculated offline. Two minutes after skin incision, SSI was not increased in the brachial plexus block group and was lower (38 ± 13) than in the control group (58 ± 13, p<0.005).
Among the controls, one minute prior to alfentanil administration, SSI value was higher than during periods of adequate antinociception, 59 ± 11 vs. 39 ± 12 (p<0.01). The total cumulative need for alfentanil was higher in controls (2.7 ± 1.2 mg) than in the brachial plexus block group (1.6 ± 0.5 mg, p=0.008). Tetanic stimulation to the ulnar region of the hand increased SSI significantly only among patients with a brachial plexus block not covering the site of stimulation. Prognostic value of EEG-derived indices was evaluated and compared with Transcranial Doppler Ultrasonography (TCD), serum neuron-specific enolase (NSE) and S-100B after cardiac arrest. Thirty patients resuscitated from out-of-hospital arrest and treated with induced mild hypothermia for 24 h were included. Original EEG signal was recorded, and burst suppression ratio (BSR), RE, SE, and wavelet subband entropy (WSE) were calculated. Neurological outcome during the six-month period after arrest was assessed with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). Twenty patients had a CPC of 1-2, one patient had a CPC of 3, and nine patients died (CPC 5). BSR, RE, and SE differed between good (CPC 1-2) and poor (CPC 3-5) outcome groups (p=0.011, p=0.011, p=0.008, respectively) during the first 24 h after arrest. WSE was borderline higher in the good outcome group between 24 and 48 h after arrest (p=0.050). All patients with status epilepticus died, and their WSE values were lower (p=0.022). S-100B was lower in the good outcome group upon arrival at the intensive care unit (p=0.010). After hypothermia treatment, NSE and S-100B values were lower (p=0.002 for both) in the good outcome group. The pulsatile index was also lower in the good outcome group (p=0.004). In conclusion, the incidence of awareness in outpatient anesthesia did not differ from that in inpatient anesthesia. 
Outpatients are not at increased risk of intraoperative awareness relative to inpatients undergoing general anesthesia. SE, RE, and BIS showed non-zero values that would normally indicate cortical neuronal function, but in these subjects these values were mostly due to artifacts recorded after the clinical diagnosis of brain death. Entropy was more resistant to artifacts than BIS. During general anesthesia and surgery, SSI values were lower in patients with an interscalene brachial plexus block covering the sites of nociceptive stimuli. In detecting nociceptive stimuli, SSI performed better than HR, blood pressure, or RE. BSR, RE, and SE differed between the good and poor neurological outcome groups during the first 24 h after cardiac arrest, and they may aid in differentiating patients with good neurological outcomes from those with poor outcomes after out-of-hospital cardiac arrest.
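The percentages of recording time during which SE, RE, and BIS differed from zero are occupancy fractions over the sampled index traces. A minimal sketch with toy traces (illustrative values only, not the donor recordings):

```python
def percent_nonzero(trace):
    """Share of samples (as a percentage) in which an index is non-zero."""
    nonzero = sum(1 for value in trace if value != 0)
    return 100.0 * nonzero / len(trace)

# Toy index traces sampled at a fixed rate (illustrative only):
se_trace  = [0, 0, 0, 5, 0, 0, 0, 0, 12, 0]   # mostly inactive, as expected
bis_trace = [3, 0, 7, 4, 0, 9, 3, 0, 5, 6]    # more artifact-prone

se_pct  = percent_nonzero(se_trace)   # 20.0
bis_pct = percent_nonzero(bis_trace)  # 70.0
```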
Abstract:
Peanut agglutinin is a homotetrameric, nonglycosylated protein with a unique open quaternary structure. Molecular dynamics simulations have been employed to follow the atomistic details of its unfolding at different temperatures. The early events of the deoligomerization of the protein have been elucidated in the present study. Simulation trajectories of the monomer as well as those of the tetramer have been compared, and the tetramer is found to be substantially more stable than its monomeric counterpart. The tetramer shows retention of most of its secondary structure but considerable loss of tertiary structure at high temperature. This observation implies the generation of a molten globule-like intermediate in the later stages of deoligomerization. The quaternary structure of the protein weakens to a large extent, but none of the subunits separate. In addition, the importance of metal binding to the stability of the protein structure has also been investigated. Binding of the metal ions not only enhances the local stability of the metal-ion-binding loop, but also imparts a global stability to the overall structure. The dynamics of the different interfaces vary significantly, as probed through interface clusters, and the differences are substantially enhanced at higher temperatures. The dynamics and stability of the interfaces have been captured mainly by cluster analysis, which has provided detailed information on the thermal deoligomerization of the protein.
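The interface cluster analysis mentioned above rests on identifying inter-subunit residue contacts within a distance cutoff. A minimal, hypothetical sketch of that contact identification, using toy coordinates rather than the actual peanut agglutinin structure:

```python
import math

def interface_contacts(chain_a, chain_b, cutoff=4.5):
    """List residue pairs from two subunits closer than `cutoff` angstroms.

    chain_a, chain_b: lists of (x, y, z) representative-atom coordinates.
    """
    contacts = []
    for i, atom_a in enumerate(chain_a):
        for j, atom_b in enumerate(chain_b):
            if math.dist(atom_a, atom_b) < cutoff:
                contacts.append((i, j))
    return contacts

# Toy coordinates for two small "subunits" (angstroms, illustrative only):
chain_a = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
chain_b = [(3.0, 3.0, 0.0), (20.0, 0.0, 0.0)]
pairs = interface_contacts(chain_a, chain_b)  # [(0, 0), (1, 0)]
```

A contact cluster is then a connected set of such pairs; tracking how clusters shrink along a trajectory is one way to probe the weakening of an interface at elevated temperature.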