989 results for dressing percentage
Abstract:
The long-term outcome of 25 patients with bimalleolar fractures of the ankle was assessed ten to fourteen years following their fractures using the Phillips scoring system. All patients had undergone open reduction and anatomical internal fixation (as described in the operative notes in their medical records). 52% of patients had a good or excellent overall outcome, while 24% had a poor overall outcome. This study has the longest follow-up period (10 to 14 years) to date on the outcomes of internal fixation of bimalleolar ankle fractures and demonstrates a higher percentage of poorer outcomes than has previously been described. This trend appears to be predictable, as other studies with shorter-term follow-up have already established a trend of increasing radiological evidence of post-traumatic arthritis in successively longer-term outcome reports.
Abstract:
A commercially available smoke-water solution (Seed Starter (R)) stimulated the germination of caryopses and intact florets of Avena fatua L. The solution was most effective when diluted (5-50%) and presented to intact or dehulled grain that had received a short period of dry after-ripening. It was less effective when applied at full strength or to grains that had been freshly harvested. The same stimulatory effect was observed in partly after-ripened caryopses of nine different wild oat biotypes obtained from three different cropping regions of the world. When freshly harvested caryopses were treated with the commercial solution (100%) for just 7 days prior to placement onto distilled water, a much higher germination percentage was possible than seen with continuous smoke-water incubation. The stimulatory ability of smoke water was more closely matched to that of gibberellic acid than to potassium nitrate, which had little or no effect on freshly harvested caryopses. The smoke-water solution (5-100%) was tested on the germination of 18 other cool temperate arable weed species. All monocotyledonous species tested (viz. Avena sterilis ssp. ludoviciana L., Alopecurus myosuroides, Sorghum halepense, Phalaris paradoxa) responded positively, while the dicotyledonous species were either strongly stimulated (greater than or equal to 40% stimulation: Malva neglecta), moderately stimulated (greater than or equal to 20% stimulation: Galium aparine, Veronica persica), slightly stimulated (Polygonum persicaria, P. pennsylvanicum, Fallopia convolvulus), unaffected (P. aviculare, Sinapis arvensis, Heracleum sphondylium, Angelica sylvestris, Mercurialis annua, Veronica hederifolia) or inhibited (Lamium purpureum). The optimal concentrations required to stimulate germination of the monocotyledonous species were similar to those observed for A. fatua (5-10%). However, for the dicotyledonous species slightly stronger solutions were required (10-20%). When the unaffected species were retested using a 10-day pre-chilling treatment, smoke water showed a small promotive response in three (S. arvensis, P. aviculare and V. hederifolia) of the six species. When four different smoke-water solutions (Seed Starter (R), Regen 2000 (R), charred-wood solution and wheat-straw solution) were tested on two representative species (A. fatua and M. neglecta), three formulations were effective in promoting the germination of both species, while the fourth (charred-wood solution) was only active on A. fatua. The active concentrations were different for the four solutions. Three solutions were active in the 2-20% dilution range, while the fourth (Regen 2000 (R)) was only active in the 1-2% dilution range and was inhibitory at higher concentrations. These observations are discussed in the context that smoke may play an important ecological role in the management and control of introduced weeds in native and arable communities.
Abstract:
We compared changes in muscle fibre composition and muscle strength indices following a 10 week isokinetic resistance training programme consisting of fast (3.14 rad·s⁻¹) or slow (0.52 rad·s⁻¹) velocity eccentric muscle contractions. A group of 20 non-resistance trained subjects were assigned to a FAST (n = 7), SLOW (n = 6) or non-training CONTROL (n = 7) group. A unilateral training protocol targeted the elbow flexor muscle group and consisted of 24 maximal eccentric isokinetic contractions (four sets of six repetitions) performed three times a week for 10 weeks. Muscle biopsy samples were obtained from the belly of the biceps brachii. Isometric torque and concentric and eccentric torque at 0.52 and 3.14 rad·s⁻¹ were examined at 0, 5 and 10 weeks. After 10 weeks, the FAST group demonstrated significant [mean (SEM)] increases in eccentric [29.6 (6.4)%] and concentric torque [27.4 (7.3)%] at 3.14 rad·s⁻¹, isometric torque [21.3 (4.3)%] and eccentric torque [25.2 (7.2)%] at 0.52 rad·s⁻¹. The percentage of type I fibres in the FAST group decreased from 53.8 (6.6)% to 39.1 (4.4)%, while the type IIb fibre percentage increased from 5.8 (1.9)% to 12.9 (3.3)% (P < 0.05). In contrast, the SLOW group did not experience significant changes in muscle fibre type or muscle torque. We conclude that neuromuscular adaptations to eccentric training stimuli may be influenced by differences in the ability to cope with chronic exposure to relatively fast and slow eccentric contraction velocities. Possible mechanisms include greater cumulative damage to contractile tissues or stress induced by slow eccentric muscle contractions.
Abstract:
Performance in sprint exercise is determined by the ability to accelerate, the magnitude of maximal velocity and the ability to maintain velocity against the onset of fatigue. These factors are strongly influenced by metabolic and anthropometric components. Improved temporal sequencing of muscle activation and/or improved fast twitch fibre recruitment may contribute to superior sprint performance. Speed of impulse transmission along the motor axon may also have implications for sprint performance. Nerve conduction velocity (NCV) has been shown to increase in response to a period of sprint training. However, it is difficult to determine whether increased NCV is likely to contribute to improved sprint performance. An increase in motoneuron excitability, as measured by the Hoffman reflex (H-reflex), has been reported to produce a more powerful muscular contraction, hence maximising motoneuron excitability would be expected to benefit sprint performance. Motoneuron excitability can be raised acutely by an appropriate stimulus, with obvious implications for sprint performance. However, at rest the H-reflex has been reported to be lower in athletes trained for explosive events compared with endurance-trained athletes. This may be caused by the relatively high fast twitch fibre percentage and the consequent high activation thresholds of such motor units in power-trained populations. In contrast, stretch reflexes appear to be enhanced in sprint athletes, possibly because of increased muscle spindle sensitivity as a result of sprint training. With muscle in a contracted state, however, there is evidence to suggest greater reflex potentiation among both sprint- and resistance-trained populations compared with controls. Again, this may be indicative of the predominant types of motor units in these populations, but may also mean an enhanced reflex contribution to force production during running in sprint-trained athletes. Fatigue of neural origin, both during and following sprint exercise, has implications for optimising training frequency and volume. Research suggests athletes are unable to maintain maximal firing frequencies for the full duration of, for example, a 100 m sprint. Fatigue after a single training session may also have a neural manifestation, with some athletes unable to voluntarily fully activate muscle or experiencing stretch reflex inhibition after heavy training. This may occur in conjunction with muscle damage. Research investigating the neural influences on sprint performance is limited. Further longitudinal research is necessary to improve our understanding of the neural factors that contribute to training-induced improvements in sprint performance.
Abstract:
Sequences from the tuf gene coding for the elongation factor EF-Tu were amplified and sequenced from the genomic DNA of Pirellula marina and Isosphaera pallida, two species of bacteria within the order Planctomycetales. A near-complete (1140-bp) sequence was obtained from Pi. marina and a partial (759-bp) sequence was obtained for I. pallida. Alignment of the deduced Pi. marina EF-Tu amino acid sequence against reference sequences demonstrated the presence of a unique 11-amino acid sequence motif not present in any other division of the domain Bacteria. Pi. marina shared the highest percentage amino acid sequence identity with I. pallida but showed only a low percentage identity with other members of the domain Bacteria. This is consistent with the concept of the planctomycetes as a unique division of the Bacteria. Neither primary sequence comparison of EF-Tu nor phylogenetic analysis supports any close relationship between planctomycetes and the chlamydiae, which has previously been postulated on the basis of 16S rRNA. Phylogenetic analysis of aligned EF-Tu amino acid sequences performed using distance, maximum-parsimony and maximum-likelihood approaches yielded contradictory results with respect to the position of planctomycetes relative to other bacteria. It is hypothesized that long-branch attraction effects due to unequal evolutionary rates and mutational saturation effects may account for some of the contradictions.
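As a side note on the "percentage amino acid sequence identity" used above, the following minimal Python sketch shows one common way to compute it from a pair of pre-aligned sequences; it is not the alignment software used in the study, gap-handling conventions differ between tools, and the example sequences are invented rather than real EF-Tu fragments.

def percent_identity(aligned_a, aligned_b):
    # Percentage identity over a pairwise alignment of equal length;
    # '-' marks gaps, and gapped columns are excluded from the denominator here.
    matches = 0
    compared = 0
    for a, b in zip(aligned_a, aligned_b):
        if a == '-' or b == '-':
            continue  # skip gapped columns
        compared += 1
        if a == b:
            matches += 1
    return 100.0 * matches / compared if compared else 0.0

# Invented fragments for illustration only (not real EF-Tu sequences):
print(percent_identity("MGKEKFER-TKPHVN", "MGKEKFERLTKPHVS"))  # ~92.9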
Abstract:
The Lewis dwarf (DW) rat was used as a model to test the hypothesis that growth hormone (GH) is permissive for new bone formation induced by mechanical loading in vivo. Adult female Lewis DW rats aged 6.2 ± 0.1 months (187 ± 18 g) were allocated to four vehicle groups (DW), four GH treatment groups at 32.5 μg/100 g body mass (DWGH1), and four GH treatment groups at 65 μg/100 g (DWGH2). Saline vehicle or GH was injected intraperitoneally (ip) at 6:30 p.m. and 6:30 a.m. before mechanical loading of tibias at 7:30 a.m. A single period of 300 cycles of four-point bending was applied to right tibias at 2.0 Hz, and magnitudes of 24, 29, 38, or 48 N were applied. Separate strain gauge analyses in 5 DW rats validated the selection of loading magnitudes. After loading, double-label histomorphometry was used to assess bone formation at the periosteal surface (Ps.S) and endocortical surface (Ec.S) of tibias. Comparing left (unloaded) tibias among groups, GH treatment had no effect on bone formation. Bone formation in tibias in DW rats was insensitive to mechanical loading. At the Ec.S, mechanically induced lamellar bone formation increased in the DWGH2 group loaded at 48 N (p < 0.05), and no significant increases in bone formation were observed among other groups. The percentage of tibias expressing woven bone formation (Wo.B) at the Ps.S was significantly greater in the DWGH groups compared with controls (p < 0.05). We concluded that GH influences loading-related bone formation in a permissive manner and modulates the responsiveness of bone tissue to mechanical stimuli by changing thresholds for bone formation.
Abstract:
Many cases of potentially curable primary aldosteronism are currently likely to be diagnosed as essential hypertension unless screening tests based on suppression of renin are carried out in all hypertensive patients. More than half of the patients with primary aldosteronism detected in this way have normal circulating potassium levels, so measurement of potassium is not enough to exclude primary aldosteronism. When primary aldosteronism is diagnosed, fewer than one-third of patients are suitable for surgery as initial treatment, but this still represents a significant percentage of hypertensive patients. After excluding glucocorticoid-suppressible primary aldosteronism, adrenal venous sampling is essential to detect unilateral production of aldosterone and to diagnose angiotensin-responsive aldosterone-producing adenoma. One cannot rely on the computed tomography scan. If all hypertensive patients are screened for primary aldosteronism and the workup is continued methodically in those with a positive screening test, patients with unilateral overproduction of aldosterone who can potentially be cured surgically are not denied the possibility of cure.
Abstract:
A field experiment compared two rice (Oryza sativa L.) cropping systems, paddy and raised beds with continuous furrow irrigation, and trialled four cultivars (Starbonnet, Lemont, Amaroo and Ceysvoni) and one test line (YRL39) that may vary in adaptation to growth on raised beds. The grain yield of rice ranged from 740 to 1250 g/m² and was slightly greater in paddy than on raised beds. Although there were early growth responses to fertilizer nitrogen on raised beds, the crop nitrogen content at maturity mostly exceeded 20 g/m² in both systems, so nitrogen was unlikely to have limited yield. Ceysvoni yielded best in both systems, a result of good post-anthesis growth and larger grain size, although its whole-grain mill-out percentage was poor relative to the other cultivars. Starbonnet and Lemont yielded poorly on raised beds, associated with too few tillers and too much leaf area. When grown on raised beds, all cultivars experienced a delay in anthesis, resulting in more tillers, leaf area and dry weight at anthesis, and probably a greater yield potential. The growth of rice after anthesis, however, was similar on raised beds and in paddy, so reductions in harvest index and grain size on raised beds were recorded. The data indicated that water supply was not a major limitation to rice growth on raised beds, but slower crop development was an issue that would affect the use of raised beds in a cropping system, especially in rice-growing areas where temperatures are too cool for optimal crop development. (C) 2001 Elsevier Science B.V. All rights reserved.
Abstract:
New Zealand is generally thought to have been physically isolated from the rest of the world for over 60 million years. But physical isolation may not mean biotic isolation, at least on the time scale of millions of years. Is New Zealand's present complement of plants made up of direct descendants of what originally rafted from Gondwana? Or has there been total extinction of this initial flora, with replacement through long-distance dispersal (a complete biotic turnover)? These are two possible extremes which have come under recent discussion. Can the fossil record be used to decide the relative importance of the two endpoints, or is it simply too incomplete and too dependent on factors of chance? This paper suggests two approaches to the problem: the use of statistics to apply levels of confidence to first appearances in the fossil record, and the analysis of trends based on the entire palynorecord. Statistics can suggest that the first appearance of a taxon was after New Zealand broke away from Gondwana, as long as the first appearance in the record was not due to an increase in biomass from an initially rare state. Two observations can be drawn from the overall palynorecord that are independent of changes in biomass: (1) The first appearance of palynotaxa common to both Australia and New Zealand is decidedly non-random. Most taxa occur first in Australia. This suggests a bias in air or water transport from west to east. (2) The percentage of endemic palynospecies in New Zealand shows no simple correlation with the time since New Zealand drifted into isolation. The conifer macrorecord also hints at complete turnover since the Cretaceous.
Abstract:
Effluent water from shrimp ponds typically contains elevated concentrations of dissolved nutrients and suspended particulates compared to influent water. Attempts to improve effluent water quality using filter feeding bivalves and macroalgae to reduce nutrients have previously been hampered by the high concentration of clay particles typically found in untreated pond effluent. These particles inhibit feeding in bivalves and reduce photosynthesis in macroalgae by increasing effluent turbidity. In a small-scale laboratory study, the effectiveness of a three-stage effluent treatment system was investigated. In the first stage, reduction in particle concentration occurred through natural sedimentation. In the second stage, filtration by the Sydney rock oyster, Saccostrea commercialis (Iredale and Roughley), further reduced the concentration of suspended particulates, including inorganic particles, phytoplankton, bacteria, and their associated nutrients. In the final stage, the macroalga, Gracilaria edulis (Gmelin) Silva, absorbed dissolved nutrients. Pond effluent was collected from a commercial shrimp farm, taken to an indoor culture facility and was left to settle for 24 h. Subsamples of water were then transferred into laboratory tanks stocked with oysters and maintained for 24 h, and then transferred to tanks containing macroalgae for another 24 h. Total suspended solid (TSS), chlorophyll a, total nitrogen (N), total phosphorus (P), NH4+, NO3-, and PO43-, and bacterial numbers were compared before and after each treatment at: 0 h (initial); 24 h (after sedimentation); 48 h (after oyster filtration); 72 h (after macroalgal absorption). The combined effect of the sequential treatments resulted in significant reductions in the concentrations of all parameters measured. High rates of nutrient regeneration were observed in the control tanks, which did not contain oysters or macroalgae. Conversely, significant reductions in nutrients and suspended particulates after sedimentation and biological treatment were observed. Overall, improvements in water quality (final percentage of the initial concentration) were as follows: TSS (12%); total N (28%); total P (14%); NH4+ (76%); NO3- (30%); PO43-(35%); bacteria (30%); and chlorophyll a (0.7%). Despite the probability of considerable differences in sedimentation, filtration and nutrient uptake rates when scaled to farm size, these results demonstrate that integrated treatment has the potential to significantly improve water quality of shrimp farm effluent. (C) 2001 Elsevier Science B.V. All rights reserved.
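For clarity on how the water-quality figures above are expressed, the "final percentage of the initial concentration" is simply 100 × final / initial, so a smaller value means a larger reduction. A minimal Python sketch follows; the numbers in it are invented for illustration, not the study's measurements.

def percent_of_initial(initial, final):
    # Final concentration expressed as a percentage of the initial concentration.
    return 100.0 * final / initial

# Invented example: if TSS fell from 50 mg/L to 6 mg/L, the final value is
# 12% of the initial concentration, matching the way "TSS (12%)" is reported.
print(percent_of_initial(50.0, 6.0))  # -> 12.0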
Abstract:
The effects of the mode of exposure of second instar Colorado potato beetles to Beauveria bassiana on conidia acquisition and resulting mortality were investigated in laboratory studies. Larvae sprayed directly with a B. bassiana conidial suspension, larvae exposed to B. bassiana-treated foliage, and larvae both sprayed and exposed to treated foliage experienced 76, 34, and 77% mortality, respectively. The total number of conidia and the proportion of germinating conidia were measured over time for four sections of the insect body: the ventral surface of the head (consisting mostly of ventral mouth parts), the ventral abdominal surface, the dorsal abdominal surface, and the legs. From observations at 24 and 36 h posttreatment, mean totals of 161.1 conidia per insect were found on sprayed larvae, 256.1 conidia on larvae exposed only to treated foliage, and 408.3 conidia on larvae both sprayed and exposed to treated foliage. On sprayed larvae, the majority of conidia were found on the dorsal abdominal surface, whereas conidia were predominantly found on the ventral abdominal surface and mouth parts on larvae exposed to treated foliage. Between 24 and 36 h postinoculation the percentage of conidia germinating on sprayed larvae increased slightly, from 80 to 84%. On the treated foliage, the percentage of germinated conidia on larvae increased from 35% at 24 h to 50% at 36 h posttreatment. Conidia germination on sprayed larvae on treated foliage was 65% at 24 h and 75% at 36 h posttreatment. It is likely that the gradual acquisition of conidia derived from the continuous exposure to B. bassiana inoculum on the foliar surface was responsible for the increase in germination over time on larvae exposed to treated foliage. The density and germination of conidia were observed 0, 4, 8, 12, 16, 20, and 24 h after insects were sprayed with or dipped in conidia suspensions or exposed to contaminated foliage. Conidia germinated twice as fast on sprayed insects as with any other treatment within the first 12 h. This faster germination may be due to the pressure of the sprayer enhancing conidial lodging on cuticular surfaces. (C) 2001 Academic Press.
Abstract:
Objective: To assess hospital prescribing of lipid-lowering agents in a tertiary hospital, and examine continuation of, or changes to, such therapy in the 6-18 months following discharge. Design: Retrospective data extraction from the hospital records of patients admitted from October 1998 to April 1999. These patients and their general practitioners were then contacted to obtain information about ongoing management after discharge. Setting: Tertiary public hospital and community. Participants: 352 patients admitted to hospital with acute myocardial infarction or unstable angina, and their GPs. Main outcome measures: Percentage of eligible patients discharged on lipid-lowering therapy and percentage of patients continuing or starting such therapy 6-18 months after discharge. Results: 10% of inpatients with acute coronary syndromes did not have lipid-level estimations performed or arranged during admission. Documentation of lipid levels in discharge summaries was poor. Eighteen per cent of patients with a total serum cholesterol level greater than 5.5 mmol/L did not receive a discharge prescription for a cholesterol-lowering agent. Compliance with treatment on follow-up was 88% in the group discharged on treatment. However, at follow-up, 70% of patients discharged without therapy had not been commenced on lipid-lowering treatment by their GPs. Conclusions: Prescribing of lipid-lowering therapy for secondary prevention following acute coronary syndromes remains suboptimal. Commencing treatment in hospital is likely to result in continuing therapy in the community. Better communication of lipid-level results, treatment and treatment aims between hospitals and GPs might encourage optimal treatment practices.
Abstract:
Immunity induced by the 19-kDa fragment of merozoite surface protein 1 is dependent on CD4(+) Th cells. However, we found that adoptively transferred CFSE-labeled Th cells specific for an epitope on the Plasmodium yoelii 19-kDa fragment of merozoite surface protein 1 (peptide (p)24), but not OVA-specific T cells, were deleted as a result of P. yoelii infection. Spleen cells recovered from infected p24-specific T cell-transfused mice demonstrated a reduced response to the specific Ag. A higher percentage of CFSE-labeled p24-specific T cells stained positive with annexin and anti-active caspase-3 in infected compared with uninfected mice, suggesting that apoptosis contributed to the deletion of p24-specific T cells during infection. Apoptosis correlated with increased percentages of p24-specific T cells that stained positive for Fas in infected mice, suggesting that P. yoelii-induced apoptosis is, at least in part, mediated by Fas. However, bystander cells of other specificities also showed increased Fas expression during infection, suggesting that Fas expression alone is not sufficient for apoptosis. These data have implications for the development of immunity in the face of endemic parasite exposure.
Abstract:
The objective of this study was to compare the accuracy of sonographic estimation of fetal weight of macrosomic babies in diabetic vs non-diabetic pregnancies. All babies weighing 4000 g or more at birth, and who had ultrasound scans performed within one week of delivery, were included in this retrospective study. Pregnancies with diabetes mellitus were compared to those without diabetes mellitus. The mean simple error (actual birthweight minus estimated fetal weight), the mean standardised absolute error (absolute value of the simple error (g) divided by the actual birthweight (kg)), and the percentage of estimated birthweights falling within 15% of the actual birthweight were compared between the two groups. There were 9516 deliveries during the study period. Of this total, 1211 (12.7%) babies weighed 4000 g or more. A total of 56 non-diabetic pregnancies and 19 diabetic pregnancies were compared. The average sonographic estimation of fetal weight in diabetic pregnancies was 8% less than the actual birthweight, compared to 0.2% in the non-diabetic group (p < 0.01). The estimated fetal weight was within 15% of the birthweight in 74% of the diabetic pregnancies, compared to 93% of the non-diabetic pregnancies (p < 0.05). In the diabetic group, 26.3% of the birthweights were underestimated by more than 15%, compared to 5.4% in the non-diabetic group (p < 0.05). In conclusion, the prediction accuracy of fetal weight estimation using standard formulae in macrosomic fetuses is significantly worse in diabetic pregnancies compared to non-diabetic pregnancies. When sonographic fetal weight estimation is used to influence the mode of delivery for diabetic women, a more conservative cut-off needs to be considered.
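The three accuracy measures named above (mean simple error, mean standardised absolute error, and the percentage of estimates within 15% of the actual birthweight) can be computed as in the following minimal Python sketch; the function name and the sample values are invented for illustration and are not the study's data.

def weight_error_metrics(actual_g, estimated_g):
    # simple error = actual birthweight - estimated fetal weight (g)
    # standardised absolute error = |simple error| (g) / actual birthweight (kg)
    # "within 15%" = estimates within +/-15% of the actual birthweight
    simple = [a - e for a, e in zip(actual_g, estimated_g)]
    std_abs = [abs(s) / (a / 1000.0) for s, a in zip(simple, actual_g)]
    within_15 = [abs(a - e) <= 0.15 * a for a, e in zip(actual_g, estimated_g)]
    n = len(actual_g)
    return (sum(simple) / n, sum(std_abs) / n, 100.0 * sum(within_15) / n)

# Invented example values (grams), not the study's data:
print(weight_error_metrics([4100, 4350, 4000], [3900, 4200, 4150]))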
Abstract:
Objective: To assess the accuracy of intra-operative frozen section reports at identifying the features of high risk uterine disease compared with final histopathology. Design: Retrospective study. Methods: The records of 460 patients with uterine cancer registered with the Queensland Centre for Gynaecological Cancer between January 1, 1996 and December 31, 1998 were reviewed. Intra-operative frozen section was undertaken in 260 patients with endometrial adenocarcinoma. Frozen section pathology was compared with the final histopathology reports. Inter-observer reliability was assessed using percentage agreement and kappa statistics. Clinical notes were also reviewed to determine if errors resulted in sub-optimal patient care. Results: Tumour grade and depth of myometrial invasion were accurately reported in 88.6% of cases (expected agreement 61.5%, kappa 0.70) and 94.7% of cases (expected agreement 53.8%, kappa 0.89), respectively. Errors were predominantly attributable to difficulties in the interpretation of tumour grade. These errors resulted in the patient receiving sub-optimal surgical management in only 11 cases (5.3%). Conclusion: Frozen section is accurate at identifying the features of high risk uterine disease in the setting of endometrial cancer and can play an important role in directing primary operative management.
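The kappa values quoted in the Results can be reproduced from the observed and chance-expected agreement percentages with Cohen's formula, kappa = (p_observed - p_expected) / (1 - p_expected); a minimal Python sketch follows, using only the two agreement pairs reported above.

def cohens_kappa(p_observed, p_expected):
    # Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)
    return (p_observed - p_expected) / (1.0 - p_expected)

print(round(cohens_kappa(0.886, 0.615), 2))  # tumour grade -> 0.70
print(round(cohens_kappa(0.947, 0.538), 2))  # myometrial invasion -> 0.89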