Abstract:
Coming out midlife is a profound and life-changing experience—an experience of self-shattering that entails the destabilisation of identity and of family relationships. Entailing a displacement from social insider to outsider, it is a difficult, but also exhilarating, journey of self-discovery and sexual discovery. This thesis examines the experiences of nine women who undertook that journey. It is very much a search for understanding—for understanding how one can be lesbian, and how one can not have known, following a lifetime of heterosexual identification—as well as a search for why those questions arise in the first place. I argue that the experience of coming out midlife exposes the fundamental ambiguity of sexuality, and that its significance ranges beyond the particularity of the participants’ experiences to the limitations of the hegemonic sexual paradigm itself. Using the theoretical lens of three diverse conceptual approaches—the dynamic systems theory of sexual fluidity, liminality, and narrative identity—to illuminate their transition, I argue that coming out midlife should be viewed not merely as an atypical experience; rather, we should ask what such events can tell us about women’s sexuality in particular and the sexual paradigm more generally. Women who come out midlife challenge the dominant discourses of sexuality that would entail that they were either in denial of their “true” sexuality throughout their adult lives, or that they are not really lesbian now. The experiences of the women I interviewed demonstrate the inadequacy of the sexual paradigm as a framework within which to understand and research the complexity of human sexuality; they also challenge hegemonic understandings of sexuality as innate and immutable. In this thesis, I explore that challenge.
Abstract:
BACKGROUND: Cardiac surgery requiring cardiopulmonary bypass is associated with platelet activation. Because platelets are increasingly recognized as important effectors of ischemia and end-organ inflammatory injury, the authors explored whether postoperative nadir platelet counts are associated with acute kidney injury (AKI) and mortality after coronary artery bypass grafting (CABG) surgery. METHODS: The authors evaluated 4,217 adult patients who underwent CABG surgery. Postoperative nadir platelet counts were defined as the lowest in-hospital values and were used as a continuous predictor of postoperative AKI and mortality. Nadir values in the lowest 10th percentile were also used as a categorical predictor. Multivariable logistic regression and Cox proportional hazard models examined the association between postoperative platelet counts, postoperative AKI, and mortality. RESULTS: The median postoperative nadir platelet count was 121 × 10⁹/l. The incidence of postoperative AKI was 54%, including 9.5% (215 patients) and 3.4% (76 patients) who experienced stages II and III AKI, respectively. For every 30 × 10⁹/l decrease in platelet counts, the risk for postoperative AKI increased by 14% (adjusted odds ratio, 1.14; 95% CI, 1.09 to 1.20; P < 0.0001). Patients with platelet counts in the lowest 10th percentile were three times more likely to progress to a higher severity of postoperative AKI (adjusted proportional odds ratio, 3.04; 95% CI, 2.26 to 4.07; P < 0.0001) and had associated increased risk for mortality immediately after surgery (adjusted hazard ratio, 5.46; 95% CI, 3.79 to 7.89; P < 0.0001). CONCLUSION: The authors found a significant association between postoperative nadir platelet counts and AKI and short-term mortality after CABG surgery.
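The per-decrement odds ratio reported above implies a log-linear dose-response. A minimal sketch of how such a per-unit odds ratio scales to other decrements; the 1.14 figure is the abstract's adjusted OR, but extrapolating it to arbitrary decreases is my illustrative assumption, not the authors' model:

```python
import math

# Adjusted odds ratio for AKI per 30 x 10^9/l decrease in nadir platelet count
# (value reported in the abstract)
OR_PER_30 = 1.14

def odds_ratio_for_decrease(delta, or_per_unit=OR_PER_30, unit=30.0):
    # Log-linear assumption: the change in log-odds scales linearly
    # with the size of the platelet decrease
    return math.exp(math.log(or_per_unit) * delta / unit)

# A 60 x 10^9/l decrease corresponds to 1.14 squared under this assumption
print(round(odds_ratio_for_decrease(60.0), 4))  # 1.2996
```

This is the usual way a continuous-predictor odds ratio is rescaled, but any clinical interpretation should come from the original model, not this sketch.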
Abstract:
During recent reinvestigations in the Great Cave of Niah in Borneo, the ‘Hell Trench’ sedimentary sequence seen by earlier excavators was re-exposed. Early excavations here yielded the earliest anatomically-modern human remains in island Southeast Asia. Calibrated radiocarbon dates, pollen, algal microfossils, palynofacies, granulometry and geochemistry of the ‘Hell Trench’ sequence provide information about environmental and vegetational changes, elements of geomorphic history and information about human activity. The ‘Hell’ sediments were laid down episodically in an ephemeral stream or pool. The pollen suggests cyclically changing vegetation, with forest habitats alternating with more open environments, indicating phases in which both temperature and precipitation were reduced compared with the present. These events can be correlated with global climate change sequences to produce a provisional dating framework. During some forest phases, high counts of Justicia, a plant which today colonises recently burnt forest areas, point to fire in the landscape. This may be evidence for biomass burning by humans, presumably to maintain forest-edge habitats. There is evidence from palynofacies for fire on the cave floor in the ‘Hell’ area. Since the area sampled is beyond the limit of plant growth, this is evidence for human activity. The first such evidence is during an episode with significant grassland indicators, suggesting that people may have reached the site during a climatic phase characterised by relatively open habitats ~50 ka. Thereafter, people were able to maintain a relatively consistent presence at Niah. The human use of the ‘Hell’ area seems to have intensified through time, probably because changes in the local hydrological regime made the area drier and more suitable for human use.
Abstract:
Background: When cure is impossible, cancer treatment should focus on both length and quality of life. Maximisation of time without toxic effects could be one effective strategy to achieve both of these goals. The COIN trial assessed preplanned treatment holidays in advanced colorectal cancer to achieve this aim. Methods: COIN was a randomised controlled trial in patients with previously untreated advanced colorectal cancer. Patients received continuous oxaliplatin and fluoropyrimidine combination chemotherapy (arm A), continuous chemotherapy plus cetuximab (arm B), or intermittent chemotherapy (arm C). In arms A and B, treatment continued until development of progressive disease, cumulative toxic effects, or the patient chose to stop. In arm C, patients who had not progressed at their 12-week scan started a chemotherapy-free interval until evidence of disease progression, when the same treatment was restarted. Randomisation was done centrally (via telephone) by the MRC Clinical Trials Unit using minimisation. Treatment allocation was not masked. The comparison of arms A and B is described in a companion paper. Here, we compare arms A and C, with the primary objective of establishing whether overall survival on intermittent therapy was non-inferior to that on continuous therapy, with a predefined non-inferiority boundary of 1·162. Intention-to-treat (ITT) and per-protocol analyses were done. This trial is registered, ISRCTN27286448. Findings: 1630 patients were randomly assigned to treatment groups (815 to continuous and 815 to intermittent therapy). Median survival in the ITT population (n=815 in both groups) was 15·8 months (IQR 9·4—26·1) in arm A and 14·4 months (8·0—24·7) in arm C (hazard ratio [HR] 1·084, 80% CI 1·008—1·165). In the per-protocol population (arm A, n=467; arm C, n=511), median survival was 19·6 months (13·0—28·1) in arm A and 18·0 months (12·1—29·3) in arm C (HR 1·087, 0·986—1·198).
The upper limits of CIs for HRs in both analyses were greater than the predefined non-inferiority boundary. Preplanned subgroup analyses in the per-protocol population showed that a raised baseline platelet count, defined as 400 000 per µL or higher (271 [28%] of 978 patients), was associated with poor survival with intermittent chemotherapy: the HR for comparison of arm C and arm A in patients with a normal platelet count was 0·96 (95% CI 0·80—1·15, p=0·66), versus 1·54 (1·17—2·03, p=0·0018) in patients with a raised platelet count (p=0·0027 for interaction). In the per-protocol population, more patients on continuous than on intermittent treatment had grade 3 or worse haematological toxic effects (72 [15%] vs 60 [12%]), whereas nausea and vomiting were more common on intermittent treatment (11 [2%] vs 43 [8%]). Grade 3 or worse peripheral neuropathy (126 [27%] vs 25 [5%]) and hand—foot syndrome (21 [4%] vs 15 [3%]) were more frequent on continuous than on intermittent treatment. Interpretation: Although this trial did not show non-inferiority of intermittent compared with continuous chemotherapy for advanced colorectal cancer in terms of overall survival, chemotherapy-free intervals remain a treatment option for some patients with advanced colorectal cancer, offering reduced time on chemotherapy, reduced cumulative toxic effects, and improved quality of life. Subgroup analyses suggest that patients with normal baseline platelet counts could gain the benefits of intermittent chemotherapy without detriment in survival, whereas those with raised baseline platelet counts have impaired survival and quality of life with intermittent chemotherapy and should not receive a treatment break.
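The trial's non-inferiority conclusion follows a simple decision rule: compare the upper confidence limit of the hazard ratio against the predefined margin. A minimal sketch of that rule, using the margin (1·162) and CI upper limits quoted above; the rule itself is the standard non-inferiority comparison, not code from the trial:

```python
# Predefined non-inferiority margin for the hazard ratio in the COIN comparison
NI_MARGIN = 1.162

def is_non_inferior(hr_ci_upper, margin=NI_MARGIN):
    # Non-inferiority is declared only when the entire CI lies below the margin,
    # i.e. the upper confidence limit is smaller than the margin
    return hr_ci_upper < margin

# Upper CI limits quoted above: 1.165 (ITT, 80% CI) and 1.198 (per-protocol)
print(is_non_inferior(1.165), is_non_inferior(1.198))  # False False
```

Both upper limits exceed the margin, which is exactly why the abstract reports that non-inferiority was not shown.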
Abstract:
There were three objectives to the present study: (1) compare the bladder infection rate and extent of biofilm formation for seven untreated spinal cord injured (SCI) patients and seven given prophylactic co-trimoxazole, (2) identify a level of bacterial adhesion to bladder cells which could be used to help predict symptomatic infection, and (3) determine from in vivo and in vitro studies whether fluoroquinolones were effective at penetrating bacterial biofilms. The results showed that the infection rate had not changed with the introduction of prophylaxis. However, the uropathogenic population had altered subsequent to the introduction of prophylaxis, with E. coli being replaced by E. faecalis as the most common cause of infection. In 63% of the specimens from asymptomatic patients, the bacterial counts per cell were <20, while 81% of specimens from patients with at least one sign and one symptom of urinary tract infection (UTI) had >20 adherent bacteria per bladder cell. Therefore, it is proposed that counts of >20 bacteria adherent to sediment transitional epithelial bladder cells may be predictive of symptomatic UTI. Clinical data showed that fluoroquinolone therapy reduced the adhesion counts to <20 per cell in 63% of cases, while trimethoprim-sulfamethoxazole only did so in 44%. Further in vitro testing showed that ciprofloxacin (0.1, 0.5 and 1.0 micrograms/ml) partially or completely eradicated adherent biofilms from 92% of SCI patients' bladder cells, while ofloxacin did so in 71% of cases and norfloxacin in 56%. These findings have important implications for the detection and treatment of bacteriuria in SCI patients.
Abstract:
Paramedics are trained to use specialized medical knowledge and a variety of medical procedures and pharmaceutical interventions to “save patients and prevent further damage” in emergency situations, both as members of “health-care teams” in hospital emergency departments (Swanson, 2005: 96) and on the streets – unstandardized contexts “rife with chaotic, dangerous, and often uncontrollable elements” (Campeau, 2008: 3). The paramedic’s unique skill-set and ability to function in diverse situations have resulted in the occupation becoming ever more important to health care systems (Alberta Health and Wellness, 2008: 12).
Today, prehospital emergency services, while varying, exist in every major city and many rural areas throughout North America (Paramedics Association of Canada, 2008) and other countries around the world (Roudsari et al., 2007). Services in North America, for instance, treat and/or transport 2 million Canadians (over 250,000 in Alberta alone) and between 25 and 30 million Americans annually (Emergency Medical Services Chiefs of Canada, 2006; National EMS Research Agenda, 2001). In Canada, paramedics make up one of the largest groups of health care professionals, with numbers exceeding 20,000 (Pike and Gibbons, 2008; Paramedics Association of Canada, 2008). However, little is known about the work practices of paramedics, especially in light of recent changes to how their work is organized, making the profession “rich with unexplored opportunities for research on the full range of paramedic work” (Campeau, 2008: 2).
This presentation reports on findings from an institutional ethnography that explored the work of paramedics and the different technologies of knowledge and governance that intersect with and organize their work practices. More specifically, the tentative focus of this presentation is on some of the ruling discourses central to many of the technologies used on the front lines of EMS in Alberta, and on the consequences of such governance practices for both the front-line workers and their patients. In doing so, I will demonstrate how IE can be used to answer Rankin and Campbell’s (2006) call for additional research into “the social organization of information in health care and attention to the (often unintended) ways ‘such textual products may accomplish…ruling purposes but otherwise fail people and, moreover, obscure that failure’ (p. 182)” (cited in McCoy, 2008: 709).
Abstract:
Pollen is routinely monitored, but it is unknown whether pollen counts represent allergen exposure. We therefore simultaneously determined olive pollen and Ole e 1 in ambient air in Córdoba, Spain, and Évora, Portugal, using Hirst-type traps for pollen and high-volume cascade impactors for allergen. Pollen from different days released 12-fold different amounts of Ole e 1 per pollen (both locations P < 0.001). Average allergen release from pollen (pollen potency) was much higher in Córdoba (3.9 pg Ole e 1/pollen) than in Évora (0.8 pg Ole e 1/pollen, P = 0.004). Indeed, yearly olive pollen counts in Córdoba were 2.4 times higher than in Évora, but Ole e 1 concentrations were 7.6 times higher. When modeling the origin of the pollen, >40% of Ole e 1 exposure in Évora was explained by high-potency pollen originating from the south of Spain. Thus, olive pollen can vary substantially in allergen release, even though the grains are morphologically identical.
Abstract:
Background: Evidence exists for a relationship between individual characteristics and both job and training performance; however, these relationships may not be generalizable. Little is known about the impact of therapist characteristics on performance in postgraduate therapist training programmes. Aims: The aim of this study was to investigate associations between the grades of trainee Low-Intensity and High-Intensity cognitive behavioural therapists and individual characteristics. Method: Trainee Low-Intensity (n=81) and High-Intensity (n=59) therapists completed measures of personality and cognitive ability; demographic and course grade data for participants were collected. Results: Degree classification emerged as the only variable to be significantly associated with performance across assessments and courses. Higher undergraduate degree classifications were associated with superior academic and clinical performance. Agreeableness was the only dimension of personality to be associated (positively) with clinical skill. Age was weakly and negatively associated with performance. Conclusions: Relationships between individual characteristics and training outcomes are complex and may be context specific. These results could have important implications for the selection and development of therapists for Low- or High-Intensity cognitive behavioural therapy (CBT) training.
Abstract:
In this work, lipolysis, proteolysis and viscosity of ultra-high temperature (UHT) milk containing different somatic cell counts (SCC) were investigated. UHT milks were analysed on days 8, 30, 60, 90 and 120 of storage. Lipolysis (measured as the increase in free fatty acids), casein degradation and viscosity of UHT milk were not affected by SCC but increased during storage. A negative relationship was observed between SCC and casein as a percentage of true protein on the 120th day of storage, indicating that high SCC increases the proteolysis of UHT milk by the end of its shelf life.
Abstract:
The objectives of the study were to assess changes in fine root anisotropy and specific root lengths throughout the development of Eucalyptus grandis (W. Hill ex Maiden) plantations and to establish a predictive model of root length density (RLD) from root intercept counts on trench walls. Fine root densities (<1 mm in diameter) were studied in 6-, 12-, 22-, 28-, 54-, 68- and 72-month-old E. grandis plantations established on deep Ferralsols in southern Brazil. Fine root intercepts were counted on 3 faces of 90-198 soil cubes (1 dm³ in volume) in each stand and fine root lengths (L) were measured inside 576 soil cubes, sampled between the depths of 10 cm and 290 cm. The number of fine root intercepts was counted on one vertical face perpendicular to the planting row (Nt), one vertical face parallel to the planting row (Nl) and one horizontal face (Nh), for each soil cube sampled. An overall isotropy of fine roots was shown by paired Student's t-tests between the numbers of fine roots intersecting each face of the soil cubes at most stand ages and soil depths. Specific root lengths decreased with stand age in the upper soil layers and tended to increase in deep soil layers at the end of the rotation. A linear regression established between Nt and L for all the soil cubes sampled accounted for 36% of the variability of L. Such a regression computed for mean Nt and L values at each sampling depth and stand age explained only 55% of the variability, as a result of large differences in the relationship between L and Nt depending on stand productivity. The equation RLD = 1.89 × LAI × Nt, where LAI is the stand leaf area index (m² m⁻²) and Nt is expressed as the number of root intercepts per cm², made it possible to predict accurately (R² = 0.84) and without bias the mean RLDs (cm cm⁻³) per depth in each stand, for the whole data set of 576 soil cubes sampled between 2 years of age and the end of the rotation.
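The final regression lends itself to a one-line predictor. A minimal sketch; the coefficient 1.89 and the units are from the abstract, while the LAI and Nt inputs below are illustrative values, not data from the study:

```python
def predict_rld(lai, nt, coeff=1.89):
    """Predict mean root length density (cm cm^-3) from stand leaf area index
    (m^2 m^-2) and root intercept counts per cm^2, via RLD = 1.89 * LAI * Nt."""
    return coeff * lai * nt

# Illustrative (hypothetical) inputs: LAI = 3.0 m^2 m^-2, Nt = 0.5 intercepts/cm^2
print(round(predict_rld(3.0, 0.5), 3))  # 2.835
```

Note the model is multiplicative in LAI, so the same intercept count maps to a higher predicted root density in more productive stands, which matches the abstract's point that the L-Nt relationship depends on stand productivity.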
Abstract:
Visual estimates are generally used for counts of horn flies, Haematobia irritans (L.), and play an important role as an instrument to quantify fly populations in scientific studies. In this study, horn fly counts were performed on 30 Nelore steers in the municipality of Araçatuba, SP, Brazil, from January to December 1998. Flies were counted weekly by two methods: the estimate method, whereby estimates of the number of flies on one side of the animal are obtained by visual observation, and the filming method, whereby images of flies from both sides of the animal are recorded with a video camera. The tape was then played on a videotape recorder coupled to a television and the flies were counted on the screen. Both methods showed variations in horn fly population density during the period studied. However, significant differences (p < 0.05) were observed between the two methods, with the filming method permitting the visualization of a larger number of flies than the estimate method. In addition, the filming method permitted safe and reliable counts hours after the images were taken, with the advantage that the tape can serve as an archive for random re-counts. (C) 2002 Elsevier B.V. All rights reserved.
Abstract:
Objectives: This study compared three methods of Streptococcus mutans and Lactobacillus spp. detection in the oral cavity: saliva swab (SS), a sample of stimulated saliva collected with a swab; whole saliva (WS), a sample of 2 ml of stimulated saliva; and the dental plaque method (DP), a plaque sample from all dental surfaces. Methods: Thirty children were included in this study. In the first 15 children, the SS and WS methods were carried out before the dental plaque collection, and in the following 15, the sequence was inverted to evaluate possible interference of the method sequence. The samples were diluted and inoculated on SB20 and Rogosa agar (for S. mutans and Lactobacillus spp., respectively) at 37 °C for 48 h. Results: The results (cfu/mL) for S. mutans were analysed by Friedman's test. The levels of Lactobacillus spp. were analysed by descriptive statistics owing to the high proportion of zero counts in the culture. In the first sequence of methods, the number of S. mutans counted by the SS method was inferior to DP and WS (P < 0.05), and the results for the WS and DP methods were similar. Lactobacillus spp. was detected only by the WS (100%) and SS (14.3%) methods. In the second experimental set, however, the number of S. mutans detected by the DP method was similar to those of the SS and WS methods, and the WS method showed higher values than SS (P < 0.05). A greater number of Lactobacillus spp. was detected by the WS method (100%), followed by SS (55.5%) and DP (33.3%). Conclusions: The dental plaque collection and the sample of stimulated whole saliva presented similar results in the S. mutans count. The most suitable method to detect the Lactobacillus spp. level in the oral cavity is the stimulated whole saliva method. (c) 2004 Elsevier Ltd. All rights reserved.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)