Abstract:
The genetic structure of rice tungro bacilliform virus (RTBV) populations within and between growing sites was analyzed in a collection of natural field isolates from different rice varieties grown in eight tungro-endemic sites of the Philippines. Total DNA extracts from 345 isolates were digested with the EcoRV restriction enzyme and hybridized with a full-length probe of RTBV, a procedure shown in preliminary experiments to be capable of revealing high levels of polymorphism in RTBV field isolates. In the total population, 17 distinct EcoRV-based genome profiles (genotypes) were identified and used as indicators of virus diversity. Distinct sets of genotypes occurred in Isabela and North Cotabato provinces, suggesting a geographic isolation of virus populations. However, among the sites in each province, there were few significant differences in the genotype compositions of virus populations. The number of genotypes detected at a site varied from two to nine, with a few genotypes dominating. In general, the isolates at a site persisted from season to season, indicating genetic stability in the local virus population. Over the sampling time, IRRI rice varieties, which carry green leafhopper resistance genes, supported virus populations similar to those supported by other varieties, indicating that the host variety exerted no apparent selection pressure. Insect transmission experiments on selected RTBV field isolates showed that dramatic shifts in genotype and phenotype distributions can occur in response to host/environmental shifts.
Abstract:
We have recently demonstrated the geographic isolation of rice tungro bacilliform virus (RTBV) populations in the tungro-endemic provinces of Isabela and North Cotabato, Philippines. In this study, we examined the genetic structure of the virus populations at the tungro-outbreak sites of Lanao del Norte, a province adjacent to North Cotabato. We also analyzed the virus populations at the tungro-endemic sites of Subang, Indonesia, and Dien Khanh, Vietnam. Total DNA extracts from 274 isolates were digested with EcoRV restriction enzyme and hybridized with a full-length probe of RTBV. In the total population, 22 EcoRV-restricted genome profiles (genotypes) were identified. Although overlapping genotypes could be observed, the outbreak sites of Lanao del Norte had a genotype combination distinct from that of Subang or Dien Khanh but similar to that identified earlier from North Cotabato, the adjacent endemic province. Sequence analysis of the intergenic region and part of ORF1 of the RTBV genome from randomly selected genotypes confirmed the geographic clustering of RTBV genotypes and, combined with restriction analysis, the results suggest a fragmented spatial distribution of RTBV local populations in the three countries. Because RTBV depends on rice tungro spherical virus (RTSV) for transmission, the population dynamics of both tungro viruses were then examined at the endemic and outbreak sites within the Philippines. The RTBV genotypes and the coat protein RTSV genotypes were used as indicators of virus diversity. A shift in population structure of both viruses was observed at the outbreak sites, with a reduced RTBV but increased RTSV gene diversity.
Abstract:
Catheter-associated urinary tract infections (CAUTI) are a worldwide problem that may lead to increased patient morbidity, cost and mortality [1-3]. The literature is divided on whether there are real effects from CAUTI on length of stay or mortality. Platt [4] found the costs and mortality risks to be large, yet Graves et al. found the opposite [5]. A review of the published estimates of the extra length of stay showed results between zero and 30 days [6]. The differences in estimates may have been caused by the different epidemiological methods applied. Accurately estimating the effects of CAUTI is difficult because it is a time-dependent exposure. This means that standard statistical techniques, such as matched case-control studies, tend to overestimate the increased hospital stay and mortality risk due to infection. The aim of the study was to estimate excess length of stay and mortality in an intensive care unit (ICU) due to a CAUTI, using a statistical model that accounts for the timing of infection. Data collected from ICU units in lower- and middle-income countries were used for this analysis [7,8]. There has been little research for these settings, hence the need for this paper.
Abstract:
Background: Ambulance ramping within the Emergency Department (ED) is a common problem both internationally and in Australia. Previous research has focused on various issues associated with ambulance ramping, such as access block, ED overcrowding and ambulance bypass. However, limited research has been conducted on ambulance ramping and its effects on patient outcomes.
Methods: A case-control design was used to describe, compare and predict patient outcomes of 619 ramped (cases) vs. 1238 non-ramped (control) patients arriving at one ED via ambulance from 1 June 2007 to 31 August 2007. Cases and controls were matched (on a 1:2 basis) on age, gender and presenting problem. Outcome measures included ED length of stay and in-hospital mortality.
Results: The median ramp time for all 1857 patients was 11 (IQR 6-21) min. Compared to non-ramped patients, ramped patients had a significantly longer wait time to be triaged (10 min vs. 4 min). Ramped patients also comprised a significantly higher proportion of those access blocked (43% vs. 34%). No significant difference in the proportion of in-hospital deaths was identified (2% vs. 3%). Multivariate analysis revealed that the likelihood of having an ED length of stay greater than eight hours was 34% higher among patients who were ramped (OR 1.34, 95% CI 1.06-1.70, p = 0.014). In relation to in-hospital mortality, age was the only significant independent predictor of mortality (p < 0.0001).
Conclusion: Ambulance ramping is one factor that contributes to prolonged ED length of stay and adds additional strain on ED service provision. The potential for adverse patient outcomes that may occur as a result of ramping warrants close attention by health care service providers.
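Results such as the reported OR 1.34 (95% CI 1.06-1.70) come from standard odds-ratio arithmetic. As a minimal sketch of how an odds ratio and its Wald confidence interval are computed from a 2x2 table (the example counts below are hypothetical, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    # standard error of log(OR) is the root of the summed reciprocal counts
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# hypothetical counts for ramped/non-ramped patients with a long ED stay:
or_, lo, hi = odds_ratio_ci(120, 499, 180, 1058)
```

Multivariate (logistic-regression) estimates like the one in the abstract additionally adjust for covariates, but the interpretation of the resulting OR and CI is the same.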
Abstract:
Bone development is influenced by the local mechanical environment. Experimental evidence suggests that altered loading can change cell proliferation and differentiation in chondro- and osteogenesis during endochondral ossification. This study investigated the effects of three-point bending of murine fetal metatarsal bone anlagen in vitro on cartilage differentiation, matrix mineralization and bone collar formation. This is of special interest because endochondral ossification is also an important process in bone healing and regeneration. Metatarsal preparations of 15 mouse fetuses at stage 17.5 dpc were dissected en bloc and cultured for 7 days. After 3 days in culture to allow adherence, they were stimulated for 4 days, 20 min twice daily, by a controlled bending of approximately 1000-1500 microstrain at 1 Hz. The paraffin-embedded bone sections were analyzed using histological and histomorphometric techniques. The stimulated group showed an elongated periosteal bone collar, while the total bone length was not different from controls. The region of interest (ROI), comprising the two hypertrophic zones and the intermediate calcifying diaphyseal zone, was greater in the stimulated group. The mineralized fraction of the ROI was smaller in the stimulated group, while the absolute amount of mineralized area was not different. These results demonstrate that a new device developed to apply three-point bending to a mouse metatarsal bone culture model caused an elongation of the periosteal bone collar, but did not lead to a modification in cartilage differentiation and matrix mineralization. The results corroborate the influence of biophysical stimulation during endochondral bone development in vitro. Further experiments with an altered loading regime may lead to more pronounced effects on the process of endochondral ossification and may provide further insights into the underlying mechanisms of mechanoregulation, which also play a role in bone regeneration.
Abstract:
Despite many arguments to the contrary, the three-act story structure, as propounded and refined by Hollywood, continues to dominate the blockbuster and independent film markets. Recent successes in post-modern cinema could indicate new directions and opportunities for low-budget national cinemas.
Abstract:
Background Total hip arthroplasty carried out using cemented modular-neck implants provides the surgeon with greater intra-operative flexibility and allows more controlled stem positioning. Methods In this study, finite element models of a whole femur implanted with either the Exeter or with a new cemented modular-neck total hip arthroplasty (separate neck and stem components) were developed. The changes in bone and cement mantle stress/strain were assessed for varying amounts of neck offset and version angle for the modular-neck device for two simulated physiological load cases: walking and stair climbing. Since the Exeter is the gold standard for polished cemented total hip arthroplasty stem design, bone and cement mantle stresses/strains in the modular-neck finite element models were compared with finite element results for the Exeter. Findings For the two physiological load cases, stresses and strains in the bone and cement mantle were similar for all modular-neck geometries. These results were comparable to the bone and cement mechanics surrounding the Exeter. These findings suggest that the Exeter and the modular-neck device distribute stress to the surrounding bone and cement in a similar manner. Interpretation It is anticipated that the modular-neck device will have a similar short-term clinical performance to that of the Exeter, with the additional advantages of increased modularity.
Abstract:
Activated protein C resistance (APCR), the most common risk factor for venous thrombosis, is the result of a G to A base substitution at nucleotide 1691 (R506Q) in the factor V gene. Current techniques to detect the factor V Leiden mutation, such as determination of restriction fragment length polymorphisms, do not have the capacity to screen large numbers of samples in a rapid, cost-effective test. The aim of this study was to apply first nucleotide change (FNC) technology to the detection of the factor V Leiden mutation. After preliminary amplification of genomic DNA by polymerase chain reaction (PCR), an allele-specific primer was hybridised to the PCR product and extended using fluorescent terminating dideoxynucleotides, which were detected by colorimetric assay. Using this ELISA-based assay, the prevalence of the factor V Leiden mutation was determined in an Australian blood donor population (n = 500). A total of 18 heterozygotes were identified (3.6%) and all of these were confirmed with a conventional MnlI restriction digest. No homozygotes for the variant allele were detected. We conclude from this study that the frequency of 3.6% is compatible with others published for Caucasian populations. In addition, FNC technology shows promise as the basis for a rapid, automated DNA-based test for factor V Leiden.
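The finding of 18 heterozygotes and no homozygotes in 500 donors is what Hardy-Weinberg arithmetic would predict at this sample size. A minimal sketch, using the figures from the abstract and assuming all carriers are heterozygous:

```python
def hwe_expected(n_het, n_total):
    """Estimate the variant allele frequency from a heterozygote count
    and the expected number of variant homozygotes under
    Hardy-Weinberg equilibrium (q^2 * N)."""
    # each heterozygote carries one variant allele out of 2N alleles sampled
    q = n_het / (2 * n_total)
    expected_hom = q ** 2 * n_total
    return q, expected_hom

q, expected_hom = hwe_expected(18, 500)
# q = 0.018 (1.8% allele frequency); the expected homozygote count is
# well below 1, so observing zero homozygotes is unsurprising.
```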
Abstract:
The works depicted two ostensibly plaster figures 'cocooned' in protective overalls. The pose of both figures had a sense of instability, balancing improbably due to internal weights. This teetering, arching quality, combined with the empty sleeves of the overalls, made reference to Rodin's Balzac and its aura of heroic subjectivity. As the Tyvek suits depicted in the works are a common part of my studio paraphernalia, these works sought to draw a line between these two opposing aspects of the subjectivity of the artist - the transcendent and the quotidian. The works were shown as part of ‘The Day the Machine Started’ for Dianne Tanzer Gallery + Projects at the 2010 Melbourne Art Fair. The works received citations in The Age and The Australian newspapers.
Abstract:
Understanding the relationship between diet, physical activity and health in humans requires accurate measurement of body composition and daily energy expenditure. Stable isotopes provide a means of measuring total body water (TBW) and daily energy expenditure under free-living conditions. While the use of isotope ratio mass spectrometry (IRMS) for the analysis of 2H (deuterium) and 18O (oxygen-18) is well established in the field of human energy metabolism research, numerous questions remain regarding the factors which influence analytical and measurement error using this methodology. This thesis comprised four studies with the following emphases. The aim of Study 1 was to determine the analytical and measurement error of the IRMS with regard to sample handling under certain conditions. Study 2 involved the comparison of TEE (total daily energy expenditure) using two commonly employed equations. Further, saliva and urine samples, collected at different times, were used to determine if clinically significant differences would occur. Study 3 was undertaken to determine the appropriate collection times for TBW estimates and derived body composition values. Finally, Study 4 was a single case study investigating whether TEE measures are affected when the human condition changes due to altered exercise and water intake. The aim of Study 1 was to validate laboratory approaches to measure isotopic enrichment to ensure accurate (to international standards), precise (reproducibility of three replicate samples) and linear (isotope ratio constant over the expected concentration range) results. This established the machine variability for the IRMS equipment in use at Queensland University for both TBW and TEE. Using either 0.4 mL or 0.5 mL sample volumes for both oxygen-18 and deuterium was statistically acceptable (p>0.05) and showed a within-analysis variance of 5.8 delta VSMOW units for deuterium and 0.41 delta VSMOW units for oxygen-18.
This variance was used as "within analytical noise" to determine sample deviations. It was also found that there was no influence of equilibration time on oxygen-18 or deuterium values when comparing the minimum (oxygen-18: 24 hr; deuterium: 3 days) and maximum (oxygen-18 and deuterium: 14 days) equilibration times. With regard to preparation using the vacuum line, any order of preparation is suitable, as the TEE values fall within 8% of each other regardless of preparation order. An 8% variation is acceptable for TEE values due to biological and technical errors (Schoeller, 1988). However, for the automated line, deuterium must be assessed first, followed by oxygen-18, as the automated machine line does not evacuate tubes but merely refills them with an injection of gas for a predetermined time. Any fractionation (which may occur for both isotopes) would cause a slight elevation in the values and hence a lower TEE. The purpose of the second and third studies was to investigate the use of IRMS to measure TEE and TBW, and to validate the current IRMS practices in use with regard to sample collection times of urine and saliva, the use of two TEE equations from different research centres, and the body composition values derived from these TEE and TBW values. Following the collection of a fasting baseline urine and saliva sample, 10 people (8 women, 2 men) were dosed with a doubly labeled water dose comprising 1.25 g 10% oxygen-18 and 0.1 g 100% deuterium per kg body weight. Samples were collected hourly for 12 hrs on the first day, and then morning, midday and evening samples were collected for the next 14 days. The samples were analyzed using an isotope ratio mass spectrometer. For TBW, time to equilibration was determined using three commonly employed data analysis approaches. Isotopic equilibration was reached in 90% of the sample by hour 6, and in 100% of the sample by hour 7.
With regard to the TBW estimations, the optimal time for urine collection was found to be between hours 4 and 10, as there was no significant difference between values in this window. In contrast, statistically significant differences in TBW estimations were found between hours 1-3 and hours 11-12 when compared with hours 4-10. Most of the individuals in this study were in equilibrium after 7 hours. The TEE equations of Prof. Dale Schoeller (Chicago, USA, IAEA) and Prof. K. Westerterp were compared with that of Prof. Andrew Coward (Dunn Nutrition Centre). When comparing values derived from samples collected in the morning and evening, there was no effect of time or equation on the resulting TEE values. The fourth study was a pilot study (n=1) to test the variability in TEE as a result of manipulations in fluid consumption and level of physical activity, of the magnitude of change which may be expected in a sedentary adult. Physical activity levels were manipulated by increasing the number of steps per day to mimic the increases that may result when a sedentary individual commences an activity program. The study comprised three sub-studies completed on the same individual over a period of 8 months. There were no significant changes in TBW across the studies, even though the elimination rates changed with the supplemented water intake and additional physical activity. The extra activity may not have been sufficiently strenuous, nor the water intake high enough, to cause a significant change in TBW and hence in the CO2 production and TEE values. The TEE values measured show good agreement with estimated values calculated from an RMR of 1455 kcal/day, a DIT of 10% of TEE, and activity based on measured steps. The covariance values tracked when plotting the residuals were found to be representative of "well-behaved" data and are indicative of the analytical accuracy.
The ratio and product plots were found to reflect the water turnover and CO2 production and thus could, with further investigation, be employed to identify the changes in physical activity.
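As a rough illustration of how the doubly labeled water method converts isotope elimination rates into a TEE estimate, the sketch below uses a simplified two-point calculation: CO2 production is proportional to the body water pool times the difference between the oxygen-18 and deuterium elimination rates, and energy expenditure is then obtained via the Weir equation with an assumed respiratory quotient. Fractionation corrections and dilution-space adjustments, which published production equations (e.g. Schoeller's) include, are omitted here, and the pool size, elimination rates and RQ below are hypothetical.

```python
def tee_kcal_per_day(tbw_mol, k_o, k_d, rq=0.85):
    """Simplified doubly-labeled-water TEE estimate.
    tbw_mol: total body water pool (mol)
    k_o, k_d: oxygen-18 and deuterium elimination rates (1/day)
    rq: assumed respiratory quotient (VCO2/VO2)"""
    # CO2 production: oxygen-18 leaves via water AND CO2, deuterium via
    # water only, so the rate difference reflects CO2 output.
    r_co2_mol = (tbw_mol / 2.078) * (k_o - k_d)   # mol/day (corrections omitted)
    r_co2_litres = r_co2_mol * 22.4               # litres/day at STP
    # Weir equation, with VO2 inferred from the assumed RQ:
    return r_co2_litres * (1.106 + 3.941 / rq)

# hypothetical adult: ~40 L water pool (~2200 mol), kO = 0.12/day, kD = 0.10/day
tee = tee_kcal_per_day(2200, 0.12, 0.10)
```

With these illustrative inputs the estimate lands in the usual adult range of roughly 2000-3000 kcal/day, which is why small changes in elimination rates (as manipulated in Study 4) translate into measurable TEE shifts.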
Abstract:
Older adults, especially those acutely ill, are vulnerable to developing malnutrition due to a range of risk factors. The high prevalence and extensive consequences of malnutrition in hospitalised older adults have been reported extensively. However, there are few well-designed longitudinal studies that report the independent relationship between malnutrition and clinical outcomes after adjustment for a wide range of covariates. Acutely ill older adults are exceptionally prone to nutritional decline during hospitalisation, but few reports have studied this change and its impact on clinical outcomes. In the rapidly ageing Singapore population, all this evidence is lacking, and the characteristics associated with the risk of malnutrition are also not well documented. Despite the evidence on malnutrition prevalence, it is often under-recognised and under-treated. It is therefore crucial that validated nutrition screening and assessment tools are used for early identification of malnutrition. Although many nutrition screening and assessment tools are available, there is no universally accepted method for defining malnutrition risk and nutritional status. Most existing tools have been validated amongst Caucasians using various approaches, but they are rarely reported in the Asian elderly and none has been validated in Singapore. Due to the ethnic, cultural and language differences among Singapore older adults, the results from non-Asian validation studies may not be applicable. Therefore, it is important to identify validated population- and setting-specific nutrition screening and assessment methods to accurately detect and diagnose malnutrition in Singapore.
The aims of this study are therefore to: i) characterise hospitalised elderly in a Singapore acute hospital; ii) describe the extent and impact of admission malnutrition; iii) identify and evaluate suitable methods for nutritional screening and assessment; and iv) examine changes in nutritional status during admission and their impact on clinical outcomes. A total of 281 participants, with a mean (±SD) age of 81.3 (±7.6) years, were recruited from three geriatric wards in Tan Tock Seng Hospital over a period of eight months. They were predominantly Chinese (83%) and community-dwellers (97%). They were screened within 72 hours of admission by a single dietetic technician using four nutrition screening tools [Tan Tock Seng Hospital Nutrition Screening Tool (TTSH NST), Nutritional Risk Screening 2002 (NRS 2002), Mini Nutritional Assessment-Short Form (MNA-SF), and Short Nutritional Assessment Questionnaire (SNAQ©)] that were administered in no particular order. The total scores were not computed during the screening process so that the dietetic technician was blinded to the results of all the tools. Nutritional status was assessed by a single dietitian, who was blinded to the screening results, using four malnutrition assessment methods [Subjective Global Assessment (SGA), Mini Nutritional Assessment (MNA), body mass index (BMI), and corrected arm muscle area (CAMA)]. The SGA rating was completed prior to computation of the total MNA score to minimise bias. Participants were reassessed for weight, arm anthropometry (mid-arm circumference, triceps skinfold thickness), and SGA rating at discharge from the ward. The nutritional assessment tools and indices were validated against clinical outcomes (length of stay (LOS) >11 days, discharge to higher level care, 3-month readmission, 6-month mortality, and 6-month Modified Barthel Index) using multivariate logistic regression.
The covariates included age, gender, race, dementia (defined using DSM-IV criteria), depression (defined using a single question, "Do you often feel sad or depressed?"), severity of illness (defined using a modified version of the Severity of Illness Index), comorbidities (defined using the Charlson Comorbidity Index), number of prescribed drugs, and admission functional status (measured using the Modified Barthel Index; MBI). The nutrition screening tools were validated against the SGA, which was found to be the most appropriate nutritional assessment tool in this study (refer to Section 5.6). Prevalence of malnutrition on admission was 35% (defined by SGA), and it was significantly associated with characteristics such as swallowing impairment (malnourished vs well-nourished: 20% vs 5%), poor appetite (77% vs 24%), dementia (44% vs 28%), depression (34% vs 22%), and poor functional status (MBI 48.3±29.8 vs 65.1±25.4). The SGA had the highest completion rate (100%) and was predictive of the highest number of clinical outcomes: LOS >11 days (OR 2.11, 95% CI [1.17-3.83]), 3-month readmission (OR 1.90, 95% CI [1.05-3.42]) and 6-month mortality (OR 3.04, 95% CI [1.28-7.18]), independent of a comprehensive range of covariates including functional status, disease severity and cognitive function. The SGA is therefore the most appropriate nutritional assessment tool for defining malnutrition. The TTSH NST was identified as the most suitable nutrition screening tool, with the best diagnostic performance against the SGA (AUC 0.865, sensitivity 84%, specificity 79%). Overall, 44% of participants experienced weight loss during hospitalisation, and 27% had weight loss >1% per week over a median LOS of 9 days (range 2-50). Well-nourished (45%) and malnourished (43%) participants were equally prone to experiencing a decline in nutritional status (defined by weight loss >1% per week).
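Diagnostic-performance figures like those reported for the TTSH NST (sensitivity 84%, specificity 79% against the SGA) are derived from paired screen/reference classifications. A minimal sketch of that computation, with made-up labels rather than study data:

```python
def screen_performance(screen, reference):
    """Sensitivity and specificity of a binary screening result against a
    binary reference standard (True = malnourished)."""
    pairs = list(zip(screen, reference))
    tp = sum(1 for s, r in pairs if s and r)          # screen +, reference +
    tn = sum(1 for s, r in pairs if not s and not r)  # screen -, reference -
    fn = sum(1 for s, r in pairs if not s and r)      # missed cases
    fp = sum(1 for s, r in pairs if s and not r)      # false alarms
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# hypothetical screening results vs SGA ratings:
screen    = [True, True, False, True, False, False]
reference = [True, True, True, False, False, False]
sens, spec = screen_performance(screen, reference)  # 2/3 and 2/3 here
```

The AUC reported in the abstract extends this idea across all possible cut-off scores of the screening tool rather than a single threshold.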
Those with reduced nutritional status were more likely to be discharged to higher level care (adjusted OR 2.46, 95% CI [1.27-4.70]). This study is the first to characterise malnourished hospitalised older adults in Singapore. It is also one of the very few studies to (a) evaluate the association of admission malnutrition with clinical outcomes in a multivariate model; (b) determine the change in nutritional status during admission; and (c) evaluate the validity of nutrition screening and assessment tools amongst hospitalised older adults in an Asian population. Results clearly highlight that admission malnutrition and deterioration in nutritional status are prevalent and are associated with adverse clinical outcomes in hospitalised older adults. With older adults being vulnerable to the risks and consequences of malnutrition, it is important that they are systematically screened so that timely and appropriate intervention can be provided. The findings highlighted in this thesis provide an evidence base for, and confirm the validity of, the current nutrition screening and assessment tools used among hospitalised older adults in Singapore. As older adults may have developed malnutrition prior to hospital admission, or experienced clinically significant weight loss of >1% per week of hospitalisation, screening of the elderly should be initiated in the community and continuous nutritional monitoring should extend beyond hospitalisation.