Abstract:
Aim - To evaluate the reproducibility of the background fundus autofluorescence measurements obtained using a confocal scanning laser ophthalmoscope. Methods - 10 normal volunteers and 10 patients with retinal disease were included in the study. One eye per subject was chosen randomly. Five images of the same eye of each individual were obtained, after pupillary dilatation, by two investigators using a confocal scanning laser ophthalmoscope. Background fundus autofluorescence was measured at 7 degrees temporal to the fovea in normal volunteers and between 7 and 15 degrees temporal to the fovea in patients. Within session reproducibility of the measurements obtained by each investigator and interobserver reproducibility were evaluated. Results - For investigator 1 the median values of fundus autofluorescence obtained were 31.9 units for normal volunteers and 27.3 units for patients. The median largest difference in readings in normal volunteers was 5.7 units (range 1.4-13.5 units) and in patients 4.2 units (1.5-15.1 units). For investigator 2 the median values of fundus autofluorescence obtained were 28.9 units for normal volunteers and 27.4 units for patients. The median largest difference in readings in normal volunteers was 3.6 units (2.7-11.7 units), and in patients 4.1 units (1.5-9.3 units). The median interobserver difference in readings in normal volunteers was 3.3 units and for patients 6.6 units. The median greatest interobserver difference in measurements obtained for normal volunteers was 8.8 units (8.4-23.0 units) and for patients 11.1 units (7.1-40.8 units). Conclusion - Within session reproducibility of the measurements of background fundus autofluorescence was satisfactory. Although interobserver reproducibility was moderate, the variability of the measurements of fundus autofluorescence between observers appears to be small when compared with variation in fundus autofluorescence with age and disease.
Abstract:
Purpose: The authors sought to quantify neighboring and distant interpoint correlations of threshold values within the visual field in patients with glaucoma. Methods: Visual fields of patients with confirmed or suspected glaucoma were analyzed (n = 255). One eye per patient was included. Patients were examined using the 32 program of the Octopus 1-2-3. Linear regression analysis among each of the locations and the rest of the points of the visual field was performed, and the correlation coefficient was calculated. The degree of correlation was categorized as high (r > 0.66), moderate (0.66 ≥ r > 0.33), or low (r ≤ 0.33). The standard error of threshold estimation was calculated. Results: Most locations of the visual field had high and moderate correlations with neighboring points and with distant locations corresponding to the same nerve fiber bundle. Locations of the visual field had low correlations with those of the opposite hemifield, with the exception of locations temporal to the blind spot. The standard error of threshold estimation increased from 0.6 to 0.9 dB with an r reduction of 0.1. Conclusion: Locations of the visual field have highest interpoint correlation with neighboring points and with distant points in areas corresponding to the distribution of the retinal nerve fiber layer. The quantification of interpoint correlations may be useful in the design and interpretation of visual field tests in patients with glaucoma.
Abstract:
Genetic risk factors for chronic kidney disease (CKD) are being identified through international collaborations. By comparison, epigenetic risk factors for CKD have only recently been considered using population-based approaches. DNA methylation is a major epigenetic modification that is associated with complex diseases, so we investigated methylome-wide loci for association with CKD. A total of 485,577 unique features were evaluated in 255 individuals with CKD (cases) and 152 individuals without evidence of renal disease (controls). Following stringent quality control, raw data were quantile normalized and β values calculated to reflect the methylation status at each site. The difference in methylation status was evaluated between cases and controls with resultant P values adjusted for multiple testing. Genes with significantly increased and decreased levels of DNA methylation were considered for biological relevance by functional enrichment analysis using KEGG pathways in Partek Genomics Suite. Twenty-three genes, where more than one CpG per locus was identified with adjusted P < 10^-8, demonstrated significant methylation changes associated with CKD, and additional support for these associated loci was sought from published literature. Strong biological candidates for CKD that showed statistically significant differential methylation include the CUX1, ELMO1, FKBP5, INHBA-AS1, PTPRN2, and PRKAG2 genes; several genes are differentially methylated in kidney tissue, and RNA-seq supports a functional role for differential methylation in the ELMO1 and PRKAG2 genes. This study reports the largest, most comprehensive, genome-wide quantitative evaluation of DNA methylation for association with CKD. Evidence confirming that methylation sites influence development of CKD would stimulate research to identify epigenetic therapies that might be clinically useful for CKD.
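The β-value calculation and multiple-testing adjustment mentioned above can be illustrated as follows. The intensity values are invented, the offset of 100 is the common Illumina-array convention (an assumption; the abstract only says β values were calculated), and Benjamini-Hochberg is used purely as a generic example since the abstract does not name the exact adjustment procedure:

```python
def beta_value(methylated, unmethylated, offset=100):
    """Methylation beta value: beta = M / (M + U + offset), in [0, 1).
    The offset (conventionally 100 for Illumina arrays) stabilises
    low-intensity probes."""
    return methylated / (methylated + unmethylated + offset)

def benjamini_hochberg(p_values):
    """Benjamini-Hochberg FDR adjustment of raw P values (generic
    example; the study states only 'adjusted for multiple testing')."""
    n = len(p_values)
    order = sorted(range(n), key=lambda i: p_values[i])
    adjusted = [0.0] * n
    running_min = 1.0
    # Walk from the largest P value down, enforcing monotonicity.
    for k, i in enumerate(reversed(order)):
        rank = n - k
        running_min = min(running_min, p_values[i] * n / rank)
        adjusted[i] = running_min
    return adjusted

print(round(beta_value(900, 100), 3))  # 0.818: heavily methylated site
print(benjamini_hochberg([0.01, 0.04, 0.03, 0.5]))
```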
Abstract:
Mineral exploration programmes around the world use data from remote sensing, geophysics and direct sampling. On a regional scale, the combination of airborne geophysics and ground-based geochemical sampling can aid geological mapping and economic minerals exploration. Because airborne geophysical and traditional soil-sampling data are generated at different spatial resolutions and sampling densities, they are not immediately comparable. Several geostatistical techniques, including indicator cokriging and collocated cokriging, can be used to integrate different types of data into a geostatistical model. With increasing numbers of variables, the inference of the cross-covariance model required for cokriging can be demanding in terms of effort and computational time. In this paper a Gaussian-based Bayesian updating approach is applied to integrate airborne radiometric data and ground-sampled geochemical soil data to maximise information generated from the soil survey, to enable more accurate geological interpretation for the exploration and development of natural resources. The Bayesian updating technique decomposes the collocated estimate into the product of two models: a prior and a likelihood model. The prior model is built from primary information and the likelihood model is built from secondary information. The prior model is then updated with the likelihood model to build the final model. The approach allows multiple secondary variables to be simultaneously integrated into the mapping of the primary variable. The Bayesian updating approach is demonstrated using a case study from Northern Ireland, where the history of mineral prospecting for precious and base metals dates from the 18th century. Vein-hosted, strata-bound and volcanogenic occurrences of mineralisation are found.
The geostatistical technique was used to improve the resolution of soil geochemistry, collected one sample per 2 km2, by integrating more closely measured airborne geophysical data from the GSNI Tellus Survey, measured over a footprint of 65 x 200 m. The directly measured geochemistry data were considered as primary data in the Bayesian approach and the airborne radiometric data were used as secondary data. The approach produced more detailed updated maps and in particular maximized information on mapped estimates of zinc, copper and lead. Greater delineation of an elongated northwest/southeast trending zone in the updated maps strengthened the potential to investigate stratabound base metal deposits.
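Under the Gaussian assumption, the decomposition of the collocated estimate into prior and likelihood models reduces, at a single location, to a precision-weighted combination. A minimal one-variable sketch with invented zinc values, not the authors' implementation:

```python
def bayesian_update(prior_mean, prior_var, lik_mean, lik_var):
    """Gaussian Bayesian updating: the posterior is the precision-weighted
    product of the prior model (here, from the sparse soil geochemistry,
    the primary data) and the likelihood model (from the dense airborne
    radiometrics, the secondary data)."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / lik_var)
    post_mean = post_var * (prior_mean / prior_var + lik_mean / lik_var)
    return post_mean, post_var

# Hypothetical zinc concentration (ppm) at one grid node: a confident
# prior is nudged towards the less certain radiometric-derived estimate,
# and the updated variance is smaller than either input variance.
mean, var = bayesian_update(prior_mean=60.0, prior_var=25.0,
                            lik_mean=80.0, lik_var=100.0)
print(round(mean, 6), round(var, 6))  # 64.0 20.0
```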
Abstract:
Background: Cataract extraction is the most commonly performed surgery in the National Health Service. Myopia increases the risk of postoperative rhegmatogenous retinal detachment (RRD). The aim of this study was to determine the incidence and rate of RRD seven years after cataract extraction in highly myopic eyes. Methods: Retrospective review was performed of notes of all high myopes (axial length 26.0 mm or more) who underwent cataract extraction during the study period in one centre. Results: 84 eyes met the study criteria. Follow-up time from surgery was 93 to 147 months (median 127 months). The average axial length was 28.72 mm (sd 1.37). Two eyes developed post-operative RRD; the incidence was 2.4% and the rate one RRD per 441.6 person-years. The results of 15 other studies on the incidence of RRD after cataract extraction in high myopia were pooled and combined with our estimate. Conclusion: Both patients in our study who developed RRD had risk factors for this complication as well as high myopia. Risk factors are discussed in the light of our results and the pooled estimate. Our follow-up time is longer than most. Future case series should calculate rates to allow meaningful comparison of case series. © The Ulster Medical Society, 2009.
Abstract:
We have developed a model to predict the post-collision brightness increase of sub-catastrophic collisions between asteroids and to evaluate the likelihood of a survey detecting these events. It is based on the cratering scaling laws of Holsapple and Housen (2007) and models the ejecta expansion following an impact as occurring in discrete shells, each with their own velocity. We estimate the magnitude change between a series of target/impactor pairs, assuming it is given by the increase in reflecting surface area within a photometric aperture due to the resulting ejecta. As expected the photometric signal increases with impactor size, but we find also that the photometric signature decreases rapidly as the target asteroid diameter increases, due to gravitational fallback. We have used the model results to make an estimate of the impactor diameter for the (596) Scheila collision of D = 49–65 m depending on the impactor taxonomy, which is broadly consistent with previous estimates. We varied both the strength regime (highly porous and sand/cohesive soil) and the taxonomic type (S-, C- and D-type) to examine the effect on the magnitude change, finding that it is significant at early stages but has only a small effect on the overall lifetime of the photometric signal. Combining the results of this model with the collision frequency estimates of Bottke et al. (2005), we find that low-cadence surveys of ~one visit per lunation will be insensitive to impacts on asteroids with D < 20 km if relying on photometric detections.
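The core photometric idea, brightening proportional to the extra reflecting area inside the aperture, can be written in one line. This toy version assumes equal albedo for target and ejecta and ignores the shell-by-shell ejecta expansion the paper actually models:

```python
import math

def magnitude_change(target_area, ejecta_area):
    """Brightening (in magnitudes; negative = brighter) when ejecta adds
    reflecting area inside the photometric aperture:
        delta_m = -2.5 * log10(1 + A_ejecta / A_target)
    Assumes the same albedo for ejecta and target (a simplification)."""
    return -2.5 * math.log10(1.0 + ejecta_area / target_area)

# Doubling the reflecting area brightens the object by ~0.75 mag.
print(round(magnitude_change(1.0, 1.0), 3))  # -0.753

# The same ejecta on a larger target gives a weaker photometric
# signature, qualitatively matching the trend in the abstract.
print(abs(magnitude_change(10.0, 1.0)) > abs(magnitude_change(100.0, 1.0)))  # True
```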
Abstract:
Many organic molecules have strong absorption bands which can be accessed by ultraviolet short pulse lasers to produce efficient ionization. This resonant multiphoton ionization scheme has already been exploited as an ionization source in time-of-flight mass spectrometers used for environmental trace analysis. In the present work we quantify the ultimate potential of this technique by measuring absolute ion yields produced from the interaction of 267 nm femtosecond laser pulses with the organic molecules indole and toluene, and gases Xe, N2 and O2. Using multiphoton ionization cross sections extracted from these results, we show that the laser pulse parameters required for real-time detection of aromatic molecules at concentrations of one part per trillion in air and a limit of detection of a few attomoles are achievable with presently available commercial laser systems. The potential applications for the analysis of human breath, blood and tissue samples are discussed.
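The one-part-per-trillion figure can be made concrete with a back-of-the-envelope count of analyte molecules in a probed volume of air. The numbers below (approximate molar volume of air, an assumed focal volume) are illustrative assumptions, not values from the paper:

```python
AVOGADRO = 6.02214076e23
MOLAR_VOLUME_L = 24.0  # litres per mole of air at ~20 degrees C, 1 atm (approx.)

def molecules_in_volume(concentration_ppt, volume_cm3):
    """Count of analyte molecules in a probed air volume at a mixing
    ratio given in parts per trillion (hypothetical illustration)."""
    number_density_per_cm3 = AVOGADRO / (MOLAR_VOLUME_L * 1000.0)
    return number_density_per_cm3 * concentration_ppt * 1e-12 * volume_cm3

# At 1 ppt, an assumed 1e-6 cm^3 laser focal volume still holds tens of
# molecules, and one attomole corresponds to ~600,000 molecules, so
# attomole-level limits of detection are dimensionally plausible.
print(round(molecules_in_volume(1.0, 1e-6)))  # 25
```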
Abstract:
Evidence for osseous technologies has featured in excavation reports from Southeast Asia for almost a century and from archaeological deposits as old as 43,000 years BP. However, in contrast to the significance that is placed on this technology in other parts of the world, until recently, Southeast Asian assemblages have drawn only very limited attention. Concentrating on evidence from Malaysia, the current paper examines one element of this inventory of tools: the deliberate modification of pig canines and the means by which such alteration can be distinguished from patterns of natural tooth wear. Particular attention is paid to the bearded pig (Sus barbatus), as it is one of the two species of wild boar in Malaysia whose tusks are most likely to have been used by prehistoric toolmakers. Reference is also made to wider, regional ethnographic examples of known tusk implements and their accredited uses to further assist in the identification process. Distinguishing criteria for worked tusk are formulated according to the type and extent of modification. These criteria are then applied to archaeological specimens recovered from two prehistoric cave sites in Malaysia, Gua Bintong and Niah Cave.
Abstract:
Implications Provision of environmental enrichment in line with that required by welfare-based quality assurance schemes does not always appear to lead to clear improvements in broiler chicken welfare. This research perhaps serves to highlight the deficit in information regarding the 'real world' implications of enrichment with perches, string and straw bales.
Introduction Earlier work showed that provision of natural light and straw bales improved leg health in commercial broiler chickens (Bailie et al., 2013). This research aimed to determine if additional welfare benefits were shown in windowed houses by increasing straw bale provision (Study 1), or by providing perches and string in addition to straw bales (Study 2).
Material and methods Commercial windowed houses in Northern Ireland containing ~23,000 broiler chickens (placed in houses as hatched) were used in this research, which took place in 2011. In Study 1 two houses on a single farm were assigned to one of two treatments: (1) 30 straw bales per house (1 bale/44 m2), or (2) 45 straw bales per house (1 bale/29 m2). Bales of wheat straw, each measuring 80 cm x 40 cm x 40 cm, were provided from day 10 of the rearing cycle, as in Bailie et al. (2013). Treatments were replicated over 6 production cycles (using 276,000 Ross 308 and Cobb birds), and were swapped between houses in each replicate. In Study 2, four houses on a single farm were assigned to 1 of 4 treatments in a 2 x 2 factorial design. Treatments involved 2 levels of access to perches (present (24/house), or absent), and 2 levels of access to string (present (24/house), or absent), and both types of enrichment were provided from the start of the cycle. Each perch consisted of a horizontal, wooden beam (300 cm x 5 cm x 5 cm) with a rounded upper edge resting on 2 supports (15 cm high). In the string treatment, 6 pieces of white nylon string (60 cm x 10 mm) were tied at their mid-point to the wire above each of 4 feeder lines. Thirty straw bales were also provided per house from day 10. This study was replicated over 4 production cycles using 368,000 Ross 308 birds. In both studies behaviour was observed between 0900 and 1800 hours in weeks 3-5 of the cycle. In Study 1, 8 focal birds were selected in each house each week, and general activity, exploratory and social behaviours recorded directly for 10 minutes. In Study 2, 10 minute video recordings were made of 6 different areas (that did not contain enrichment) of each house each week. The percentage of birds engaged in locomotion or standing was determined through scan sampling these recordings at 120 second intervals.
Four perches and four pieces of string were filmed for 25 mins in each house that contained these enrichments on one day per week. The total number of times the perch or string was used was recorded, along with the duration of each bout. In both studies, gait scores (0 (perfect) to 5 (unable to walk)) and latency to lie (measured in seconds from when a bird had been encouraged to stand) were recorded in 25 birds in each house each week. Farm and abattoir records were also used in both studies to determine the number of birds culled for leg and other problems, mortality levels, slaughter weights, and levels of pododermatitis and hock burn. Data were analysed using SPSS (version 20.0), and treatment and age effects on behavioural parameters were determined in normally distributed data using ANOVA ('straw bale density*week', or 'string*perches*week' as appropriate), and in non-normally distributed data using Kruskal-Wallis tests (P<0.05 for significance). Treatment (but not age) effects on performance and health data were determined using the same tests depending on normality of data.
Results Average slaughter weight, and levels of mortality, culling, hock burn and pododermatitis were not affected by treatment in either study (P>0.05). In Study 1 straw bale (SB) density had no significant effect on the frequency or duration of behaviours including standing, walking, ground pecking, dust bathing, pecking at bales or aggression, or on average gait score (P>0.05). However, the average latency to lie was greater when fewer SB were provided (30SB 23.38 s, 45SB 18.62 s, P<0.01). In Study 2 there was an interaction between perches (Pe) and age in lying behaviour, with higher percentages of birds observed lying in the Pe treatment during weeks 4 and 5 (week 3: +Pe 77.0, -Pe 80.9; week 4: +Pe 79.5, -Pe 75.2; week 5: +Pe 78.4, -Pe 76.2; P<0.02). There was also a significant interaction between string (S) and age in locomotory behaviour, with higher percentages of birds observed in locomotion in the string treatment during week 3 but not weeks 4 and 5 (week 3: +S 4.9, -S 3.9; week 4: +S 3.3, -S 3.7; week 5: +S 2.6, -S 2.8; P<0.04). There was also an interaction between S and age in average gait scores, with lower gait scores in the string treatment in weeks 3 and 5 (week 3: +S 0.7, -S 0.9; week 4: +S 1.5, -S 1.4; week 5: +S 1.9, -S 2.0; P<0.05). On average, per 25 min observation there were 15.1 (±13.6) bouts of perching and 19.2 (±14.08) bouts of string pecking, lasting 117.4 (±92.7) and 4.2 (±2.0) s for perches and string, respectively.
Conclusion Increasing straw bale levels from 1 bale/44 m2 to 1 bale/29 m2 floor space does not appear to lead to significant improvements in the welfare of broilers in windowed houses. The frequent use of perches and string suggests that these stimuli have the potential to improve welfare. Provision of string also appeared to positively influence walking ability. However, this effect was numerically small, was only shown in certain weeks and was not reflected in the latency to lie. Further research on optimum design and level of provision of enrichment items for broiler chickens is warranted. This should include measures of overall levels of activity (both in the vicinity of, and away from, enrichment items).
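The Kruskal-Wallis test the study applied to non-normally distributed data can be sketched from scratch (the study itself used SPSS; in Python one would normally call `scipy.stats.kruskal`). The gait-score values below are hypothetical, and this version omits the tie correction:

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction): rank all values
    jointly, then compare group rank sums.
    H = 12 / (N (N + 1)) * sum(R_i^2 / n_i) - 3 (N + 1)."""
    pooled = sorted(x for g in groups for x in g)
    n_total = len(pooled)
    # Assign each distinct value the mean of the ranks it occupies.
    rank = {}
    i = 0
    while i < n_total:
        j = i
        while j < n_total and pooled[j] == pooled[i]:
            j += 1
        rank[pooled[i]] = (i + 1 + j) / 2.0  # mean of ranks i+1 .. j
        i = j
    h = 0.0
    for g in groups:
        r_sum = sum(rank[x] for x in g)
        h += r_sum ** 2 / len(g)
    return 12.0 / (n_total * (n_total + 1)) * h - 3 * (n_total + 1)

# Hypothetical gait scores (0-5 scale) under two enrichment treatments.
print(round(kruskal_wallis_h([0, 1, 1, 2], [2, 3, 3, 4]), 4))  # 4.6875
```

The H statistic is then compared against a chi-squared distribution with (number of groups - 1) degrees of freedom to obtain the P value.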
Abstract:
Introduction
The use of video capture of lectures in Higher Education is not a recent development, with web-based learning technologies, including digital recording of live lectures, becoming increasingly commonly offered by universities throughout the world (Holliman and Scanlon, 2004). However, in the past decade improvements in technical infrastructure, including the availability of high-speed broadband, have increased the potential and use of video lecture capture. This has led to a variety of lecture capture formats, including podcasting, live streaming and delayed broadcasting of whole or part lectures.
Additionally, in the past five years there has been a significant increase in the popularity of online learning, specifically via Massive Open Online Courses (MOOCs) (Vardi, 2014). One of the key aspects of MOOCs is the simulated recording of lecture-like activities. There has been, and continues to be, much debate on the consequences of the popularity of MOOCs, especially in relation to their potential uses within established university programmes.
There have been a number of studies dedicated to the effects of videoing lectures.
The clustered areas of research in video lecture capture have the following main themes:
• Staff perceptions including attendance, performance of students and staff workload
• Reinforcement versus replacement of lectures
• Improved flexibility of learning
• Facilitating engaging and effective learning experiences
• Student usage, perception and satisfaction
• Facilitating students learning at their own pace
Most of the research has concentrated on student and faculty perceptions, including academic achievement, student attendance and engagement (Johnston et al., 2012).
Generally the research has reviewed the benefits of lecture capture positively for both students and faculty. This perception, coupled with technical infrastructure improvements and student demand, may well mean that the use of video lecture capture will continue to increase in tertiary education over the coming years. However, there is relatively little research on the effects of lecture capture specifically in the area of computer programming, with Watkins et al. (2007) being one of few studies. Video delivery of programming solutions is particularly useful for enabling a lecturer to illustrate the complex decision-making processes and iterative nature of the actual code development process (Watkins et al., 2007). As such, research in this area would appear to be particularly appropriate to help inform debate and future decisions made by policy makers.
Research questions and objectives
The purpose of the research was to investigate how a series of lecture captures (in which the audio of lectures and video of on-screen projected content were recorded) impacted on the delivery and learning of a programme of study in an MSc Software Development course at Queen's University Belfast, Northern Ireland. The MSc is a conversion programme, intended to take graduates from non-computing primary degrees and upskill them in this area. The research specifically targeted the Java programming module within the course. It also analyses and reports on empirical data from attendance records and video viewing statistics. In addition, qualitative data were collected from staff and student feedback to help contextualise the quantitative results.
Methodology, Methods and Research Instruments Used
The study was conducted with a cohort of 85 post graduate students taking a compulsory module in Java programming in the first semester of a one year MSc in Software Development. A pre-course survey of students found that 58% preferred to have available videos of “key moments” of lectures rather than whole lectures. A large scale study carried out by Guo concluded that “shorter videos are much more engaging” (Guo 2013). Of concern was the potential for low audience retention for videos of whole lectures.
The lecturers recorded snippets of the lecture directly before or after its physical delivery, in a quiet environment, and then uploaded the video directly to a closed YouTube channel. These snippets generally concentrated on significant parts of the theory, followed by related coding demonstrations, and were faithful replications of the face-to-face lecture. Generally each lecture was supported by two to three videos with durations ranging from 20 to 30 minutes.
Attendance
The MSc programme has several attendance-based modules, of which Java Programming was one. In order to assess the effect on attendance for the programming module, a control was established: a Database module taken by the same students and running in the same semester.
Access engagement
The videos were hosted on a closed YouTube channel made available only to the students in the class. The channel had analytics enabled, which reported on the following for all videos and for each individual video: views (hits), audience retention, viewing devices/operating systems used, and minutes watched.
Student attitudes
Three surveys were conducted to investigate student attitudes towards the videoing of lectures: the first before the start of the programming module, the second at its mid-point, and the third after the programme was complete.
The questions in the first survey were targeted at eliciting student attitudes towards lecture capture before they had experienced it in the programme. The midpoint survey gathered data on how the students had been using the system up to that point, including how many videos an individual had watched, viewing duration, primary reasons for watching and the effect on attendance, in addition to probing for comments or suggestions. The final survey, on course completion, contained questions similar to the midpoint survey but took a summative view of the whole video programme.
Conclusions and Outcomes
The study confirmed findings of other such investigations, illustrating that there is little or no effect on attendance at lectures. The use of the videos appears to help promote continual learning, but they are particularly accessed by students during assessment periods. Students respond positively to the ability to access lectures digitally as a means of reinforcing learning experiences rather than replacing them. Feedback from students was overwhelmingly positive, indicating that the videos benefited their learning. There are also significant benefits to part-recording of lectures rather than recording whole lectures. The viewing-trend analytics suggest that, despite the increase in the popularity of online learning via MOOCs and the promotion of video learning on mobile devices, in this study the vast majority of students accessed the online videos at home on laptops or desktops. However, this is likely due in part to the nature of the taught subject, programming.
The research involved pre-recording the lecture in smaller timed units and then uploading them for distribution, to counteract existing quality issues with recording entire live lectures. However, advances in, and the consequent improvement in quality of, in situ lecture capture equipment may well negate the need to record elsewhere. The research has also highlighted an area of potentially very significant use for performance analysis and improvement that could have major implications for the quality of teaching: a study of the analytics of video viewings could provide a rapid formative feedback mechanism for the lecturer. If a videoed lecture, whether recorded live or separately, is a true reflection of the face-to-face lecture, an analysis of the viewing patterns for the video may well reveal trends that correspond with the live delivery.
Abstract:
Objective: To evaluate temporal changes in GCF levels of substance P, cathepsin G, interleukin 1 beta (IL-1β), neutrophil elastase and alpha1-antitrypsin (α1AT) during development of and recovery from experimental gingivitis. Methods: Healthy human volunteers participated in a split-mouth study: experimental gingivitis was induced using a soft vinyl splint to cover test teeth during brushing over 21 days, after which normal brushing was resumed. Modified gingival index (MGI), gingival bleeding index (BI) and modified Quigley and Hein plaque index (PI) were assessed and 30-second GCF samples taken from 4 paired test and contra-lateral control sites in each subject at days 0, 7, 14, 21, 28 and 42. GCF volume was measured and site-specific quantification of one analyte per GCF sample was performed using radioimmunoassay (substance P), enzyme assay (cathepsin G) or ELISA (IL-1β, elastase, α1AT). Site-specific data were analysed using analysis of repeated measurements and paired sample tests. Results: 56 subjects completed the study. All measurements at baseline (day 0) and at control sites throughout the study were low. Clinical indices and GCF volumes at the test sites increased from day 0, peaking at day 21 (difference between test and control for PI, BI, MGI and GCF all p<0.0001) and decreased again to control levels by day 28. Levels of four inflammatory markers showed a similar pattern, with significant differences between test and control apparent at 7 days (substance P p=0.0015; cathepsin G p=0.029; IL-1β p=0.026; elastase p=0.0129) and peaking at day 21 (substance P p=0.0023; cathepsin G, IL-1β and elastase all p<0.0001). Levels of α1AT showed no apparent pattern over the course of the study. Conclusion: GCF levels of substance P, cathepsin G, IL-1β and neutrophil elastase have the potential to act as early markers of experimentally-induced gingival inflammation.
Abstract:
Practical knowledge has the characteristics of a process with peculiar idiosyncrasies that require breaking with preconceived ideas, and engaging in dialogue, negotiation and joint action. The underlying knowledge remains unclear despite being what informs decision making. It is academia's responsibility to unveil and name this knowledge, which is why we conducted two studies with clinical nurses. The aim was to understand the social representation that nurses make of their knowledge about nursing and to analyse their clinical practices. In one of the studies, based on the theoretical-methodological framework of social representations, we used the technique of free association of words with the stimulus "knowledge in nursing". In the other study, developed in a naturalistic context under the Grounded Theory framework, we used non-participant observation and explanatory interviews. From the first study we identified the structure of the social representations of knowledge in nursing, from which emerged a central core constituted by four elements (Investigation, Wisdom, Helping Relation, Competence) and a second periphery with one element (Reflection). From the second study we identified that decisions are made within a dynamic, systematic and continuous process of diagnostic evaluation and clinical intervention using various types of knowledge (e.g. clinical, experiential, scientific, personal). We concluded that the various types of knowledge in nursing, represented by the expressions mentioned above, are systematically and creatively mobilised within the dynamic process of diagnostic evaluation and clinical intervention. It is therefore important to unveil and name the different kinds of knowledge implicit in clinical practice, and academia should be responsible for that task.
Resumo:
Background Complex medication regimens may adversely affect compliance and treatment outcomes. Complexity can be assessed with the medication regimen complexity index (MRCI), which has proved to be a valid and reliable tool with potential uses in both practice and research. Objective To use the MRCI to assess medication regimen complexity in institutionalized elderly people. Setting Five nursing homes in mainland Portugal. Methods A descriptive, cross-sectional study of institutionalized elderly people (n = 415) was performed from March to June 2009, including all inpatients aged 65 and over taking at least one medication per day. Main outcome measure Medication regimen complexity index. Results The mean age of the sample was 83.9 years (±6.6 years), and 60.2 % were women. The elderly patients were taking a large number of drugs, with 76.6 % taking more than five medications per day. The average medication regimen complexity was 18.2 (SD ± 9.6) and was higher in women (p < 0.001). The most decisive factors contributing to complexity were the number of drugs and the dosing frequency. In regimens with the same number of medications, the dosing schedule was the most relevant factor in the final score (r = 0.922), followed by pharmaceutical forms (r = 0.768) and additional instructions (r = 0.742). Conclusion Medication regimen complexity proved to be high. There is clearly potential for pharmacist intervention to reduce it as part of the routine medication review in all patients.
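The abstract above identifies three drivers of regimen complexity: pharmaceutical form, dosing frequency, and additional instructions. A minimal sketch of how such a score can be accumulated per medication is shown below; the weights and dictionary keys are illustrative assumptions, not the published MRCI weighting table.

```python
# Hypothetical, simplified regimen-complexity scorer, loosely inspired by the
# MRCI's three sections (dosage form, dosing frequency, additional directions).
# All weights here are made-up illustrations, NOT the validated MRCI values.
FORM_WEIGHTS = {"tablet": 1.0, "drops": 3.0, "injection": 4.0}
FREQ_WEIGHTS = {"once daily": 1.0, "twice daily": 2.0, "three times daily": 3.0}

def regimen_complexity(regimen):
    """Sum a weight for each medication's form, frequency and extra directions."""
    score = 0.0
    for med in regimen:
        score += FORM_WEIGHTS.get(med["form"], 1.0)       # unknown forms default to 1
        score += FREQ_WEIGHTS.get(med["frequency"], 1.0)  # unknown schedules default to 1
        score += len(med.get("instructions", []))         # +1 per additional direction
    return score

regimen = [
    {"form": "tablet", "frequency": "twice daily", "instructions": ["take with food"]},
    {"form": "drops", "frequency": "three times daily"},
]
# tablet: 1 + 2 + 1 = 4; drops: 3 + 3 + 0 = 6  -> total 10.0
```

Because the score is a per-medication sum, two regimens with the same drug count can still diverge widely in complexity, which is consistent with the abstract's finding that schedule and form drive the final score once the number of drugs is held fixed.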
Resumo:
Increasing human activity has been responsible for profound changes in, and continuous degradation of, the soil compartment across the European territory. European policies are emerging that focus on soil protection and the management of contaminated sites, in order to recover land for other uses. To regulate the risk assessment and management of contaminated soils, many European member states have adopted soil guideline values, such as soil screening values (SSVs). These values are particularly useful in the first tier of Ecological Risk Assessment (ERA) processes for contaminated sites, especially as a first screening of sites requiring a more site-specific evaluation. Hence, the appropriate definition of regional SSVs will have relevant economic impacts on the management of contaminated sites. Portugal is one of the European Member States that still lack such soil guideline values. In this context, this study made a remarkable contribution by generating ecotoxicological data on soil microbiological parameters, terrestrial plants and invertebrates for the derivation of SSVs for uranium (U), cadmium (Cd) and copper (Cu), using a Portuguese natural soil representative of a dominant soil type in the Portuguese territory. SSVs were derived using two methods proposed by the Technical Guidance Document for Risk Assessment of the European Commission: the assessment factor (AF) method and the species sensitivity distribution (SSD) method (with some adaptations). The outputs of both methods were compared and discussed. Further, this study laid the foundation for a deeper reflection on the cut-off (hazardous concentration for a given percentage of species - HCp) to be estimated from the SSDs and selected for the derivation of SSVs with an adequate level of protection. It was shown that this selection may vary between contaminants; however, a clear justification should be given in each case.
The SSVs proposed in this study were: U, 151.4 mg U kg-1 dw; Cd, 5.6 mg Cd kg-1 dw; and Cu, 58.5 mg Cu kg-1 dw. These values should now be tested for their discriminating power on soils with different levels of contamination. Nevertheless, this study clarifies the approach to be followed for the derivation of SSVs for other metals and organic contaminants, and for other dominant types of Portuguese natural soils.
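The two derivation methods named in the abstract can be sketched compactly: the AF method divides the lowest available toxicity endpoint by an assessment factor, while the SSD method fits a distribution to the species endpoints and reads off the hazardous concentration HCp (commonly HC5). The sketch below assumes a log-normal SSD, which is a common choice; the endpoint values are invented for illustration and are not data from this study.

```python
import math
from statistics import NormalDist, mean, stdev

def ssv_af(toxicity_mg_kg, assessment_factor):
    """AF method: lowest toxicity endpoint divided by an assessment factor."""
    return min(toxicity_mg_kg) / assessment_factor

def hc_p(toxicity_mg_kg, p=0.05):
    """SSD method with a log-normal fit: the hazardous concentration HCp,
    i.e. the concentration expected to affect a fraction p of species."""
    logs = [math.log10(v) for v in toxicity_mg_kg]
    mu, sigma = mean(logs), stdev(logs)
    z = NormalDist().inv_cdf(p)          # p-th percentile of the standard normal
    return 10 ** (mu + z * sigma)

# Hypothetical endpoints (e.g. EC50s in mg kg-1 dw) for six species:
endpoints = [10.0, 25.0, 40.0, 80.0, 120.0, 200.0]
hc5 = hc_p(endpoints, p=0.05)   # roughly 8.5 with these made-up data
ssv = ssv_af(endpoints, 10)     # 10 / 10 = 1.0
```

The gap between the two outputs on the same data illustrates why the abstract argues that the choice of cut-off (which HCp, or which assessment factor) must be justified per contaminant: it directly sets how protective the resulting SSV is.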