358 results for Child Adjustment
Abstract:
Most research in the field of autism focuses on the medical and psychological characteristics of the disability. Research that focuses on caregiving emphasizes the stresses and pathological features associated with having a child with autism. As such, the more positive aspects of caregiving have been left in abeyance, portraying caregiving and autism as characterized by only negative experiences, prognoses, and outcomes. Based on mothers’ reflections, this article reports on some of the positives of caregiving. The findings provide a glimpse into a seldom studied side of caregiving—events and experiences appraised by mothers in a positive and sometimes joyous light—and the impact they have on mothers’ experiences. Furthermore, practical implications for social service professionals and families are discussed.
Abstract:
Based on models with calibrated parameters for infection, case fatality rates, and vaccine efficacy, basic childhood vaccinations have been estimated to be highly cost effective. We estimate the association of vaccination with mortality directly from survey data. Using 149 cross-sectional Demographic and Health Surveys, we determine the relationship between vaccination coverage and under-five mortality at the survey cluster level. Our data include approximately one million children in 68,490 clusters in 62 countries. We consider the childhood measles, Bacille Calmette-Guérin (BCG), Diphtheria-Pertussis-Tetanus (DPT), Polio, and maternal tetanus vaccinations. Using modified Poisson regression to estimate the relative risk of child mortality in each cluster, we also adjust for selection bias caused by the vaccination status of dead children not being reported. Childhood vaccination, and in particular measles and tetanus vaccination, is associated with substantial reductions in childhood mortality. We estimate that children in clusters with complete vaccination coverage have a relative risk of mortality of 0.73 (95% confidence interval: 0.68, 0.77) compared with children in clusters with no vaccination. While widely used, basic vaccines still have coverage rates well below 100% in many countries, and our results emphasize the effectiveness of increasing their coverage rates to reduce child mortality.
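The cluster-level relative-risk summary above can be illustrated with a toy calculation. The counts below are invented for illustration only; the paper's estimate comes from modified Poisson regression over 68,490 clusters, with an adjustment for the unreported vaccination status of dead children, neither of which this sketch reproduces.

```python
import math

# Hypothetical counts (not the paper's data): under-five deaths and
# children in fully vaccinated vs zero-coverage clusters.
deaths_full, n_full = 73, 1000    # clusters with complete coverage
deaths_none, n_none = 100, 1000   # clusters with no vaccination

# Relative risk of mortality, full coverage vs no coverage.
rr = (deaths_full / n_full) / (deaths_none / n_none)

# Wald 95% CI on the log scale (standard large-sample approximation).
se_log_rr = math.sqrt(1 / deaths_full - 1 / n_full
                      + 1 / deaths_none - 1 / n_none)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)

print(f"RR = {rr:.2f} (95% CI: {lo:.2f}, {hi:.2f})")
```

Note that this ignores cluster-level covariates and the survey design; in practice the log-binomial/modified-Poisson model recovers the same relative-risk quantity while allowing such adjustment.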
Abstract:
Attempts to record, understand and respond to variations in child welfare and protection reporting, service patterns and outcomes are international, numerous and longstanding. Reframing such variations as an issue of inequity between children and between families opens the way to a new approach to explaining the profound difference in intervention rates between and within countries and administrative districts. Recent accounts of variation have frequently been based on the idea that there is a binary division between bias and risk (or need). Here we propose seeing supply (bias) and demand (risk) factors as two aspects of a single system, both framed, in part, by social structures. A recent finding from a study of intervention rates in England, the 'inverse intervention law', is used to illustrate the complex ways in which a range of factors interact to produce intervention rates. In turn, this analysis raises profound moral, policy, practice and research questions about current child welfare and child protection services.
Abstract:
This report outlines the findings from a research project examining what works well in investigative interviews (ABE interviews) with child witnesses in Northern Ireland. The project was developed in collaboration with key stakeholders and was jointly funded by the Department of Justice NI, NSPCC, SBNI and PSNI. While there is a substantial research literature examining the practice of forensic interviewing, both internationally and within the UK, there has been little exploration of this issue in Northern Ireland. Equally, the existing literature has tended to take a ‘deficit’ approach, identifying areas of poor practice with limited recognition of the practical difficulties interview practitioners face or of what works well for them in practice. This study aimed to address these gaps by adopting an ‘appreciative inquiry’ approach to explore stakeholder perspectives on what is working well within current ABE practice and to identify what can be built on to deliver optimal practice.
Abstract:
Static timing analysis provides the basis for setting the clock period of a microprocessor core, based on its worst-case critical path. However, depending on the design, this critical path is not always excited and therefore dynamic timing margins exist that can theoretically be exploited for the benefit of better speed or lower power consumption (through voltage scaling). This paper introduces predictive instruction-based dynamic clock adjustment as a technique to trim dynamic timing margins in pipelined microprocessors. To this end, we exploit the different timing requirements for individual instructions during the dynamically varying program execution flow without the need for complex circuit-level measures to detect and correct timing violations. We provide a design flow to extract the dynamic timing information for the design using post-layout dynamic timing analysis and we integrate the results into a custom cycle-accurate simulator. This simulator allows annotation of individual instructions with their impact on timing (in each pipeline stage) and rapidly derives the overall code execution time for complex benchmarks. The design methodology is illustrated at the microarchitecture level, demonstrating the performance and power gains possible on a 6-stage OpenRISC in-order general purpose processor core in a 28nm CMOS technology. We show that employing instruction-dependent dynamic clock adjustment leads on average to an increase in operating speed by 38% or to a reduction in power consumption by 24%, compared to traditional synchronous clocking, which at all times has to respect the worst-case timing identified through static timing analysis.
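As a rough illustration of the core idea (with invented per-instruction delays, not the paper's post-layout 28 nm timing data): a static clock must always run at the worst-case critical-path delay, whereas an instruction-annotated simulator can sum per-instruction cycle times and expose the trimmed margin.

```python
# Hypothetical worst-case path delays (ns) per instruction class.
# These numbers are made up for illustration; in the paper they come
# from post-layout dynamic timing analysis of each pipeline stage.
CRITICAL_DELAY_NS = {"load": 2.0, "store": 1.9, "alu": 1.4, "branch": 1.5}

def execution_time_ns(trace, dynamic=True):
    """Total execution time for a trace of instruction classes."""
    if dynamic:
        # Instruction-dependent clock: each cycle is only as long as
        # the current instruction's own worst-case delay.
        return sum(CRITICAL_DELAY_NS[op] for op in trace)
    # Static clock: every cycle runs at the global worst-case period.
    static_period = max(CRITICAL_DELAY_NS.values())
    return static_period * len(trace)

# A toy instruction mix: mostly ALU ops, some memory and branches.
trace = ["alu"] * 60 + ["load"] * 20 + ["branch"] * 15 + ["store"] * 5
t_static = execution_time_ns(trace, dynamic=False)
t_dynamic = execution_time_ns(trace)
print(f"speedup over static clocking: {t_static / t_dynamic:.2f}x")
```

This single-cycle sketch ignores pipelining: in a real in-order pipeline the usable period in a given cycle is bounded by all instructions currently in flight across the stages, which is exactly what the paper's cycle-accurate simulator annotates per stage.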
Abstract:
The review aimed to investigate two central issues.
1. To what extent is there evidence that poverty increases the amount of child abuse and neglect (CAN), and/or affects its nature? How does this occur, how large are these effects and to whom do they apply?
2. To what extent is there evidence that CAN increases poverty later in life, how large are these effects and to whom do they apply?
Within these two issues, evidence about equality and diversity, and about cost, was considered throughout.
Abstract:
AIM: To investigate the safety and potential savings of decreasing medication use in low-risk patients with ocular hypertension (OH).
METHODS: Patients with OH receiving pressure-lowering medication, identified by medical record review at a university hospital, underwent examination by a glaucoma specialist with assessment of visual field (VF), vertical cup-to-disc ratio (vCDR), central corneal thickness and intraocular pressure (IOP). Subjects with an estimated 5-year risk of glaucoma conversion <15% were asked to discontinue ≥1 medication; IOP was remeasured 1 month later and risk was re-evaluated at 1 year.
RESULTS: Among 212 eyes of 126 patients, 44 (20.8%) had 5-year risk >15% and 14 (6.6%) had unreliable baseline VF. At 1 month, 15 patients (29 eyes, 13.7%) defaulted follow-up or refused to discontinue medication and 11 eyes (5.2%) had risk >15%. The remaining 69 patients (107 eyes, 50.7%) successfully discontinued 141 medications and completed 1-year follow-up. Mean IOP (20.5±2.65 mm Hg vs 20.3±3.40 mm Hg, p=0.397) did not change, though mean VF pattern SD (1.58±0.41 dB vs 1.75±0.56 dB, p=0.001) and glaucoma conversion risk (7.31±3.74% vs 8.76±6.28%, p=0.001) increased at 1 year. Mean defect decreased (-1.42±1.60 dB vs -1.07±1.52 dB, p=0.022). One eye (0.47%) developed a repeatable VF defect and 13 eyes (6.1%) had 5-year risk >15% at 1 year. The total 1-year cost of medications saved was US$4596.
CONCLUSIONS: Nearly half (43.9%) of low-risk OH eyes in this setting could safely reduce medications over 1 year, realising substantial savings.
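As a quick sanity check of the cohort arithmetic (my own calculation, assuming each reported percentage is taken over the 212-eye baseline, which the abstract does not always restate):

```python
# Reported eye counts and percentages from the abstract, checked
# against the 212-eye baseline cohort.
TOTAL_EYES = 212

reported = {           # eyes: reported % of the 212-eye cohort
    44: 20.8,          # 5-year risk >15% at baseline
    14: 6.6,           # unreliable baseline visual field
    29: 13.7,          # defaulted follow-up / refused to discontinue
    11: 5.2,           # risk >15% at 1 month
    13: 6.1,           # 5-year risk >15% at 1 year
}

for eyes, pct in reported.items():
    assert round(100 * eyes / TOTAL_EYES, 1) == pct, (eyes, pct)
print("reported eye percentages are consistent with n = 212")
```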
Abstract:
PURPOSE: To quantify the association between siblings in age-related nuclear cataract, after adjusting for known environmental and personal risk factors. METHODS: All participants (probands) in the Salisbury Eye Evaluation (SEE) project and their locally resident siblings underwent digital slit lamp photography and were administered a questionnaire to assess risk factors for cataract including: age, gender, lifetime sun exposure, smoking and diabetes history, and use of alcohol and medications such as estrogens and steroids. In addition, blood pressure, body mass index, and serum antioxidants were measured in all participants. Lens photographs were graded by trained observers masked to the subjects' identity, using the Wilmer Cataract Grading System. The odds ratio for siblings for affectedness with nuclear cataract and the sibling correlation of nuclear cataract grade, after adjusting for covariates, were estimated with generalized estimating equations. RESULTS: Among 307 probands (mean age, 77.6±4.5 years) and 434 full siblings (mean age, 72.4±7.4 years), the average sibship size was 2.7 per family. After adjustment for covariates, the probability of development of nuclear cataract was significantly increased (odds ratio [OR] = 2.07, 95% confidence interval [CI], 1.30-3.30) among individuals with a sibling with nuclear cataract (nuclear grade ≥3.0). The final fitted model indicated a magnitude of heritability for nuclear cataract of 35.6% (95% CI: 21.0%-50.3%) after adjustment for the covariates. CONCLUSIONS: Findings in this study are consistent with a genetic effect for age-related nuclear cataract, a common and clinically significant form of lens opacity.
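The unadjusted version of the odds-ratio summary can be sketched as follows. The 2×2 counts are hypothetical; the paper's OR of 2.07 is covariate-adjusted and estimated with generalized estimating equations to account for sibling clustering, which this sketch does not reproduce.

```python
import math

# Hypothetical 2x2 counts (not the SEE data): sibling affectedness
# with nuclear cataract (nuclear grade >= 3.0).
a, b = 30, 20   # proband affected:   sibling affected / unaffected
c, d = 25, 40   # proband unaffected: sibling affected / unaffected

# Unadjusted odds ratio with a Woolf-type 95% CI on the log scale.
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f} (95% CI: {lo:.2f}, {hi:.2f})")
```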