902 results for Modelling Health Benefit
Abstract:
This study investigated the empirical differentiation of prospective memory, executive functions, and metacognition and their structural relationships in 119 elementary school children (M = 95 months, SD = 4.8 months). These cognitive abilities share many characteristics on the theoretical level and are all highly relevant in many everyday contexts when intentions must be executed. Nevertheless, their empirical relationships have not been examined on the latent level, although an empirical approach would contribute to our knowledge concerning the differentiation of cognitive abilities during childhood. We administered a computerized event-based prospective memory task, three executive function tasks (updating, inhibition, shifting), and a metacognitive control task in the context of spelling. Confirmatory factor analysis revealed that the three cognitive abilities are already empirically differentiable in young elementary school children. At the same time, prospective memory and executive functions were found to be strongly related, and there was also a close link between prospective memory and metacognitive control. Furthermore, executive functions and metacognitive control were marginally significantly related. The findings are discussed within a framework of developmental differentiation and conceptual similarities and differences.
Abstract:
An estimated 6051 tons of active substances went into the production of veterinary pharmaceuticals (VPs) for the treatment of food animals in the European Union (EU) in 2004, including 5393 tons of antibiotics and 194 tons of antiparasitics (1). With global meat production projected to increase (2) and the growing market for companion animal pharmaceuticals (3), the use of VPs will continue to increase. Although VPs may benefit the health and welfare of domestic animals and the efficiency of food animal production, they can contaminate the environment through manufacturing, treatment of animals, and disposal of carcasses, offal, urine, feces, and unused products (4) (see the chart). This contamination is a threat to nontarget species, including humans. With Spain having recently authorized marketing of a VP that was banned in South Asia in the past decade in light of environmental impacts, we recommend strengthening of current procedures and addition of a more proactive, holistic, One Health approach applicable to all VPs.
Abstract:
INTRODUCTION We apply capital interplay theory to health inequalities in Switzerland by investigating the interconnected effects of parental cultural, economic and social capitals and personal educational stream on the self-rated health of young Swiss men who live with their parents. METHODS We apply logistic regression modelling to self-rated health in original cross-sectional survey data collected during mandatory conscription of Swiss male citizens in 2010 and 2011 (n = 23,975). RESULTS In comparison with sons whose parents completed mandatory schooling only, sons with parents who completed technical college or university were significantly more likely to report very good or excellent self-rated health. Parental economic capital was an important mediating factor in this regard. Number of books in the home (parental cultural capital), family economic circumstances (parental economic capital) and parental ties to influential people (parental social capital) were also independently associated with the self-rated health of the sons. Although sons in the highest educational stream tended to report better health than those in the lowest, we found little evidence for a health-producing intergenerational transmission of capitals via the education stream of the sons. Finally, the positive association between personal education and self-rated health was stronger among sons with relatively poorly educated parents and stronger among sons with parents who were relatively low in social capital. CONCLUSIONS Our study provides empirical support for the role of capital interplays, social processes in which capitals interpenetrate or co-constitute one another, in the intergenerational production of the health of young men in Switzerland.
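As a rough illustration of the kind of logistic regression modelling described above, the sketch below fits a logit model of a binary self-rated-health indicator on simulated capital proxies. All variable names, codings and the simulated data are hypothetical, not the conscription survey's actual measures.

```python
# Minimal sketch of a logistic regression on self-rated health, in the
# spirit of the analysis described above. Variable names, codings and the
# simulated data are hypothetical, not the survey's actual measures.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "books_at_home": rng.integers(0, 4, n),     # proxy for parental cultural capital
    "family_finances": rng.integers(0, 4, n),   # proxy for parental economic capital
    "parental_ties": rng.integers(0, 2, n),     # proxy for parental social capital
    "education_stream": rng.integers(0, 3, n),  # son's own educational stream
})
# Simulate a binary outcome: 1 = very good/excellent self-rated health
logit = (-0.5 + 0.3 * df["books_at_home"] + 0.4 * df["family_finances"]
         + 0.2 * df["parental_ties"] + 0.3 * df["education_stream"])
df["good_health"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["books_at_home", "family_finances",
                        "parental_ties", "education_stream"]])
model = sm.Logit(df["good_health"], X).fit(disp=False)
print(model.summary())
print(np.exp(model.params))  # odds ratios
```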
Abstract:
OBJECTIVES Clinical benefit response (CBR), based on changes in pain, Karnofsky performance status, and weight, is an established palliative endpoint in trials for advanced gastrointestinal cancer. We investigated whether CBR is associated with survival, and whether CBR reflects a wide-enough range of domains to adequately capture patients' perception. METHODS CBR was prospectively evaluated in an international phase III chemotherapy trial in patients with advanced pancreatic cancer (n = 311) in parallel with patient-reported outcomes (PROs). RESULTS The median time to treatment failure was 3.4 months (range: 0-6). The majority of the CBRs (n = 39) were noted in patients who received chemotherapy for at least 5 months. Patients with CBR (n = 62) had longer survival than non-responders (n = 182) (hazard ratio = 0.69; 95% confidence interval: 0.51-0.94; p = 0.013). CBR was predicted with a sensitivity and specificity of 77-80% by various combinations of 3 mainly physical PROs. A comparison between the duration of CBR (n = 62, median = 8 months, range = 4-31) and clinically meaningful improvements in the PROs (n = 100-116; medians = 9-11 months, range = 4-24) showed similar intervals. CONCLUSION CBR is associated with survival and mainly reflects physical domains. Within phase III chemotherapy trials for advanced gastrointestinal cancer, CBR can be replaced by a PRO evaluation, without losing substantial information but gaining complementary information.
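For readers unfamiliar with the reported 77-80% figures, the short sketch below shows how the sensitivity and specificity of a PRO-based predictor of CBR are obtained from a confusion matrix. The data, composite score and cut-off are synthetic assumptions, not the trial's PRO instruments.

```python
# Sketch: sensitivity/specificity of a PRO-based classifier for clinical
# benefit response (CBR). Data, score and threshold are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n = 244                                   # responders + non-responders
cbr = rng.binomial(1, 62 / 244, n)        # 1 = clinical benefit response
# Hypothetical composite PRO score, higher on average in responders
pro_score = rng.normal(loc=50 + 15 * cbr, scale=10)

predicted = (pro_score > 57).astype(int)  # illustrative cut-off
tp = np.sum((predicted == 1) & (cbr == 1))
fn = np.sum((predicted == 0) & (cbr == 1))
tn = np.sum((predicted == 0) & (cbr == 0))
fp = np.sum((predicted == 1) & (cbr == 0))
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```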
Abstract:
An autonomous energy source within the human body is of key importance for the development of medical implants. This work deals with the modelling and validation of an energy harvesting device that converts myocardial contractions into electrical energy. The mechanism consists of a clockwork from a commercially available wrist watch. We developed a physical model that is able to predict the total amount of energy generated when an external excitation is applied. For the validation of the model, a custom-made hexapod robot was used to accelerate the harvesting device along a given trajectory. We applied forward kinematics to determine the actual motion experienced by the harvesting device. The reconstructed motion provides both translational and rotational information for accurate simulations in three-dimensional space. The physical model was successfully validated.
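The clockwork mechanism itself is not specified here, so the sketch below uses a generic base-excited mass-spring-damper harvester, a common simplified stand-in, to show how harvested energy can be estimated from an imposed excitation. All parameter values are assumptions, not those of the device described above.

```python
# Sketch: energy extracted by a generic base-excited mass-spring-damper
# harvester. This is NOT the clockwork mechanism modelled in the paper,
# only a simplified stand-in; all parameter values are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

m, c, k = 5e-3, 0.05, 200.0           # proof mass [kg], damping [Ns/m], stiffness [N/m]
f_heart = 1.2                          # excitation frequency, ~72 bpm
a_base = lambda t: 2.0 * np.sin(2 * np.pi * f_heart * t)  # base acceleration [m/s^2]

def rhs(t, y):
    z, v = y                           # displacement and velocity relative to the base
    return [v, (-c * v - k * z) / m - a_base(t)]

t_end = 10.0
sol = solve_ivp(rhs, (0.0, t_end), [0.0, 0.0], max_step=1e-3)
power = c * sol.y[1] ** 2              # power dissipated in the damper [W]
energy = np.trapz(power, sol.t)        # total harvested energy over t_end [J]
print(f"harvested energy over {t_end:.0f} s: {energy * 1e6:.1f} microjoules")
```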
Abstract:
BACKGROUND AND PURPOSE Lesion volume on diffusion-weighted magnetic resonance imaging (DWI) before acute stroke therapy is a predictor of outcome. Therefore, patients with large volumes are often excluded from therapy. The aim of this study was to analyze the impact of endovascular treatment in patients with large DWI lesion volumes (>70 mL). METHODS Three hundred seventy-two patients with middle cerebral or internal carotid artery occlusions who underwent magnetic resonance imaging before treatment since 2004 were included. Baseline data and 3-month outcome were recorded prospectively. DWI lesion volumes were measured semiautomatically. RESULTS One hundred five patients had lesions >70 mL. Overall, DWI lesion volume was an independent predictor of unfavorable outcome, survival, and symptomatic intracerebral hemorrhage (P<0.001 each). In patients with DWI lesions >70 mL, 11 of 31 (35.5%) reached favorable outcome (modified Rankin Scale score, 0-2) after thrombolysis in cerebral infarction 2b-3 reperfusion, in contrast to 3 of 35 (8.6%) after thrombolysis in cerebral infarction 0-2a reperfusion (P=0.014). Reperfusion success, patient age, and DWI lesion volume were independent predictors of outcome in patients with DWI lesions >70 mL. Thirteen of 66 (19.7%) patients with lesions >70 mL had symptomatic intracerebral hemorrhage, with a trend toward reduced risk when thrombolytic agents were avoided. CONCLUSIONS The risk of poor outcome and symptomatic intracerebral hemorrhage increased with increasing pretreatment DWI lesion volume. Nevertheless, favorable outcome was achieved in every third patient with DWI lesions >70 mL after successful endovascular reperfusion, whereas after poor or failed reperfusion, outcome was favorable in only every 12th patient. Therefore, endovascular treatment might be considered in patients with large DWI lesions, especially in younger patients.
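As a small aside on the volumetric endpoint, the sketch below shows how a DWI lesion volume in mL follows from a segmented binary mask and the voxel dimensions. The mask here is random and the voxel size is an assumption; the study's semiautomatic segmentation is not reproduced.

```python
# Sketch: deriving a DWI lesion volume (mL) from a binary lesion mask.
# The mask is random; real masks come from (semi-)automatic segmentation
# of the DWI images, and the voxel size is an assumption.
import numpy as np

voxel_dims_mm = (1.0, 1.0, 5.0)                    # in-plane resolution x slice thickness
voxel_volume_ml = np.prod(voxel_dims_mm) / 1000.0  # mm^3 -> mL

rng = np.random.default_rng(2)
lesion_mask = rng.random((192, 192, 25)) > 0.985   # stand-in for a segmented lesion

lesion_volume_ml = lesion_mask.sum() * voxel_volume_ml
print(f"lesion volume: {lesion_volume_ml:.1f} mL")
print("large lesion (>70 mL):", lesion_volume_ml > 70)
```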
Abstract:
Background: The Swiss pig population enjoys a favourable health situation. To further promote this, the Pig Health Service (PHS) conducts a surveillance program in affiliated herds: closed multiplier herds with the highest PHS health and hygiene status have to be free from swine dysentery and progressive atrophic rhinitis and are clinically examined four times a year, including laboratory testing. In addition, four batches of pigs per year are fattened together with pigs from other herds and checked for typical symptoms (monitored fattening groups (MF)). Although the program is costly and laborious, little was known about how effectively it detects an infection in a herd. Therefore, the sensitivity of the surveillance for progressive atrophic rhinitis and swine dysentery at herd level was assessed using scenario tree modelling, a method well established at national level. Furthermore, its costs and the time until an infection would be detected were estimated, with the final aim of yielding suggestions on how to optimize surveillance. Results: For swine dysentery, the median annual surveillance sensitivity was 96.7 %, the mean time to detection 4.4 months, and the total annual costs 1022.20 Euro/herd. The median component sensitivity of active sampling was between 62.5 and 77.0 %, that of a MF between 7.2 and 12.7 %. For progressive atrophic rhinitis, the median surveillance sensitivity was 99.4 %, the mean time to detection 3.1 months and the total annual costs 842.20 Euro. The median component sensitivity of active sampling was 81.7 %, that of a MF between 19.4 and 38.6 %. Conclusions: Results indicate that the total sensitivity for both diseases is high, while the time to detection could be a risk in herds with frequent pig trade. Of all components, active sampling contributed most to the surveillance sensitivity, whereas the contribution of MF was very low. To increase efficiency, active sampling should be intensified (more animals sampled) and MF abandoned. This would significantly improve sensitivity and time to detection at comparable or lower costs. The method of scenario tree modelling proved useful for assessing the efficiency of surveillance at herd level. Its versatility allows adjustment to all kinds of surveillance scenarios to optimize sensitivity, time to detection and/or costs.
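In the spirit of the scenario tree approach, the sketch below combines per-component sensitivities into an annual surveillance sensitivity and a rough expected time to detection. The component values are illustrative figures within the ranges reported above, and independence between components is an assumption.

```python
# Sketch: combining per-component sensitivities into an annual surveillance
# sensitivity and a rough expected time to detection, in the spirit of
# scenario tree modelling. Component values are illustrative figures within
# the reported ranges; independence between components is assumed.
import numpy as np

# Swine dysentery example: 4 active sampling rounds and 4 monitored
# fattening groups (MF) per year.
se_active = np.array([0.70, 0.70, 0.70, 0.70])  # per-round active sampling sensitivity
se_mf     = np.array([0.10, 0.10, 0.10, 0.10])  # per-group MF sensitivity

components = np.concatenate([se_active, se_mf])
annual_sse = 1 - np.prod(1 - components)         # P(detect within one year)
print(f"annual surveillance sensitivity: {annual_sse:.3f}")

# Rough time to detection: one active sampling round and one MF per quarter,
# detection assumed at the end of the quarter in which it occurs.
p_quarter = 1 - (1 - 0.70) * (1 - 0.10)
mean_quarters = 1 / p_quarter                    # mean of a geometric distribution
print(f"mean time to detection: {mean_quarters * 3:.1f} months")
```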
Abstract:
Conclusion Using a second bone anchored hearing implant (BAHI) mounted on a testband in unilaterally implanted BAHI users to test its potential advantage pre-operatively underestimates the advantage of two BAHIs placed on two implants. Objectives To investigate how well speech understanding with a second BAHI mounted on a testband approaches the benefit of bilaterally implanted BAHIs. Method Prospective study with 16 BAHI users. Eight were implanted unilaterally (group A) and eight were implanted bilaterally (group B). Aided speech understanding was measured. Speech was presented from the front and noise came either from the left, the right, or the front, in two conditions for group A (with one BAHI, and with two BAHIs, where the second device was mounted on a testband) and in three conditions for group B (the same two conditions as group A and, in addition, with both BAHIs mounted on implants). Results Speech understanding in noise improved with the additional device for noise from the side of the first BAHI (+0.7 to +2.1 dB) and decreased for noise from the other side (-1.8 dB to -3.9 dB). Improvements were largest (+2.1 dB, p = 0.016) and disadvantages smallest (-1.8 dB, p = 0.047) with both BAHIs mounted on implants. Testbands yielded smaller advantages and larger disadvantages for the additional BAHI (average difference = -0.9 dB).
Abstract:
The head impulse test (HIT) can identify a deficient vestibulo-ocular reflex (VOR) by the compensatory saccade (CS) generated once the head stops moving. The inward HIT is considered safer than the outward HIT, yet it might have an oculomotor advantage given that the subject would presumably know the direction of head rotation. Here, we compare CS latencies following inward (presumed predictable) and outward (more unpredictable) HITs after acute unilateral vestibular nerve deafferentation. Seven patients received inward and outward HITs on six consecutive postoperative days (POD) and again at POD 30. All head impulses were recorded by portable video-oculography. CS included those occurring during (covert) or after (overt) head rotation. Inward HITs yielded mean CS latencies (183.48 ms ± 4.47 SE) that were consistently shorter than those generated during outward HITs over the first 6 POD (p = 0.0033). Acutely, inward HITs also induced more covert saccades than outward HITs. However, by POD 30 there were no longer any differences in the latencies or proportions of CS by direction of head rotation. Patients with acute unilateral vestibular loss likely use predictive cues about head direction to elicit early CS that keep the image centered on the fovea. In acute vestibular hypofunction, inwardly applied HITs may therefore risk a preponderance of covert saccades, yet this difference largely disappears within 30 days. Advantages of inwardly applied HITs are discussed and must be balanced against the risk of a false-negative HIT interpretation.
Abstract:
This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems and is increasingly being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics which cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific modelling (incorporating data unique to the individual) and multi-scale modelling (combining models of different length- and time-scales) enable individualised risk prediction and virtual treatment planning. This represents a significant departure from the traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While CFD modelling is potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges.
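Of the metrics mentioned, wall shear stress has a simple closed form for idealised steady Poiseuille flow in a straight cylindrical vessel, which is often used as a sanity check for CFD results. The sketch below evaluates it for assumed, not patient-specific, parameters.

```python
# Sketch: wall shear stress for idealised steady Poiseuille flow in a
# straight cylindrical vessel -- a common sanity check for CFD results,
# not a patient-specific simulation. Parameter values are assumptions.
import math

mu = 3.5e-3          # dynamic viscosity of blood [Pa.s]
flow_ml_min = 250.0  # volumetric flow rate of a medium-sized artery [mL/min]
radius_mm = 1.5      # vessel radius [mm]

Q = flow_ml_min * 1e-6 / 60.0                # -> m^3/s
R = radius_mm * 1e-3                         # -> m

tau_wall = 4 * mu * Q / (math.pi * R ** 3)   # Poiseuille wall shear stress [Pa]
print(f"wall shear stress: {tau_wall:.2f} Pa")
```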
Abstract:
BACKGROUND AND AIMS Hepatitis C (HCV) is a leading cause of morbidity and mortality in people who live with HIV. In many countries, access to direct acting antiviral agents to treat HCV is restricted to individuals with advanced liver disease (METAVIR stage F3 or F4). Our goal was to estimate the long term impact of deferring HCV treatment for men who have sex with men (MSM) who are coinfected with HIV and often have multiple risk factors for liver disease progression. METHODS We developed an individual-based model of liver disease progression in HIV/HCV coinfected men who have sex with men. We estimated liver-related morbidity and mortality as well as the median time spent with replicating HCV infection when individuals were treated in liver fibrosis stages F0, F1, F2, F3 or F4 on the METAVIR scale. RESULTS The percentage of individuals who died of liver-related complications was 2% if treatment was initiated in F0 or F1. It increased to 3% if treatment was deferred until F2, 7% if it was deferred until F3 and 22% if deferred until F4. The median time individuals spent with replicating HCV increased from 5 years if treatment was initiated in F2 to almost 15 years if it was deferred until F4. CONCLUSIONS Deferring HCV therapy until advanced liver fibrosis is established could increase liver-related morbidity and mortality in HIV/HCV coinfected individuals, and substantially prolong the time individuals spend with replicating HCV infection.
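To illustrate the individual-based modelling approach, the toy simulation below moves individuals through METAVIR stages F0-F4 and defers HCV treatment until a chosen stage. The transition, treatment-uptake and mortality probabilities are illustrative assumptions, not the calibrated values used in the study.

```python
# Sketch: a toy individual-based simulation of liver fibrosis progression
# (METAVIR F0-F4) with HCV treatment deferred until a given stage.
# All probabilities are illustrative assumptions, not calibrated values.
import numpy as np

rng = np.random.default_rng(3)

def simulate(n, treat_at_stage, years=40, p_progress=0.08,
             p_start_treatment=0.5, p_death_f4=0.06):
    """Fraction of individuals dying of liver-related causes when treatment
    is deferred until METAVIR stage `treat_at_stage`."""
    deaths = 0
    for _ in range(n):
        stage, treated = 0, False
        for _ in range(years):
            # Treatment can start once the deferral stage is reached.
            if (not treated and stage >= treat_at_stage
                    and rng.random() < p_start_treatment):
                treated = True                 # assume cure halts progression
            # Fibrosis progresses only while HCV is still replicating.
            if not treated and stage < 4 and rng.random() < p_progress:
                stage += 1
            # Liver-related death risk at F4; reduced (not zero) after cure.
            if stage == 4:
                p_death = p_death_f4 / 4 if treated else p_death_f4
                if rng.random() < p_death:
                    deaths += 1
                    break
    return deaths / n

for stage in range(5):
    print(f"treat at F{stage}: liver-related mortality = {simulate(2000, stage):.1%}")
```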
Abstract:
BACKGROUND The number of patients in need of second-line antiretroviral drugs is increasing in sub-Saharan Africa. We aimed to project the need of second-line antiretroviral therapy in adults in sub-Saharan Africa up to 2030. METHODS We developed a simulation model for HIV and applied it to each sub-Saharan African country. We used the WHO country intelligence database to estimate the number of adult patients receiving antiretroviral therapy from 2005 to 2014. We fitted the number of adult patients receiving antiretroviral therapy to observed estimates, and predicted first-line and second-line needs between 2015 and 2030. We present results for sub-Saharan Africa, and eight selected countries. We present 18 scenarios, combining the availability of viral load monitoring, speed of antiretroviral scale-up, and rates of retention and switching to second-line. HIV transmission was not included. FINDINGS Depending on the scenario, 8·7-25·6 million people are expected to receive antiretroviral therapy in 2020, of whom 0·5-3·0 million will be receiving second-line antiretroviral therapy. The proportion of patients on treatment receiving second-line therapy was highest (15·6%) in the scenario with perfect retention and immediate switching, no further scale-up, and universal routine viral load monitoring. In 2030, the estimated range of patients receiving antiretroviral therapy will remain constant, but the number of patients receiving second-line antiretroviral therapy will increase to 0·8-4·6 million (6·6-19·6%). The need for second-line antiretroviral therapy was two to three times higher if routine viral load monitoring was implemented throughout the region, compared with a scenario of no further viral load monitoring scale-up. For each monitoring strategy, the future proportion of patients receiving second-line antiretroviral therapy differed only minimally between countries. INTERPRETATION Donors and countries in sub-Saharan Africa should prepare for a substantial increase in the need for second-line drugs during the next few years as access to viral load monitoring improves. An urgent need exists to decrease the costs of second-line drugs. FUNDING World Health Organization, Swiss National Science Foundation, National Institutes of Health.
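As a deterministic counterpart to the study's simulation model, the sketch below projects first-line and second-line ART numbers year by year under assumed scale-up, retention and switching rates. All rates and baseline figures are assumptions, not the estimates derived from the WHO country intelligence database.

```python
# Sketch: a deterministic year-by-year projection of first-line and
# second-line ART numbers under assumed scale-up, retention and switching
# rates. All rates and baseline figures are illustrative assumptions.
first_line = 10.0e6      # people on first-line ART at baseline
second_line = 0.5e6      # people on second-line ART at baseline

new_starts = 1.2e6       # new first-line initiations per year
retention = 0.95         # annual retention on ART
switch_rate = 0.03       # annual probability of switching to second-line
                         # (higher with routine viral load monitoring)

for year in range(2015, 2031):
    switched = switch_rate * first_line
    first_line = retention * (first_line - switched) + new_starts
    second_line = retention * (second_line + switched)
    if year in (2020, 2030):
        total = first_line + second_line
        print(f"{year}: {total/1e6:.1f} M on ART, "
              f"{second_line/1e6:.1f} M ({second_line/total:.1%}) on second-line")
```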