623 results for "at risk" for school failure
Abstract:
We examine how firm characteristics, particularly the degree of firm complexity and the firm's need for specialist knowledge, affect the relationship between corporate governance and the risk of bankruptcy. We find that having larger boards reduces the risk of bankruptcy only for complex firms. Our results also suggest that the proportion of inside directors on the board is inversely associated with the risk of bankruptcy in firms that require more specialist knowledge, and that the reverse is true in technically unsophisticated firms. The results further reveal that the additional explanatory power from corporate governance variables becomes stronger as the time to bankruptcy increases, implying that although corporate governance variables are important predictors, governance changes are likely to come too late to save a firm on the verge of bankruptcy.
Abstract:
The ambiguity acceptance test is an important quality control procedure in high-precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, their threshold determination is still not well understood. Currently, the threshold is determined with either an empirical approach or the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. Hence, the key to the threshold determination problem is how to determine the threshold efficiently and in a reasonable way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach through a modeling procedure and an approximation procedure. The modeling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modeling and approximation errors are analysed with simulated data to avoid nuisance biases and the impact of an unrealistic stochastic model. The results indicate that the proposed method can greatly simplify the FF-approach without introducing significant modeling error. The threshold function method makes fixed failure rate threshold determination feasible for real-time applications.
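Two quantities drive the approach described above: the IB success rate, which has the standard closed form P_IB = ∏(2Φ(1/(2σ_{i|I})) − 1) over the conditional standard deviations of the (decorrelated) ambiguity variance-covariance matrix, and a rational function mapping that success rate to a difference-test threshold. A minimal Python sketch of these two ingredients follows; the rational-function coefficients are placeholders assumed for illustration, not values fitted in the paper.

```python
# Hedged sketch of the two building blocks described above, not the paper's code.
import numpy as np
from scipy.linalg import ldl
from scipy.stats import norm

def ib_success_rate(Q_ahat: np.ndarray) -> float:
    """Closed-form integer bootstrapping success rate from the ambiguity
    variance-covariance matrix Q_ahat (decorrelation, e.g. a LAMBDA
    Z-transformation, is assumed to have been applied already)."""
    _, d, _ = ldl(Q_ahat, lower=True)      # conditional variances on the diagonal
    sigmas = np.sqrt(np.diag(d))
    return float(np.prod(2.0 * norm.cdf(1.0 / (2.0 * sigmas)) - 1.0))

def threshold_function(p_success: float, a=(0.25, 1.0), b=(1.0, 4.0)) -> float:
    """Illustrative rational function mapping a success rate to a difference-test
    threshold; the coefficients a and b are hypothetical placeholders that would
    be fitted to meet a chosen fixed failure rate."""
    x = 1.0 - p_success                    # work on the failure-rate side
    return (a[0] + a[1] * x) / (b[0] + b[1] * x)

# Usage: replace the expensive ILS success rate with the IB approximation,
# then read the acceptance-test threshold off the fitted function.
Q = np.diag([0.01, 0.02, 0.015])           # toy ambiguity covariance (cycles^2)
mu = threshold_function(ib_success_rate(Q))
```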
Abstract:
It is well established that the traditional taxonomy and nomenclature of Chironomidae rely on adult males, whose usually characteristic genitalia provide evidence of species distinction. In the early days some names were based on female adults of variable distinctiveness – but females are difficult to identify (Ekrem et al. 2010) and many of these names remain dubious. In Russia especially, a system based on larval morphology grew in parallel to the conventional adult-based system. The systems became reconciled with the studies that underlay the production of the Holarctic generic keys to Chironomidae, commencing notably with the larval volume (Wiederholm, 1983). Ever since Thienemann's pioneering studies, it has been evident that the pupa, notably the cast skins (exuviae), provides a wealth of features that can aid identification (e.g. Wiederholm, 1986). Furthermore, the pupae can be readily associated with name-bearing adults when a pharate ('cloaked') adult stage is visible within the pupa. Association of larvae with the name-bearing later stages has been much more difficult, time-consuming and fraught with risk of failure. Yet it is identification of the larval stage that is needed by most applied researchers, owing to the value of the immature stages of the family in aquatic monitoring for water quality, although the pupal stage also has advocates (reviewed by Sinclair & Gresens, 2008). Few use the adult stage for such purposes, because its provenance and association with the water body can be verified only by emergence trapping, and sampling of adults lies outside regular aquatic monitoring protocols.
Abstract:
Deprivation has previously been shown to be an independent risk factor for the high prevalence of malnutrition observed in COPD (Collins et al., 2010). It has been suggested that the socioeconomic gradient observed in COPD is greater than in any other chronic disease (Prescott & Vestbo, 1999). The current study aimed to examine the influence of disease severity and social deprivation on malnutrition risk in outpatients with COPD. 424 COPD outpatients were screened using the 'Malnutrition Universal Screening Tool' ('MUST'). COPD disease severity was recorded in accordance with the GOLD criteria, and deprivation was established according to the patient's geographical location (postcode) at the time of nutritional screening using the UK Government's Index of Multiple Deprivation (IMD). IMD ranks postcodes from 1 (most deprived) to 32,482 (least deprived). Disease severity was positively associated with an increased prevalence of malnutrition risk (p < 0.001) both within and between groups, whilst rank IMD was negatively associated with malnutrition (p = 0.020), i.e. those residing in less deprived areas were less likely to be malnourished. Within each category of disease severity the prevalence of malnutrition was two-fold greater in those residing in the most deprived areas compared to those residing in the least deprived areas. This study suggests that deprivation and disease severity are independent risk factors for malnutrition in COPD, both contributing to the widely variable prevalence of malnutrition. Consideration of these issues could assist with the targeted nutritional management of these patients.
Abstract:
This thesis is a comprehensive study of deformation and failure mechanisms in bone at the nano- and micro-scale. It explores the mechanical behaviour of osteopontin-hydroxyapatite interfaces and mineralized collagen fibril arrays through atomistic molecular dynamics and finite element simulations. The thesis identifies key factors contributing to the excellent material properties of bone and provides guidelines for the development of new artificial biological materials and medical implants.
Abstract:
Through an examination of Wallace v Kam, this article considers and evaluates the law of causation in the specific context of a medical practitioner’s duty to provide information to patients concerning material risks of treatment. To supply a contextual background for the analysis which follows, Part II summarises the basic principles of causation law, while Part III provides an overview of the case and the reasoning adopted in the decisions at first instance and on appeal. With particular emphasis upon the reasoning in the courts of appeal, Part IV then examines the implications of the case in the context of other jurisprudence in this field and, in so doing, provides a framework for a structured consideration of causation issues in future non-disclosure cases under the Australian civil liability legislation. As will become clear, Wallace was fundamentally decided on the basis of policy reasoning centred upon the purpose behind the legal duty violated. Although the plurality in Rogers v Whitaker rejected the utility of expressions such as ‘the patient’s right of self-determination’ in this context, some Australian jurisprudence may be thought to frame the practitioner’s duty to warn in terms of promoting a patient’s autonomy, or right to decide whether to submit to treatment proposed. Accordingly, the impact of Wallace upon the protection of this right, and the interrelation between it and the duty to warn’s purpose, is investigated. The analysis in Part IV also evaluates the courts’ reasoning in Wallace by questioning the extent to which Wallace’s approach to liability and causal connection in non-disclosure of risk cases: depends upon the nature and classification of the risk(s) in question; and can be reconciled with the way in which patients make decisions. Finally, Part V adopts a comparative approach by considering whether the same decision might be reached if Wallace was determined according to English law.
Abstract:
Objectives: Few studies have assessed the risk and impact of lymphedema among women treated for endometrial cancer. We aimed to quantify the cumulative incidence of, and risk factors for, developing lymphedema following treatment for endometrial cancer and to estimate absolute risk for individuals. Further, we report unmet needs for help with lymphedema-specific issues. Methods: Women treated for endometrial cancer (n = 1243) were followed up 3–5 years after diagnosis; a subset of 643 completed a follow-up survey that asked about lymphedema and lymphedema-related support needs. We identified a diagnosis of secondary lymphedema from medical records or self-report. Multivariable logistic regression was used to evaluate risk factors and derive absolute risk estimates. Results: Overall, 13% of women developed lymphedema. Risk varied markedly with the number of lymph nodes removed and, to a lesser extent, receipt of adjuvant radiation or chemotherapy treatment, and use of nonsteroidal anti-inflammatory drugs (pre-diagnosis). The absolute risk of developing lymphedema was > 50% for women with 15+ nodes removed and 2–3 additional risk factors, 30–41% for those with 15+ nodes removed plus 0–1 risk factors or 6–14 nodes removed plus 3 risk factors, but ≤ 8% for women with no nodes removed, or 1–5 nodes but no additional risk factors. Over half (55%) of those who developed lymphedema reported unmet need(s), particularly with lymphedema-related costs and pain. Conclusion: Lymphedema is common, experienced by one in eight women following treatment for endometrial cancer. Women who have undergone lymphadenectomy have very high risks of lymphedema and should be informed how to self-monitor for symptoms. Affected women need greater levels of support.
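For readers unfamiliar with how "absolute risk for individuals" falls out of a multivariable logistic model, the sketch below shows the inverse-logit conversion from a linear predictor to a predicted probability. The coefficients and covariate names are invented placeholders for illustration only, not the study's estimates.

```python
# Hedged illustration of reading absolute risk off a fitted logistic model.
import math

def absolute_risk(intercept: float, coefs: dict, covariates: dict) -> float:
    """Inverse logit of the linear predictor = predicted probability (absolute risk)."""
    lp = intercept + sum(coefs[name] * covariates.get(name, 0.0) for name in coefs)
    return 1.0 / (1.0 + math.exp(-lp))

# Hypothetical model: log-odds increase with nodes removed and adjuvant therapy.
coefs = {"nodes_15plus": 2.0, "nodes_6to14": 1.1, "adjuvant_radiation": 0.6,
         "chemotherapy": 0.5, "nsaid_prediagnosis": 0.4}
high_risk_profile = {"nodes_15plus": 1, "adjuvant_radiation": 1, "chemotherapy": 1}
print(absolute_risk(-3.0, coefs, high_risk_profile))   # ≈ 0.52 for this toy profile
```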
Abstract:
Despite ongoing improvements in behaviour change strategies, licensing models and road law enforcement measures, young drivers remain significantly over-represented in fatal and non-fatal road-related crashes. This paper focuses on the safety of those approaching driving age and identifies both high-priority road safety messages and relevant peer-led strategies to guide the development of school programs. It summarises a review of common deliberate risk-taking and non-deliberate unsafe driving behaviours among novice drivers, highlighting risks associated with speeding, driving while fatigued, driving while impaired and carrying passengers. Common beliefs of young people that predict risky driving were reviewed, particularly with consideration of those beliefs that can be operationalised in a behaviour change school program. Key components of adolescent risk behaviour change programs were also reviewed, identifying a number of strategies for incorporation in a school-based behaviour change program, including: a well-structured theoretical design and delivery, thoughtfully considered peer-selection processes, adequate training and supervision of peer facilitators, a process for monitoring and sustainability, and interactive delivery and participant discussions. The research base is then summarised in a program logic model built around the messages and identified curriculum elements, as they may be best operationalised within the licensing and school contexts in Victoria, with further discussion about the current state of knowledge on the evaluation of behaviour change programs and the need for considerable development in program evaluation.
Abstract:
Self-care management is needed for effective management of chronic kidney disease. The main aims of treatment or management of chronic kidney disease are to delay the worsening of kidney function and to prevent or manage co-morbidities. Self-care management is not easy, and patients face many challenges, especially when they cannot get used to a new treatment plan. One of the challenges they face is dietary restriction, which is a very important aspect of any self-care management programme. Chronic kidney disease patients require a low-protein, low-sodium, low-potassium and low-phosphorus diet. There are several strategies patients can undertake to ensure adherence, such as self-monitoring their dietary habits and the types of food consumed using a food diary; involving social support, such as family members and a spouse, to help them adhere to their dietary restrictions; setting goals and providing positive reinforcement when they achieve the targeted goals; joining self-management programmes to equip themselves with the necessary skills so that they can better adhere to the treatment regimen, including dietary restriction; and, lastly, having knowledge about their regimen and using this knowledge to help them understand and improve their adherence.
Abstract:
It is estimated that up to three per 1,000 of the Australian adult population are affected by leg ulcers. Venous ulcers are the most common cause of ulceration in the lower extremities, accounting for approximately 80-85% of leg ulcers. These debilitating and often painful ulcers recur frequently and can affect people of all ages, though the risk increases dramatically with age (approximately 99% of those with venous ulcers in Australia are over 65 years of age). Other risk factors include a family history of leg ulceration, varicose veins, venous disease, phlebitis, deep vein thrombosis, congestive heart failure, obesity, immobility, and previous leg injury. The chronic and recurring nature of venous ulcers, together with reduced quality of life for the patient and the ongoing costs of care, places a significant burden of disease on the patient and the health system...
Abstract:
The World Health Organization identifies road trauma as a major public health issue in every country, most notably among low-to-middle income countries. More than 90% of all road fatalities occur in these countries, although they have only 48% of all registered vehicles [1]. Unprecedented focus has been placed on reducing the global road trauma burden through the United Nations Decade of Action for Road Safety (2011-2020). China is rapidly transitioning from a nation of bicycle riders and pedestrians to one where car ownership and use is increasing. This transition presents important public health, mobility, and safety challenges. Rapid motorisation has resulted in an increased road trauma burden, shouldered disproportionately among the population. Vulnerable road users (bicyclists, pedestrians, and motorcyclists) are of particular concern, representing 70% of all road-related fatalities [1]. Furthermore, those at greatest risk of sustaining a crash-related disability are male, older, less educated, earning a lower income [2] and residing in urban areas [3], with higher fatality rates in north-western poorer provinces [3]. Speeding is a key factor in road crashes in China [1, 4] and is one of two risk factors targeted in the Bloomberg Philanthropies-funded Global Road Safety Program operating in two Chinese cities over five years [5], to which the first author has provided expert advice. However, little evidence exists to help understand the factors underpinning speeding behaviour. Previous research conducted by the authors in Beijing and Hangzhou explored personal, social, and legal factors relating to speeding to assist in better understanding the motivations for non-compliance with speed limits. Qualitative and quantitative research findings indicated that speeding is relatively common, including self-reported travel speeds of greater than 30 km/hour above posted speed limits [6], and that road safety laws and enforcement practices may, in some circumstances, contribute to this [7]. Normative factors were also evident; the roles of friends, family members and driving instructors were influential. Additionally, using social networks to attempt to avoid detection and penalty was reported, thereby potentially reinforcing community perceptions that speeding is acceptable [8, 9]. The authors established strong collaborative links with the Chinese Academy of Sciences and Zhejiang Police College to conduct this research. The first author has worked in both institutions for extended time periods and recognises that research must include an understanding of culturally-relevant issues if road safety is to improve in China. Future collaborations to assist in enhancing our understanding of such issues are welcomed. References [1] World Health Organization. (2009). Global status report on road safety: Time for action. Geneva. [2] Chen, H., Du, W., & Li, N. (2013). The socioeconomic inequality in traffic-related disability among Chinese adults: the application of concentration index. Accident Analysis & Prevention, 55, 101-106. [3] Wang, S. Y., Li, Y. H., Chi, G. B., Xiao, S. Y., Ozanne-Smith, J., Stevenson, M., & Phillips, M. (2008). Injury-related fatalities in China: an under-recognised public-health problem. The Lancet, 372(9651), 1765-1773. [4] He, J., King, M. J., Watson, B., Rakotonirainy, A., & Fleiter, J. J. (2013). Speed enforcement in China: National, provincial and city initiatives and their success. Accident Analysis & Prevention, 50, 282-288.
[5] Bhalla, K., Li, Q., Duan, L., Wang, Y., Bishai, D., & Hyder, A. A. (2013). The prevalence of speeding and drink driving in two cities in China: a mid project evaluation of ongoing road safety interventions. Injury, 44, 49-56. doi:10.1016/S0020-1383(13)70213-4. [6] Fleiter, J. J., Watson, B., & Lennon, A. (2013). Awareness of risky behaviour among Chinese drivers. Peer-reviewed paper presented at the 23rd Canadian Multidisciplinary Road Safety Conference, Montréal, Québec. [7] Fleiter, J. J., Watson, B., Lennon, A., King, M. J., & Shi, K. (2009). Speeding in Australia and China: A comparison of the influence of legal sanctions and enforcement practices on car drivers. Peer-reviewed paper presented at the Australasian Road Safety Research Policing Education Conference, Sydney. [8] Fleiter, J. J., Watson, B., Lennon, A., King, M. J., & Shi, K. (2011). Social influences on drivers in China. Journal of the Australasian College of Road Safety, 22(2), 29-36. [9] Fleiter, J. J., Watson, B., Guan, M. Q., Ding, J. Y., & Xu, C. (2013). Characteristics of Chinese drivers attending a mandatory training course following licence suspension. Peer-reviewed paper presented at Road Safety on Four Continents, Beijing, China.
Abstract:
Introduction Risk factor analyses for nosocomial infections (NIs) are complex. First, because of competing events for NI, the association between risk factors and NI measured using hazard rates may not coincide with the association measured using cumulative probability (risk). Second, patients from the same intensive care unit (ICU), who share the same environmental exposure, are likely to be more similar with regard to risk factors predisposing to NI than patients from different ICUs. We aimed to develop an analytical approach that accounts for both features and to use it to evaluate associations between patient- and ICU-level characteristics and both the rates of NI and competing risks and the cumulative probability of infection. Methods We considered a multicenter database of 159 ICUs containing 109,216 admissions (813,739 admission-days) from the Spanish HELICS-ENVIN ICU network. We analyzed the data using two models: an etiologic model (rate based) and a predictive model (risk based). In both models, random effects (shared frailties) were introduced to assess heterogeneity. Death and discharge without NI were treated as competing events for NI. Results There was large heterogeneity across ICUs in NI hazard rates, which remained after accounting for multilevel risk factors, meaning that unobserved ICU-specific factors influence NI occurrence. Heterogeneity across ICUs in terms of the cumulative probability of NI was even more pronounced. Several risk factors had markedly different associations in the rate-based and risk-based models. For some, the associations differed in magnitude; for example, high Acute Physiology and Chronic Health Evaluation II (APACHE II) scores were associated with modest increases in the rate of nosocomial bacteremia but large increases in the risk. Others differed in sign; for example, respiratory versus cardiovascular diagnostic categories were associated with a reduced rate of nosocomial bacteremia but an increased risk. Conclusions A combination of competing risks and multilevel models is required to understand direct and indirect risk factors for NI and to distinguish patient-level from ICU-level factors.
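The distinction between rate-based and risk-based views hinges on the cumulative incidence function, which accumulates cause-specific hazard increments weighted by overall event-free survival. The sketch below is only this nonparametric building block (an Aalen-Johansen-style estimator) under an assumed data layout; the study's full approach additionally models covariates and ICU-level shared frailties on cause-specific hazards.

```python
# Minimal sketch: cumulative incidence of NI with death/discharge as competing events.
import numpy as np

def cumulative_incidence(times, events, event_of_interest=1):
    """times: admission-to-event days; events: 0 = censored, 1 = NI, 2 = death/discharge."""
    times, events = np.asarray(times), np.asarray(events)
    order = np.argsort(times)
    times, events = times[order], events[order]
    n_at_risk = len(times)
    surv, cif = 1.0, 0.0
    for t in np.unique(times):
        at_t = times == t
        d_interest = np.sum(at_t & (events == event_of_interest))
        d_any = np.sum(at_t & (events != 0))
        cif += surv * d_interest / n_at_risk   # increment from the cause-specific hazard
        surv *= 1.0 - d_any / n_at_risk        # overall event-free survival
        n_at_risk -= np.sum(at_t)              # everyone leaving the risk set at t
    return cif

# Toy usage: three admissions ending in NI, death, and censoring respectively.
print(cumulative_incidence([5, 8, 12], [1, 2, 0]))   # ≈ 0.33
```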
Abstract:
Female greater wax moths Galleria mellonella display by wing fanning in response to bursts of ultrasonic calls produced by males. The temporal and spectral characteristics of these calls show some similarities with the echolocation calls of bats that emit frequency-modulated (FM) signals. Female G. mellonella therefore need to distinguish between the attractive signals of male conspecifics, which may lead to mating opportunities, and similar sounds made by predatory bats. We therefore predicted that (1) females would display in response to playbacks of male calls; (2) females would not display in response to playbacks of the calls of echolocating bats (we used the calls of Daubenton's bat Myotis daubentonii as representative of a typical FM echolocating bat); and (3) when presented with male calls and bat calls during the same time block, females would display more when perceived predation risk was lower. We manipulated predation risk in two ways. First, we varied the intensity of bat calls to represent a nearby (high risk) or distant (low risk) bat. Second, we played back calls of bats searching for prey (low risk) and attacking prey (high risk). All predictions were supported, suggesting that female G. mellonella are able to distinguish conspecific male mating calls from bat calls, and that they modify display rate in relation to predation risk. The mechanism(s) by which the moths separate bat and moth calls must involve temporal cues. Bat and moth signals differ considerably in duration, and differences in duration could be encoded by the moth's nervous system and used in discrimination.
Abstract:
Background Australian subacute inpatient rehabilitation facilities face significant challenges from the ageing population and the increasing burden of chronic disease. Foot disease complications are a negative consequence of many chronic diseases. With the rapid expansion of subacute rehabilitation inpatient services, it seems imperative to investigate the prevalence of foot disease and foot disease risk factors in this population. The primary aim of this cross-sectional study was to determine the prevalence of active foot disease and foot disease risk factors in a subacute inpatient rehabilitation facility. Methods Eligible participants were all adults admitted at least overnight to a large Australian subacute inpatient rehabilitation facility over two different four-week periods. Consenting participants underwent a short non-invasive foot examination by a podiatrist utilising the validated Queensland Health High Risk Foot Form to collect data on age, sex, medical co-morbidity history, foot disease risk factor history, and clinically diagnosed foot disease complications and foot disease risk factors. Descriptive statistics were used to determine the prevalence of clinically diagnosed foot disease complications, foot disease risk factors and groups of foot disease risk factors. Logistic regression analyses were used to investigate associations between defined explanatory variables and appropriate foot disease outcome variables. Results Overall, 85 (88%) of 97 people admitted to the facility during the study periods consented; mean age was 80 (±9) years and 71% were female. The prevalence (95% confidence interval) of participants with active foot disease was 11.8% (6.3 – 20.5), 32.9% (23.9 – 43.5) had multiple foot disease risk factors, and overall, 56.5% (45.9 – 66.5) had at least one foot disease risk factor. A self-reported history of peripheral neuropathy diagnosis was independently associated with having multiple foot disease risk factors (OR 13.504, p = 0.001). Conclusion This study highlights the potential significance of the burden of foot disease in subacute inpatient rehabilitation facilities. In this study, one in eight subacute inpatients was admitted with active foot disease and one in two with at least one foot disease risk factor. Further multi-site studies and management guidelines are recommended to address the foot disease burden in subacute inpatient rehabilitation facilities. Keywords: Subacute; Inpatient; Foot; Complication; Prevalence
Abstract:
Background Cardiovascular disease and mental health both hold enormous public health importance, ranking highly in the results of the recent Global Burden of Disease Study 2010 (GBD 2010). For the first time, the GBD 2010 has systematically and quantitatively assessed major depression as an independent risk factor for the development of ischemic heart disease (IHD) using comparative risk assessment methodology. Methods A pooled relative risk (RR) was calculated from studies identified through a systematic review with strict inclusion criteria designed to provide evidence of independent risk factor status. Accepted case definitions of depression included diagnosis by a clinician or by non-clinician raters adhering to Diagnostic and Statistical Manual of Mental Disorders (DSM) or International Classification of Diseases (ICD) classifications. We therefore refer to the exposure in this paper as major depression, as opposed to the DSM-IV category of major depressive disorder (MDD). The population attributable fraction (PAF) was calculated using the pooled RR estimate. Attributable burden was calculated by multiplying the PAF by the underlying burden of IHD estimated as part of GBD 2010. Results The pooled relative risk of developing IHD in those with major depression was 1.56 (95% CI 1.30 to 1.87). Globally there were almost 4 million estimated IHD disability-adjusted life years (DALYs) attributable to major depression in 2010: 3.5 million years of life lost and 250,000 years lived with disability. These findings highlight a previously underestimated mortality component of the burden of major depression. As a proportion of overall IHD burden, 2.95% (95% CI 1.48 to 4.46%) of IHD DALYs were estimated to be attributable to major depression in 2010. Eastern Europe and North Africa/Middle East demonstrated the highest proportions, with high-income Asia Pacific representing the lowest. Conclusions The present work comprises the most robust systematic review of its kind to date. The key finding that major depression may be responsible for approximately 3% of global IHD DALYs warrants assessment for depression in patients at high risk of developing IHD or at risk of a repeat IHD event.
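The attributable-burden arithmetic described in the Methods follows standard comparative risk assessment steps. Below is a minimal sketch using Levin's closed-form PAF; the depression prevalence and total IHD DALY figures are illustrative assumptions, not values reported by the study, while the pooled RR (1.56) is taken from the abstract.

```python
# Hedged sketch of the PAF and attributable-burden calculation described above.
pooled_rr = 1.56                 # pooled relative risk of IHD given major depression
prevalence = 0.045               # ASSUMED population prevalence of major depression

# Levin's formula for the population attributable fraction (PAF)
paf = prevalence * (pooled_rr - 1) / (prevalence * (pooled_rr - 1) + 1)

total_ihd_dalys = 130_000_000    # ASSUMED global IHD DALY total, for illustration only
attributable_dalys = paf * total_ihd_dalys
print(f"PAF = {paf:.2%}, attributable DALYs = {attributable_dalys:,.0f}")
```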