330 results for Simulation of DC-DC converters
Abstract:
Introduction The last half-century of epidemiological enquiry into schizophrenia can be characterized by the search for neurological imbalances and lesions, and for genetic factors. The growing consensus is that these directions have failed, and there is now growing interest in psychosocial and developmental models. Another area of recent interest is epigenetics – the modulation of genetic influences by environmental factors. Methods This integrative review comparatively maps current psychosocial, developmental and epigenetic models of schizophrenia epidemiology to identify crossover and theoretical gaps. Results In the flood of data being produced around schizophrenia epidemiology, one of the most consistent findings is that schizophrenia is an urban syndrome. Once demographic factors have been discounted, between one-quarter and one-third of all incidence is repeatedly traced back to urbanicity – potentially threatening more established models, such as the psychosocial, genetic and developmental hypotheses. Conclusions Close analysis demonstrates how current models of schizophrenia epidemiology appear to miss the mark. Furthermore, the built environment appears to be an inextricable factor in all current models and may indeed be a valid epidemiological factor in its own right. The reason the built environment has not already become a de rigueur area of epidemiological research is possibly trivial – it simply does not attract enough scientific attention, and lacks a hero to promote it alongside the other hypotheses.
Abstract:
Surveying threatened and invasive species to obtain accurate population estimates is an important but challenging task that requires a considerable investment in time and resources. Estimates using existing ground-based monitoring techniques, such as camera traps and surveys performed on foot, are known to be resource intensive, potentially inaccurate and imprecise, and difficult to validate. Recent developments in unmanned aerial vehicles (UAVs), artificial intelligence and miniaturized thermal imaging systems represent a new opportunity for wildlife experts to inexpensively survey relatively large areas. The system presented in this paper includes thermal image acquisition as well as a video processing pipeline that performs object detection, classification and tracking of wildlife in forested or open areas. The system was tested on thermal video data from ground-based and test-flight footage, and was found to detect all of the target wildlife located in the surveyed area. The system is flexible in that the user can readily define the types of objects to classify and the object characteristics that should be considered during classification.
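For readers curious how the detection stage of such a pipeline can work, the sketch below thresholds a single 8-bit thermal frame and extracts warm blobs with OpenCV; the threshold value, minimum blob area and OpenCV-based approach are illustrative assumptions, not the authors' implementation. Classification and frame-to-frame tracking, using the user-defined object characteristics mentioned in the abstract, would then operate on the returned boxes.

```python
# Minimal sketch: warm-body detection in a single thermal frame.
# The threshold, minimum area and OpenCV approach are illustrative
# assumptions, not the pipeline described in the paper.
import cv2
import numpy as np

def detect_warm_objects(frame_gray, temp_threshold=200, min_area=25):
    """Return bounding boxes of warm blobs in an 8-bit thermal frame."""
    # Pixels hotter than the threshold become candidate animals.
    _, mask = cv2.threshold(frame_gray, temp_threshold, 255, cv2.THRESH_BINARY)
    # Connected warm regions become candidate detections.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        if cv2.contourArea(c) >= min_area:  # drop sensor noise / small debris
            boxes.append(cv2.boundingRect(c))  # (x, y, w, h)
    return boxes

# Synthetic 8-bit "thermal" frame with one warm blob for demonstration.
frame = np.zeros((240, 320), dtype=np.uint8)
cv2.circle(frame, (160, 120), 6, 255, -1)
print(detect_warm_objects(frame))  # e.g. [(154, 114, 13, 13)]
```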
Abstract:
- Objective To compare health service cost and length of stay between a traditional and an accelerated diagnostic approach to assess acute coronary syndromes (ACS) among patients who presented to the emergency department (ED) of a large tertiary hospital in Australia. - Design, setting and participants This historically controlled study analysed data collected from two independent patient cohorts presenting to the ED with potential ACS. The first cohort of 938 patients was recruited in 2008–2010, and these patients were assessed using the traditional diagnostic approach detailed in the national guideline. The second cohort of 921 patients was recruited in 2011–2013 and was assessed with the accelerated diagnostic approach named the Brisbane protocol. The Brisbane protocol applied early serial troponin testing at 0 and 2 h after presentation to the ED, compared with 0 and 6 h testing in the traditional assessment process. The Brisbane protocol also defined a low-risk group of patients in whom no objective testing was performed. A decision tree model was used to compare the expected cost and length of stay in hospital between the two approaches. Probabilistic sensitivity analysis was used to account for model uncertainty. - Results Compared with the traditional diagnostic approach, the Brisbane protocol was associated with a reduced expected cost of $1229 (95% CI −$1266 to $5122) and a reduced expected length of stay of 26 h (95% CI −14 to 136 h). The Brisbane protocol allowed physicians to discharge a higher proportion of low-risk and intermediate-risk patients from the ED within 4 h (72% vs 51%). Results from the sensitivity analysis suggested the Brisbane protocol had a high probability of being both cost-saving and time-saving. - Conclusions This study provides some evidence of cost savings from a decision to adopt the Brisbane protocol. Benefits would arise for the hospital, and for patients and their families.
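As a rough illustration of the decision-tree comparison described above, the sketch below computes the expected cost of two assessment strategies and runs a simple Monte Carlo probabilistic sensitivity analysis; every probability and cost is an invented placeholder, not data from the study.

```python
# Illustrative decision-tree expected-cost comparison with probabilistic
# sensitivity analysis. All numbers are invented placeholders, not the
# study's data.
import numpy as np

rng = np.random.default_rng(0)

def expected_cost(p_early_discharge, cost_early, cost_full_workup):
    # Two-branch tree: early discharge vs full inpatient workup.
    return p_early_discharge * cost_early + (1 - p_early_discharge) * cost_full_workup

# Point estimates (placeholders).
traditional = expected_cost(0.51, 800.0, 5200.0)
accelerated = expected_cost(0.72, 800.0, 5200.0)
print(f"Expected saving: ${traditional - accelerated:.0f}")

# Probabilistic sensitivity analysis: sample parameters from plausible
# distributions (beta for probabilities, gamma for costs) and report the
# proportion of simulations in which the accelerated pathway is cheaper.
n = 10_000
p_trad = rng.beta(51, 49, n)        # ~0.51 discharged within 4 h
p_accel = rng.beta(72, 28, n)       # ~0.72 discharged within 4 h
c_early = rng.gamma(16, 50.0, n)    # mean ~$800
c_full = rng.gamma(26, 200.0, n)    # mean ~$5200
saving = expected_cost(p_trad, c_early, c_full) - expected_cost(p_accel, c_early, c_full)
print(f"P(accelerated is cost-saving) = {(saving > 0).mean():.2f}")
```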
Abstract:
This study identified the areas of poor specificity in national injury hospitalization data and the areas of improvement and deterioration in specificity over time. A descriptive analysis of ten years of national hospital discharge data for Australia, from July 2002 to June 2012, was performed. Proportions and percentage changes of defined/undefined codes over time were examined. At the intent block level, accidents and assault were the most poorly defined, with over 11% of codes undefined in each block. The mechanism blocks for accidents showed a significant deterioration in specificity over time, with up to 20% more undefined codes in some mechanisms. Place and activity were poorly defined at the broad block level (43% and 72% undefined, respectively). Private hospitals and hospitals in very remote locations recorded the highest proportions of undefined codes. Those aged over 60 years and females had higher proportions of undefined code usage. This study has identified significant, and worsening, deficiencies in the specificity of coded injury data in several areas. Focused attention is needed to improve the quality of injury data, especially in the areas identified in this study, to provide the evidence base needed to address the significant burden of injury in the Australian community.
Abstract:
Objective Foodborne illnesses in Australia, including salmonellosis, are estimated to cost over $A1.25 billion annually. Weather has been identified as influential on salmonellosis incidence, as cases increase during summer; however, time series modelling of salmonellosis is challenging because outbreaks cause strong autocorrelation. This study assesses whether a switching model is an improved method of estimating weather–salmonellosis associations. Design We analysed weather and salmonellosis in South-East Queensland between 2004 and 2013 using 2 common regression models and a switching model, each with 21-day lags for temperature and precipitation. Results The switching model best fit the data, as judged by its substantial improvement in deviance information criterion over the regression models, its less autocorrelated residuals and its control of seasonality. The switching model estimated that a 5°C increase in mean temperature and a 10 mm increase in precipitation were associated with increases in salmonellosis cases of 45.4% (95% CrI 40.4%, 50.5%) and 24.1% (95% CrI 17.0%, 31.6%), respectively. Conclusions Switching models improve on traditional time series models in quantifying weather–salmonellosis associations. A better understanding of how temperature and precipitation influence salmonellosis may identify where interventions can be made to lower the health and economic costs of salmonellosis.
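The study's switching model is Bayesian (hence the credible intervals) and is beyond a short sketch; the snippet below instead illustrates the simpler lagged Poisson regression setup that such models are compared against, on synthetic data, including how a log-linear coefficient converts into the percent increase per 5°C reported above. All variable names and values are assumptions.

```python
# Illustrative lagged Poisson regression of daily case counts on weather,
# using synthetic data. This mirrors the baseline regression models, not
# the Bayesian switching model the study proposes.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
days = pd.date_range("2004-01-01", periods=730, freq="D")
temp = 25 + 5 * np.sin(2 * np.pi * np.arange(730) / 365) + rng.normal(0, 1, 730)
rain = rng.gamma(0.5, 4.0, 730)

df = pd.DataFrame({"temp": temp, "rain": rain}, index=days)
# 21-day lagged exposures, echoing the study design.
df["temp_lag21"] = df["temp"].rolling(21).mean()
df["rain_lag21"] = df["rain"].rolling(21).mean()
df["cases"] = rng.poisson(np.exp(0.5 + 0.07 * df["temp_lag21"].fillna(25)))
df = df.dropna()

X = sm.add_constant(df[["temp_lag21", "rain_lag21"]])
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
# Percent change in cases per +5 degrees C, from the log-linear coefficient.
print(f"+5C effect: {100 * (np.exp(5 * fit.params['temp_lag21']) - 1):.1f}%")
```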
Abstract:
- Introduction Malaria cases have dwindled in Bhutan, which aims to eliminate malaria by 2016. The aims of this study are to determine the trends and burden of malaria, the costs of intensified control activities, the main donors funding the control activities and the costs of different preventive measures in the pre-elimination phase (2006-2014). - Methods A descriptive analysis of malaria surveillance data from 2006-2014 was carried out, using data from the Vector-borne Disease Control Programme (VDCP), Bhutan. Malaria morbidity and mortality among local Bhutanese and foreign nationals were analysed. The costs of different control and preventive measures were calculated, and the average number of long-lasting insecticidal nets (LLINs) per person was estimated. - Findings There were 5,491 confirmed malaria cases from 2006 to 2014. By 2013, there was an average of one LLIN for every 1.51 individuals. The Global Fund was the main international donor, accounting for >80% of total funds. The cost of procuring LLINs accounted for >90% of the total cost of preventive measures. - Interpretation The malaria burden reduced significantly over the study period, with high coverage of LLINs in Bhutan. The foreseeable challenges that require national attention to maintain malaria-free status after elimination are importation of malaria, particularly from India; continued protection of the population in endemic districts through complete coverage with LLINs and IRS; and exploration of local funding modalities post-elimination in the event of a reduction in international funding.
Abstract:
Background and Aims Considerable variation has been documented in fleet safety interventions' ability to create lasting behavioural change, and research has neglected to consider employees' perceptions regarding the effectiveness of fleet interventions. This is a critical oversight, as employees' beliefs and acceptance levels (as well as the perceived organisational commitment to safety) can ultimately influence levels of effectiveness, and this study aimed to examine such perceptions in Australian fleet settings. Method 679 employees sourced from four Australian organisations completed a safety climate questionnaire and provided perspectives on the effectiveness of 35 different safety initiatives. Results Countermeasures that were perceived as most effective were a mix of human- and engineering-based approaches: - (a) purchasing safer vehicles; - (b) investigating serious vehicle incidents; and - (c) practical driver skills training. In contrast, the least effective countermeasures were considered to be: - (a) signing a promise card; - (b) advertising a company's phone number on the back of cars for complaints and compliments; and - (c) communicating the cost benefits of road safety to employees. No significant differences in employee perceptions were identified based on age, gender, employees' self-reported crash involvement or employees' self-reported traffic infringement history. Perceptions of safety climate were identified as "moderate" but were not linked to self-reported crash or traffic infringement history. However, higher levels of safety climate were positively correlated with the perceived effectiveness of some interventions. Conclusion Taken together, employees believed occupational road safety risks could best be managed by the employer implementing a combination of engineering and human resource initiatives to enhance road safety. This paper further outlines the key findings with regard to practice and provides direction for future research.
Abstract:
Background Medication safety is a pressing concern for residential aged care facilities (RACFs). Retrospective studies in RACF settings identify inadequate communication between RACFs, doctors, hospitals and community pharmacies as the major cause of medication errors. The existing literature offers limited insight into the gaps in the existing information exchange process that may lead to medication errors. The aim of this research was to explicate the distributed cognition that underlies RACF medication ordering and delivery, in order to identify the gaps in medication-related information exchange that lead to medication errors in RACFs. Methods The study was undertaken in three RACFs in Sydney, Australia. Data were generated through ethnographic fieldwork over a period of five months (May–September 2011). Triangulated analysis of the data primarily focused on examining the transformation and exchange of information between different media across the process. Results The findings of this study highlight the extensive scope and intense nature of information exchange in RACF medication ordering and delivery. Rather than attributing error to individual care providers, the explication of distributed cognition processes enabled the identification of gaps in three information exchange dimensions which potentially contribute to the occurrence of medication errors, namely: (1) the design of medication charts, which complicates order processing and record keeping; (2) the lack of coordination mechanisms between participants, which results in misalignment of local practices; and (3) reliance on restricted-bandwidth communication channels, mainly telephone and fax, which complicates information processing requirements. The study demonstrates how the identification of these gaps enhances understanding of medication errors in RACFs. Conclusions Applying the theoretical lens of distributed cognition can enhance our understanding of medication errors in RACFs through the identification of gaps in information exchange. Understanding the dynamics of the cognitive process can inform the design of interventions to manage errors and improve residents' safety.
Abstract:
Background The Global Burden of Diseases, Injuries, and Risk Factors study (GBD) used the disability-adjusted life year (DALY) to quantify the burden of diseases, injuries, and risk factors. This paper provides an overview of injury estimates from the 2013 update of GBD, with detailed information on incidence, mortality, DALYs and rates of change from 1990 to 2013 for 26 causes of injury, globally, by region and by country. Methods Injury mortality was estimated using the extensive GBD mortality database, corrections for ill-defined causes of death and the cause-of-death ensemble modelling tool. Morbidity estimation was based on inpatient and outpatient data sets, 26 cause-of-injury and 47 nature-of-injury categories, and seven follow-up studies with patient-reported long-term outcome measures. Results In 2013, 973 million (uncertainty interval (UI) 942 to 993) people sustained injuries that warranted some type of healthcare and 4.8 million (UI 4.5 to 5.1) people died from injuries. Between 1990 and 2013 the global age-standardised injury DALY rate decreased by 31% (UI 26% to 35%). The rate of decline in DALY rates was significant for 22 cause-of-injury categories, including all the major injuries. Conclusions Injuries continue to be an important cause of morbidity and mortality in the developed and developing world. The decline in rates for almost all injuries is so prominent that it warrants a general statement that the world is becoming a safer place to live in. However, the patterns vary widely by cause, age, sex, region and time, and there are still large improvements that need to be made.
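As background for the age-standardised rates quoted above, direct standardisation weights age-specific rates by a fixed standard population; a minimal sketch follows, with invented age groups, weights and rates (GBD uses its own world standard population).

```python
# Minimal sketch of direct age-standardisation and percent decline.
# Age groups, weights and rates are invented for illustration; GBD uses
# its own world standard population.
import numpy as np

std_weights = np.array([0.35, 0.40, 0.25])      # standard population shares
rate_1990 = np.array([900.0, 1400.0, 2600.0])   # DALYs per 100,000 by age group
rate_2013 = np.array([600.0, 1000.0, 1800.0])

asr_1990 = np.sum(std_weights * rate_1990)  # age-standardised rate, 1990
asr_2013 = np.sum(std_weights * rate_2013)
decline = 100 * (asr_1990 - asr_2013) / asr_1990
print(f"ASR 1990 = {asr_1990:.0f}, ASR 2013 = {asr_2013:.0f}, decline = {decline:.0f}%")
```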
Abstract:
The intermittently rivet-fastened Rectangular Hollow Flange Channel Beam (RHFCB) is a new cold-formed hollow section proposed as an alternative to welded hollow flange channel beams. It is a monosymmetric channel section made by intermittently rivet-fastening two torsionally rigid rectangular hollow flanges to a web plate. This process enables end users to choose an effective combination of different web and flange plate sizes to achieve optimum design capacities. Recent research has focused mainly on the shear behaviour of the most commonly used lipped channel beam and welded hollow flange beam sections; the shear behaviour of the rivet-fastened RHFCB, however, has not been investigated. Therefore, a detailed experimental study involving 24 shear tests was undertaken to investigate the shear behaviour and capacities of rivet-fastened RHFCBs. Simply supported test specimens of RHFCBs with aspect ratios of 1.0 and 1.5 were loaded at mid-span until failure. Comparison of the experimental shear capacities with corresponding predictions from the current Australian cold-formed steel design rules showed that the current rules are very conservative for the shear design of rivet-fastened RHFCBs. Significant improvements to web shear buckling capacity occurred due to the presence of the rectangular hollow flanges, and considerable post-buckling strength was also observed. These enhancements to the shear behaviour and capacity were achieved with a rivet spacing of 100 mm. Improved design rules were proposed for rivet-fastened RHFCBs based on the current shear design equations in AISI S100 and the direct strength method. This paper presents the details of this experimental investigation and the results.
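For context on the design rules mentioned above, the sketch below evaluates the direct strength method shear equations of AISI S100 (without tension field action) for a single web plate; the section dimensions and material properties are placeholders, and the improved rules proposed in the paper are not reproduced here.

```python
# Sketch of the AISI S100 direct strength method (DSM) shear check for a
# web plate, without tension field action. Dimensions and material values
# are placeholders; the paper's improved rules are not reproduced here.
import math

def dsm_shear_capacity(h, t, fy, e=200e3, nu=0.3, kv=5.34):
    """Nominal shear capacity Vn (N) of a web h x t (mm); fy, e in MPa."""
    aw = h * t                                 # web shear area
    vy = 0.6 * fy * aw                         # shear yield load
    vcr = (math.pi**2 * e * kv * aw) / (12 * (1 - nu**2) * (h / t)**2)
    lam = math.sqrt(vy / vcr)                  # shear slenderness
    if lam <= 0.815:
        return vy                              # yielding governs
    elif lam <= 1.227:
        return 0.815 * math.sqrt(vcr * vy)     # inelastic shear buckling
    return vcr                                 # elastic shear buckling

# Placeholder web: 200 mm deep, 1.5 mm thick, 450 MPa steel.
print(f"Vn = {dsm_shear_capacity(200, 1.5, 450) / 1e3:.1f} kN")
```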
Abstract:
The modern diet has become highly sweetened, resulting in unprecedented levels of sugar consumption, particularly among adolescents. While chronic long-term sugar intake is known to contribute to the development of metabolic disorders, including obesity and type II diabetes, little is known regarding the direct consequences of long-term, binge-like sugar consumption on the brain. Because sugar can cause the release of dopamine in the nucleus accumbens (NAc) similarly to drugs of abuse, we investigated changes in the morphology of neurons in this brain region following short-term (4 weeks) and long-term (12 weeks) binge-like sucrose consumption using an intermittent two-bottle choice paradigm. We used Golgi-Cox staining to impregnate medium spiny neurons (MSNs) from the NAc core and shell of short- and long-term sucrose-consuming rats and compared these with age-matched water controls. We show that prolonged binge-like sucrose consumption significantly decreased the total dendritic length of NAc shell MSNs compared with age-matched control rats. We also found that the restructuring of these neurons resulted primarily from reduced distal dendritic complexity. Conversely, we observed increased spine densities at the distal branch orders of NAc shell MSNs from long-term sucrose-consuming rats. Combined, these results highlight the effects of prolonged binge-like sucrose intake on NAc shell MSN morphology.
Abstract:
Background Despite evidence from overseas that certification and credentialing of infection control professionals (ICPs) is important to patient outcomes, there are no standardized requirements for the education and preparation of ICPs in Australia. A credentialing process (now managed by the Australasian College of Infection Prevention and Control) has been in existence since 2000; however, no evaluation of it has occurred. Methods A cross-sectional study design was used to identify the perceived barriers to credentialing and the characteristics of credentialed ICPs. Results There were 300 responses received; 45 (15%) of participants were credentialed. Noncredentialed ICPs identified the main barriers to credentialing as the absence of an employer requirement and of any associated remuneration. In general, credentialed ICPs were more likely to hold higher degrees and to have more infection control experience than their noncredentialed colleagues. Conclusions The credentialing process itself may assist in supporting ICP development by providing an opportunity for reflection and feedback through peer review. Further, the process may assist ICPs in being flexible and adaptable to the challenging and ever-changing environment that is infection control.
Abstract:
Objective: To systematically review studies reporting the prevalence in general adult inpatient populations of foot disease disorders (foot wounds, foot infections, collective ‘foot disease’) and risk factors (peripheral arterial disease (PAD), peripheral neuropathy (PN), foot deformity). Methods: A systematic review of studies published between 1980 and 2013 was undertaken using electronic databases (MEDLINE, EMBASE and CINAHL). Keywords and synonyms relating to prevalence, inpatients, foot disease disorders and risk factors were used. Studies reporting foot disease or risk factor prevalence data in general inpatient populations were included. Included studies' reference lists and citations were searched, and experts were consulted, to identify additional relevant studies. 2 authors, blinded to each other, assessed the methodological quality of included studies. Applicable data were extracted by 1 author and checked by a second author. Prevalence proportions and SEs were calculated for all included studies. Pooled prevalence estimates were calculated using random-effects models where at least 3 eligible studies were available. Results: Of the 4972 studies initially identified, 78 studies reporting 84 different cohorts (total 60 231 517 participants) were included. Foot disease prevalence included: foot wounds 0.01–13.5% (70 cohorts), foot infections 0.05–6.4% (7 cohorts), collective foot disease 0.2–11.9% (12 cohorts). Risk factor prevalence included: PAD 0.01–36.0% (10 cohorts), PN 0.003–2.8% (6 cohorts); foot deformity was not reported. Pooled prevalence estimates could only be calculated for pressure ulcer-related foot wounds 4.6% (95% CI 3.7% to 5.4%), diabetes-related foot wounds 2.4% (1.5% to 3.4%), diabetes-related foot infections 3.4% (0.2% to 6.5%) and diabetes-related foot disease 4.7% (0.3% to 9.2%). Heterogeneity was high in all pooled estimates (I2=94.2–97.8%, p<0.001). Conclusions: This review found high heterogeneity, yet suggests foot disease was present in 1 in every 20 inpatients and a major risk factor in 1 in every 3 inpatients. These findings are likely an underestimate, and more robust studies are required to provide more precise estimates.
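The pooled estimates above come from random-effects models; the sketch below illustrates DerSimonian–Laird pooling of prevalence proportions, with Cochran's Q and I² for heterogeneity, on three invented cohorts rather than data from the review.

```python
# Minimal DerSimonian-Laird random-effects pooling of prevalence
# proportions. The three cohorts below are invented for illustration,
# not data from the review.
import numpy as np

events = np.array([46, 12, 30])
n = np.array([1000, 500, 650])
p = events / n
var = p * (1 - p) / n                    # binomial variance of each proportion

w_fixed = 1 / var
p_fixed = np.sum(w_fixed * p) / np.sum(w_fixed)
q = np.sum(w_fixed * (p - p_fixed) ** 2)  # Cochran's Q
k = len(p)
tau2 = max(0.0, (q - (k - 1)) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))

w_rand = 1 / (var + tau2)                # random-effects weights
p_pooled = np.sum(w_rand * p) / np.sum(w_rand)
se = np.sqrt(1 / np.sum(w_rand))
i2 = max(0.0, (q - (k - 1)) / q) * 100   # heterogeneity statistic, %
print(f"pooled = {p_pooled:.3f} "
      f"(95% CI {p_pooled - 1.96*se:.3f} to {p_pooled + 1.96*se:.3f}), I2 = {i2:.0f}%")
```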
Abstract:
Quantification of pyridoxal-5′-phosphate (PLP) in biological samples is challenging due to the presence of endogenous PLP in the matrices used for preparation of calibrators and quality control samples (QCs). Hence, we have developed an LC-MS/MS method for accurate and precise measurement of PLP concentrations in samples (20 µL) of human whole blood that addresses this issue by using a surrogate matrix and minimizing the matrix effect. We used a surrogate matrix comprising 2% bovine serum albumin (BSA) in phosphate-buffered saline (PBS) for preparing calibrators and QCs, and the concentrations were adjusted to account for the endogenous PLP concentration in the surrogate matrix according to the method of standard addition. PLP was separated from the other components of the sample matrix using protein precipitation with 10% w/v trichloroacetic acid. After centrifugation, the supernatants were injected directly into the LC-MS/MS system. Calibration curves were linear and recovery was >92%. QCs were accurate, precise, and stable over four freeze-thaw cycles and following storage at room temperature for 17 h or at -80 °C for 3 months. There was no significant matrix effect across 9 different individual human blood samples. Our novel LC-MS/MS method satisfied all of the criteria specified in the 2012 EMEA guideline on bioanalytical method validation.
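The method of standard addition mentioned above can be illustrated with a short calculation: the instrument response is regressed against the spiked concentration, and the endogenous concentration is recovered from the x-intercept. The numbers below are invented, not data from the validated method.

```python
# Standard-addition sketch: regress response on added concentration and
# recover the endogenous level from the x-intercept. Numbers are invented,
# not data from the validated method.
import numpy as np

added = np.array([0.0, 10.0, 20.0, 40.0])            # spiked PLP, nmol/L
signal = np.array([1520.0, 2490.0, 3530.0, 5510.0])  # peak-area ratios

slope, intercept = np.polyfit(added, signal, 1)
endogenous = intercept / slope                        # |x-intercept| of the fit line
print(f"endogenous PLP ~ {endogenous:.1f} nmol/L")
```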
Abstract:
Background Breast cancer (BC) is primarily considered a genetic disorder with a complex interplay of factors including age, gender, ethnicity, family history, personal history and lifestyle, with associated hormonal and non-hormonal risk factors. The SNP rs2910164 in miR146a (a G to C polymorphism) was previously associated with increased risk of BC in cases carrying at least a single copy of the C allele, though results in other cancers and populations have shown significant variation. Methods In this study, we examined this SNP in an Australian sporadic breast cancer population of 160 cases and matched controls, with a replicate population of 403 breast cancer cases, using high-resolution melting analysis. Results Our analysis indicated that the rs2910164 polymorphism is associated with breast cancer risk in both the primary and replicate populations (p = 0.03 and 0.0013, respectively). In contrast to the results of familial breast cancer studies, however, we found that the presence of the G allele of rs2910164 is associated with increased cancer risk, with an OR of 1.77 (95% CI 1.40–2.23). Conclusions The microRNA miR146a has a potential role in the development of breast cancer, and the effects of its SNPs require further inquiry to determine the nature of their influence on breast tissue and cancer.
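As a worked illustration of the reported effect size, the sketch below computes an allelic odds ratio and a Wald 95% CI from a 2×2 allele-by-status table; the counts are invented for illustration and are not the study's genotype data.

```python
# Odds ratio and Wald 95% CI from a 2x2 allele-by-status table.
# Counts are invented for illustration, not the study's genotype data.
import math

a, b = 210, 110   # G allele: cases, controls
c, d = 110, 102   # C allele: cases, controls

or_ = (a * d) / (b * c)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
log_or = math.log(or_)
lo = math.exp(log_or - 1.96 * se)
hi = math.exp(log_or + 1.96 * se)
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```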