458 results for penalized likelihood
Abstract:
Hamstring strain injuries (HSIs) are the most prevalent injury in a number of sports, and while anterior cruciate ligament (ACL) injuries are less common, they are far more severe and have long-term implications, such as an increased risk of developing osteoarthritis later in life. Given the high incidence and severity of these injuries, they are key targets of injury prevention programs in elite sport. Evidence has shown that a previous severe knee injury (including ACL injury) increases the risk of HSI; however, whether the functional deficits that occur after HSI result in an increased risk of ACL injury has yet to be considered. In this clinical commentary, we present evidence suggesting that the link between previous HSI and increased risk of ACL injury requires further investigation, by drawing parallels between deficits in hamstring function after HSI and those seen in women athletes, who are more prone to ACL injury than men athletes. Comparisons of neuromuscular hamstring function between men and women have shown that women display lower hamstring-to-quadriceps strength ratios during isokinetic knee flexion and extension, increased activation of the quadriceps relative to the hamstrings during a stop-jump landing task, a greater time to reach maximal isokinetic hamstring torque, and lower integrated myoelectrical hamstring activity during a sidestep cutting maneuver. Somewhat similarly, in athletes with a history of HSI, the previously injured limb, compared with the uninjured limb, displays lower eccentric knee flexor strength, a lower hamstring-to-quadriceps strength ratio, lower voluntary myoelectrical activity during maximal eccentric knee flexor contraction, a lower eccentric knee flexor rate of torque development, and lower voluntary myoelectrical activity during the initial portion of eccentric contraction. Given that the medial and lateral hamstrings have different actions at the knee joint in the coronal plane, which hamstring head was previously injured might also be expected to influence the likelihood of future ACL injury. Whether the deficits in function after HSI observed in laboratory-based studies translate to deficits in hamstring function during tasks that typically cause ACL injury has yet to be determined, but should be a consideration for future work.
Abstract:
The estimation of the critical gap has been an issue since the 1970s, when gap acceptance was introduced to evaluate the capacity of unsignalized intersections. The critical gap is the shortest gap that a driver is assumed to accept. A driver’s critical gap cannot be measured directly, and a number of techniques have been developed to estimate the mean critical gap of a sample of drivers. This paper reviews the ability of the Maximum Likelihood technique and the Probability Equilibrium Method (PEM) to predict the mean and standard deviation of the critical gap, using a simulation of 100 drivers repeated 100 times for each flow condition. The Maximum Likelihood method gave consistent and unbiased estimates of the mean critical gap, whereas the Probability Equilibrium Method had a significant bias that depended on the flow in the priority stream. Both methods were reasonably consistent, although the Maximum Likelihood method was slightly better. If drivers are inconsistent, the Maximum Likelihood method is again superior. A criticism levelled at the Maximum Likelihood method is that a distribution of the critical gap has to be assumed; it was shown that this assumption does not significantly affect its ability to predict the mean and standard deviation of the critical gaps. Finally, the Maximum Likelihood method can produce reasonable estimates from observations of 25 to 30 drivers. A spreadsheet procedure for using the Maximum Likelihood method is provided in this paper. The PEM can be improved if the maximum rejected gap is used.
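To make the estimation procedure concrete, the following is a minimal sketch of a Maximum Likelihood critical-gap estimator of the kind described above, assuming a log-normal critical-gap distribution (the distributional assumption the abstract refers to). Each driver contributes the probability that their critical gap lies between their maximum rejected gap and their accepted gap; the data here are synthetic and the variable names are illustrative, not taken from the paper.

```python
# A minimal sketch of Maximum Likelihood critical-gap estimation, assuming a
# log-normal critical-gap distribution. For each driver we observe the maximum
# rejected gap r_i and the accepted gap a_i; the unobservable critical gap is
# assumed to lie between them. Synthetic data, illustrative names.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm

def neg_log_likelihood(params, rejected, accepted):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)          # keep sigma positive
    cdf = lambda t: lognorm.cdf(t, s=sigma, scale=np.exp(mu))
    # Each driver contributes P(rejected < critical gap <= accepted).
    probs = np.clip(cdf(accepted) - cdf(rejected), 1e-12, None)
    return -np.sum(np.log(probs))

# Synthetic sample of 30 drivers (roughly the paper's suggested minimum).
rng = np.random.default_rng(0)
true_gap = rng.lognormal(mean=1.4, sigma=0.25, size=30)   # critical gaps ~4 s
rejected = true_gap * rng.uniform(0.6, 1.0, size=30)      # max rejected gap
accepted = true_gap * rng.uniform(1.0, 1.5, size=30)      # accepted gap

res = minimize(neg_log_likelihood, x0=[1.0, -1.0],
               args=(rejected, accepted), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
mean_gap = np.exp(mu_hat + sigma_hat**2 / 2)              # log-normal mean
sd_gap = mean_gap * np.sqrt(np.exp(sigma_hat**2) - 1)     # log-normal s.d.
print(f"estimated mean critical gap: {mean_gap:.2f} s, s.d.: {sd_gap:.2f} s")
```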
Abstract:
This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing the risks that may arise during process execution. Risk reduction involves decreasing the likelihood of a process fault occurring and its severity. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process, and whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest the action that minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements such as task frequencies. When applied in the context of multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to the tasks to be performed, in order to deal with the interplay between the risks of different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system, and its effectiveness has been evaluated using a real-life scenario, in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that concurrently executed process instances complete with significantly fewer faults and with lower fault severities when the recommendations provided by our system are taken into account.
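As an illustration of the cross-instance assignment step, the sketch below solves a simplified version of the resource-to-task allocation problem. The paper formulates it as an integer linear program; for the special case of a one-to-one assignment it reduces to a linear assignment problem, solvable with the Hungarian algorithm as shown here. The risk matrix is hypothetical, standing in for the decision-tree risk predictions.

```python
# A minimal sketch of risk-minimizing resource-to-task assignment. Entry [i, j]
# of the risk matrix would come from the decision-tree risk predictor; here the
# values are made up for illustration.
import numpy as np
from scipy.optimize import linear_sum_assignment

# predicted_risk[i, j]: predicted process risk if resource i performs task j
# (e.g. fault likelihood * severity), for 3 resources and 3 pending tasks.
predicted_risk = np.array([
    [0.10, 0.40, 0.35],
    [0.25, 0.15, 0.50],
    [0.30, 0.45, 0.05],
])

resources, tasks = linear_sum_assignment(predicted_risk)  # minimizes total risk
for r, t in zip(resources, tasks):
    print(f"assign resource {r} to task {t} "
          f"(predicted risk {predicted_risk[r, t]:.2f})")
print("total predicted risk:", predicted_risk[resources, tasks].sum())
```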
Abstract:
Local spatio-temporal features with a bag-of-visual-words model are a popular approach in human action recognition. Bag-of-features methods face several challenges, such as extracting appropriate appearance and motion features from videos, converting the extracted features into a form suitable for classification, and designing a suitable classification framework. In this paper we address the problem of efficiently representing the extracted features for classification so as to improve overall performance. We introduce two generative supervised topic models, maximum entropy discrimination LDA (MedLDA) and class-specific simplex LDA (css-LDA), to encode the raw features for discriminative SVM-based classification. Unsupervised LDA models disconnect topic discovery from the classification task and hence yield poor results compared with the baseline bag-of-words framework. Supervised LDA techniques, on the other hand, learn the topic structure by considering the class labels and improve recognition accuracy significantly. MedLDA jointly maximizes the likelihood and the within-class margins using max-margin techniques, yielding a sparse, highly discriminative topic structure, while css-LDA learns separate class-specific topics instead of a common set of topics across the entire dataset. In our representation, topics are first learned and each video is then represented as a topic proportion vector, i.e. comparable to a histogram of topics. Finally, SVM classification is performed on the learned topic proportion vectors. We demonstrate the efficiency of these two representation techniques through experiments on two popular datasets. Experimental results show significantly improved performance compared with the baseline bag-of-features framework, which uses k-means to construct a histogram of words from the feature vectors.
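The representation pipeline can be sketched as follows: bag-of-visual-words count vectors are mapped to topic-proportion vectors, on which an SVM is trained. Since MedLDA and css-LDA are not available in scikit-learn, the unsupervised LatentDirichletAllocation below stands in for the topic-model stage (corresponding to the unsupervised baseline the paper improves upon); the data are synthetic.

```python
# A minimal sketch of the topics-then-SVM pipeline: each video, given as a
# bag-of-visual-words count vector, is mapped to a topic-proportion vector and
# classified with an SVM. Unsupervised LDA stands in for MedLDA/css-LDA.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_videos, vocab_size, n_topics = 200, 500, 20
# Synthetic bag-of-visual-words histograms (counts of quantized local features).
X_counts = rng.poisson(2.0, size=(n_videos, vocab_size))
y = rng.integers(0, 5, size=n_videos)          # 5 action classes (illustrative)

X_tr, X_te, y_tr, y_te = train_test_split(X_counts, y, random_state=0)

lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
theta_tr = lda.fit_transform(X_tr)             # per-video topic proportions
theta_te = lda.transform(X_te)

svm = SVC(kernel="rbf").fit(theta_tr, y_tr)    # classify topic-proportion vectors
print("accuracy on held-out videos:", svm.score(theta_te, y_te))
```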
Abstract:
Alignment-free methods, in which shared properties of sub-sequences (e.g. identity or match length) are extracted and used to compute a distance matrix, have recently been explored for phylogenetic inference. However, the scalability and robustness of these methods to key evolutionary processes remain to be investigated. Here, using simulated sequence sets of various sizes in both nucleotides and amino acids, we systematically assess the accuracy of phylogenetic inference using an alignment-free approach based on D2 statistics under different evolutionary scenarios. We find that, compared with a multiple sequence alignment approach, D2 methods are more robust against among-site rate heterogeneity, compositional biases, genetic rearrangements and insertions/deletions, but are more sensitive to recent sequence divergence and sequence truncation. Across diverse empirical datasets, the alignment-free methods perform well for sequences sharing low divergence, at greater computational speed. Our findings provide strong evidence for the scalability and the potential use of alignment-free methods in large-scale phylogenomics.
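For readers unfamiliar with D2 statistics, the sketch below shows the basic building block of such an alignment-free comparison: the D2 statistic is the inner product of two sequences' k-mer count vectors, which can be normalized and converted into a pairwise distance. The normalization shown is one simple choice; the paper evaluates several statistics in the D2 family.

```python
# A minimal sketch of a D2-based alignment-free distance between two sequences.
# D2 is the inner product of k-mer count vectors; a cosine-style normalization
# maps it to [0, 1], and 1 - similarity gives a distance for matrix building.
from collections import Counter
from math import sqrt

def kmer_counts(seq: str, k: int) -> Counter:
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2_distance(seq_a: str, seq_b: str, k: int = 6) -> float:
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    d2 = sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())   # the D2 statistic
    norm = (sqrt(sum(v * v for v in ca.values()))
            * sqrt(sum(v * v for v in cb.values())))
    return 1.0 - d2 / norm if norm else 1.0

print(d2_distance("ACGTACGTGGCATT", "ACGTACGTGGCAAT", k=4))
```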
Abstract:
An estimated A$75,000 is lost by Australians every day to online fraud, according to the Australian Competition and Consumer Commission (ACCC). Given that this figure is based on reported crime, the real figure is likely to be much higher: it is well known that fraud, particularly online fraud, has a very low reporting rate, and the figure doesn’t begin to encompass the non-financial costs to victims. There are many challenges to policing this type of crime. Victims who send money to overseas jurisdictions make it even harder, as does the likelihood of offenders creating false identities or simply stealing legitimate ones. Despite these challenges, police have started to do something to prevent the impact and losses of online fraud. By accessing financial intelligence, police are able to identify individuals who are sending money to countries known to be high-risk for fraud. They then notify these people of their suspicion that they may be involved in fraud. In many cases the people don’t even know they may be victims of, or involved in, online fraud.
Abstract:
A sizeable (and growing) proportion of the public in Western democracies deny the existence of anthropogenic climate change. It is commonly assumed that convincing deniers that climate change is real is necessary for them to act pro-environmentally. However, the likelihood of ‘conversion’ using scientific evidence is limited because these attitudes increasingly reflect ideological positions. An alternative approach is to identify outcomes of mitigation efforts that deniers find important. People have strong interests in the welfare of their society, so deniers may act in ways supporting mitigation efforts where they believe these efforts will have positive societal effects. In Study 1, climate change deniers (N = 155) intended to act more pro-environmentally where they thought climate change action would create a society where people are more considerate and caring, and where there is greater economic/technological development. Study 2 (N = 347) replicated this experimentally, showing that framing climate change action as increasing consideration for others, or improving economic/technological development, led to greater pro-environmental action intentions than a frame emphasizing avoiding the risks of climate change. To motivate deniers’ pro-environmental actions, communication should focus on how mitigation efforts can promote a better society, rather than focusing on the reality of climate change and averting its risks.
Abstract:
Background The concept of spirituality appears to be gaining increasing attention for its potential relationship to mental health, despite an absence of consensus on what spirituality is or whether it can be distinguished from religion (or religiousness) in operational terms. Spirituality is a term that is embraced within secular and non-secular contexts alike. As a consequence, spirituality as a concept encompasses forms of religiosity that are embedded in traditional religion as well as those that have little or no connection to traditional religious teachings. The emergence of religious/spiritual beliefs that depart from traditional religious thought represents one key feature of widespread religious change in contemporary societies. Non-traditional religious/spiritual beliefs need to be viewed within this context and thus be differentiated from traditional religious/spiritual beliefs when investigating connections between religion, spirituality, and mental health. Aims The current study seeks to compare the mental health of those whose beliefs are rooted in religious tradition with that of those whose beliefs deviate from traditional religious thought. The two main objectives of this study are: (1) to determine the extent to which religious background predicts endorsement of traditional and non-traditional religious/spiritual beliefs and church attendance in young adulthood; and (2) to determine whether differential relationships exist between current religiosity, religious background, and mental health in young adulthood, and whether any observed differences are attributable to other characteristics of respondents, such as sociodemographic factors and health-risk behaviours. Methods Data were derived from the Mater-University of Queensland Study of Pregnancy, a longitudinal, prospective study of maternal and child health from the prenatal period to 21 years post-delivery. Religiosity was assessed among the study children in young adulthood from three items measured at the time of the 21-year follow-up. Religious background was assessed from information provided by the study mothers in earlier phases of the study. Young adult responses to items included in the Young Adult Self Report (Achenbach, 1997) were used to assess cases of anxiety/depression and externalising behaviour, and delusional ideation was assessed from their responses to the 21-item Peters et al. Delusions Inventory (PDI) (Peters & Garety, 1996). Results Belief in a spiritual or higher power other than God was found to be positively related to anxiety/depression, disturbed ideation, suspiciousness and paranormal ideation, high total PDI scores, and antisocial behaviour in young adulthood, regardless of gender. These associations persisted after adjustment for potential confounders. By contrast, young adults who maintain a traditional belief in God appear to be no different, with regard to anxiety/depression, from those who reject this belief. Belief in God was found to have no association with antisocial behaviour for males, but was observed to have a weak negative relationship with antisocial behaviour for females; this association failed to reach statistical significance, however, after adjustment for other religious/spiritual and social characteristics. No associations were found between young adult belief in God and disturbed, suspicious or paranormal ideation, although a positive relationship was identified for high total PDI scores.
Weekly church attendance was observed to reduce the likelihood of antisocial behaviour in young adulthood among males, but not females. Religious ideation was found to be more prevalent among young adults who attend church on either a weekly or an infrequent basis. No long-term effects on anxiety/depression or antisocial behaviour were evident from maternal belief in God, church attendance or religious affiliation in the young adults’ early lives. However, maternal church attendance predicted religious ideation in young adulthood. Offspring of mothers affiliated with a Pentecostal church in the prenatal period appear to have a high rate of religious ideation and high total PDI scores. Paranormal ideation in young adulthood appears to have no association with maternal religiosity in a young adult’s early life. Conclusion The findings from this study suggest that young adults who endorse non-traditional religious/spiritual beliefs are at greater risk of poorer mental health and aberrant social behaviour than those who reject these beliefs. These results suggest that a non-traditional religious/spiritual belief system involves more than mere rejection of traditional religious doctrine. This system of belief may be a marker for those who question the legitimacy of established societal norms and values, and whose thoughts, attitudes and actions reflect this position. This possibility has implications for mental health and wellbeing at both an individual and a societal level and warrants further research attention.
Abstract:
Purpose To investigate the frequency of convergence and accommodation anomalies in an optometric clinical setting in Mashhad, Iran, and to determine the tests with the highest accuracy in diagnosing these anomalies. Methods Of the 261 patients who presented to the optometric clinics of Mashhad University of Medical Sciences during a one-month period, 83 were included in the study based on the inclusion criteria. Near point of convergence (NPC), near and distance heterophoria, monocular and binocular accommodative facility (MAF and BAF, respectively), lag of accommodation, positive and negative fusional vergences (PFV and NFV, respectively), AC/A ratio, relative accommodation, and amplitude of accommodation (AA) were measured to diagnose convergence and accommodation anomalies. The results were also compared between symptomatic and asymptomatic patients. The accuracy of these tests was explored using sensitivity (S), specificity (Sp), and positive and negative likelihood ratios (LR+, LR−). Results The mean age of the patients was 21.3 ± 3.5 years, and 14.5% of them had specific binocular and accommodative symptoms. Convergence and accommodative anomalies were found in 19.3% of the patients; accommodative excess (4.8%) and convergence insufficiency (3.6%) were the most common accommodative and convergence disorders, respectively. Symptomatic patients showed lower values for BAF (p = .003), MAF (p = .001), and AA (p = .001) compared with asymptomatic patients. Moreover, BAF (S = 75%, Sp = 62%) and MAF (S = 62%, Sp = 89%) were the most accurate tests for detecting accommodative and convergence disorders in terms of both sensitivity and specificity. Conclusions Convergence and accommodative anomalies are the most common binocular disorders in optometric patients. Including monocular and binocular accommodative facility tests in routine eye examinations as accurate tests to diagnose these anomalies requires further investigation.
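As a brief aside on the accuracy measures reported above, the likelihood ratios follow directly from sensitivity and specificity (LR+ = S / (1 − Sp); LR− = (1 − S) / Sp). A minimal sketch, using the BAF and MAF figures from the abstract:

```python
# Positive and negative likelihood ratios from sensitivity and specificity,
# computed for the BAF and MAF values reported in the abstract.
def likelihood_ratios(sensitivity: float, specificity: float) -> tuple[float, float]:
    lr_pos = sensitivity / (1 - specificity)   # LR+ = S / (1 - Sp)
    lr_neg = (1 - sensitivity) / specificity   # LR- = (1 - S) / Sp
    return lr_pos, lr_neg

for name, s, sp in [("BAF", 0.75, 0.62), ("MAF", 0.62, 0.89)]:
    lr_pos, lr_neg = likelihood_ratios(s, sp)
    print(f"{name}: LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}")
```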
Abstract:
Background: Nutrition intervention in malnourished patients should be guided by accurate evaluation and detection of small changes in the patient’s nutrition status over time. However, the conventional Subjective Global Assessment (SGA) is not able to detect changes over a short period of time. The aim of this study was to determine whether the 7-point SGA is more time-sensitive to nutrition changes than the conventional SGA. Methods: In this prospective study, 67 adult inpatients assessed as malnourished using both the 7-point SGA and the conventional SGA were recruited. Each patient received nutrition intervention and was followed up post-discharge. Patients were reassessed using both tools at 1, 3 and 5 months from the baseline assessment. Results: It took significantly less time to see a one-point change using the 7-point SGA compared with the conventional SGA (median: 1 month vs. 3 months, p = 0.002). The likelihood of at least a one-point change was 6.74 times greater with the 7-point SGA than with the conventional SGA after controlling for age, gender and medical specialty (odds ratio = 6.74, 95% CI 2.88-15.80, p < 0.001). Fifty-six percent of patients who had no change in conventional SGA score had changes detected using the 7-point SGA. The level of agreement was 100% (k = 1, p < 0.001) between the 7-point SGA and the 3-point SGA, and 83% (k = 0.726, p < 0.001) between two blinded assessors for the 7-point SGA. Conclusion: The 7-point SGA is more time-sensitive in its response to nutrition changes than the conventional SGA and can be used to guide nutrition intervention for patients.
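For reference, the agreement figures above are Cohen's kappa, which corrects observed percent agreement for agreement expected by chance. A minimal sketch, with illustrative ratings standing in for the two blinded assessors' 7-point SGA scores:

```python
# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement).
# The two rating lists are made up for illustration.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    n = len(ratings_a)
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(freq_a) | set(freq_b)
    # Chance agreement: product of each rater's marginal category proportions.
    p_chance = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_observed - p_chance) / (1 - p_chance)

assessor_1 = [5, 4, 4, 6, 3, 5, 5, 4, 6, 3, 4, 5]
assessor_2 = [5, 4, 3, 6, 3, 5, 4, 4, 6, 3, 4, 5]
print(f"kappa = {cohens_kappa(assessor_1, assessor_2):.3f}")
```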
Abstract:
Introduction Patients with dysphagia (PWDs) have been shown to be four times more likely to suffer medication administration errors (MAEs) [1, 2]. Individualised medication administration guides (I-MAGs), which outline how each formulation should be administered, have been developed to standardise medication administration by nurses on the ward and reduce the likelihood of errors. This pilot study aimed to determine recruitment rates, estimate the effect on errors, and develop the intervention in order to design a future full-scale randomised controlled trial to determine the costs and effects of I-MAG implementation. NHS ethical approval for the study was granted by the local ethics committee. Method Software was developed to enable I-MAG production (based on current best practice) [3, 4] for all PWDs admitted to two care-of-the-older-person wards during a six-month period from January to July 2011. I-MAGs were attached to the medication administration record charts to be used by nurses when administering medicines. Training was provided for all staff on the intervention wards. Two care-of-the-older-person wards in the same hospital served as controls. All patients with dysphagia were recruited for follow-up purposes at discharge. Four ward rounds on each intervention and control ward were observed before and after I-MAG implementation to determine the level of medication administration errors. Results 164 I-MAGs were provided for 75 PWDs in the two intervention wards. At discharge, 23 patients in the intervention wards and 7 patients in the control wards were approached for recruitment, of whom 17 (74%) and 5 (71.5%), respectively, consented. Discussion Recruitment rates at discharge were low because dysphagia often remitted during hospitalisation. The introduction of the I-MAG demonstrated no effect on the quality of administration on the intervention wards and, interestingly, practice improved on the control wards. Observing medication rounds at least one month after I-MAG removal may have captured a reversal to normal practice; ideally, observations should have been undertaken with the I-MAGs in place. Identification of the reason for the improvement on the control wards is warranted.
Abstract:
The Australian water sector needs to adapt in order to deal effectively with the impacts of climate change on its systems. Challenges arising from climate change include increasingly extreme weather events, including flooding and droughts (Pittock, 2011). In response to such challenges, the National Water Commission in Australia has identified the need for the water sector to transition towards being readily adaptable and able to respond to complex needs under a variety of supply and demand scenarios (National Water Commission, 2013). To make this transition successfully, the sector will need to move away from business as usual and proactively pursue and adopt innovative approaches and technologies as a means of addressing the impacts of climate change. In order to respond effectively to specific innovation challenges related to the sector, including climate change, it is first necessary to possess a foundational understanding of the key elements of innovation in the sector. This paper presents this base-level understanding, identifying the key barriers, drivers and enablers for innovative practice in the water sector. After initially reviewing the literature on the challenges the sector faces from climate change, the paper examines the findings from the first two rounds of a modified Delphi study conducted with experts from the Australian water sector, including participants from research, government and industry backgrounds. The key barriers, drivers and enablers for innovation in the sector identified during the initial phase of the study formed the basis for the remainder of the investigation. The key elements investigated were: barriers – scepticism, regulation systems, inconsistent policy; drivers – influence of policy, resource scarcity, thought leadership; enablers – framing the problem, effective regulations, community acceptance. There is a convincing argument for the water sector transitioning to a more flexible, adaptive and responsive system in the face of challenges resulting from climate change. However, without first understanding the challenges and opportunities around making this transition, the likelihood of success is limited. For that reason, this paper takes the first step in understanding the elements surrounding innovation in the Australian water sector.
Abstract:
This thesis provides new knowledge on an understudied group of grasses, some of which are resurrection grasses (i.e. able to withstand extreme drought). The sole Australian species (Tripogon loliiformis) is morphologically diverse and could be more than one species. This study sought to determine how many species of Tripogon occur in Australia, their relationships to other species in the genus and to two other genera of resurrection grasses (Eragrostiella and Oropetium). Results of the research indicate there is not enough evidence, from DNA sequence data, to warrant splitting up T. loliiformis into multiple species. The extensive morphological diversity seems to be influenced by environmental conditions. The three genera are so closely related that they could be grouped into a single genus. This new knowledge opens up pathways for future investigations, including studying genes responsible for desiccation tolerance and the conservation of native grasses that occur in rocky habitats.
Abstract:
Background Currently, care providers and policy-makers internationally are working to promote normal birth. In Australia, such initiatives are being implemented without any evidence of the prevalence or determinants of normal birth as a multidimensional construct. This study aimed to better understand the determinants of normal birth (defined as without induction of labour, epidural/spinal/general anaesthesia, forceps/vacuum, caesarean birth, or episiotomy) using secondary analyses of data from a population survey of women in Queensland, Australia. Methods Women who birthed in Queensland during a two-week period in 2009 were mailed a survey approximately three months after birth. Women (n=772) provided retrospective data on their pregnancy, labour and birth preferences and experiences, socio-demographic characteristics, and reproductive history. A series of logistic regressions were conducted to determine factors associated with having labour, having a vaginal birth, and having a normal birth. Findings Overall, 81.9% of women had labour, 66.4% had a vaginal birth, and 29.6% had a normal birth. After adjusting for other significant factors, women had significantly higher odds of having labour if they birthed in a public hospital and had a pre-existing preference for a vaginal birth. Of women who had labour, 80.8% had a vaginal birth. Women who had labour had significantly higher odds of having a vaginal birth if they attended antenatal classes, did not have continuous fetal monitoring, felt able to ‘take their time’ in labour, and had a pre-existing preference for a vaginal birth. Of women who had a vaginal birth, 44.7% had a normal birth. Women who had a vaginal birth had significantly higher odds of having a normal birth if they birthed in a public hospital, birthed outside regular business hours, had mobility in labour, did not have continuous fetal monitoring, and were non-supine during birth. Conclusions These findings provide a strong foundation on which to base resources aimed at increasing informed decision-making for maternity care consumers, providers, and policy-makers alike. Research to evaluate the impact of modifying key clinical practices (e.g., supporting women's mobility during labour, facilitating non-supine positioning during birth) on the likelihood of a normal birth is an important next step.
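A minimal sketch of the kind of analysis described, assuming a standard logistic regression reported as adjusted odds ratios; the predictors and data below are synthetic stand-ins for the survey variables, not the study's actual data:

```python
# Logistic regression of a binary outcome (normal birth) on candidate
# determinants, with coefficients exponentiated to adjusted odds ratios.
# Synthetic data; predictor names are illustrative stand-ins.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 772  # survey sample size reported in the abstract
X = np.column_stack([
    rng.integers(0, 2, n),   # public hospital (1 = yes)
    rng.integers(0, 2, n),   # mobility in labour
    rng.integers(0, 2, n),   # continuous fetal monitoring
])
logit_p = -1.0 + 0.8 * X[:, 0] + 0.6 * X[:, 1] - 0.7 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
odds_ratios = np.exp(model.params[1:])   # adjusted odds ratio per predictor
print(dict(zip(["public_hospital", "mobility", "cont_monitoring"],
               odds_ratios.round(2))))
```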
Abstract:
In this paper we present a new method for performing Bayesian parameter inference and model choice for low-count time series models with intractable likelihoods. The method involves incorporating an alive particle filter within a sequential Monte Carlo (SMC) algorithm to create a novel pseudo-marginal algorithm, which we refer to as alive SMC^2. The advantages of this approach over competing approaches are that it is naturally adaptive, it does not involve the between-model proposals required in reversible jump Markov chain Monte Carlo, and it does not rely on potentially rough approximations. The algorithm is demonstrated on Markov process and integer autoregressive moving average models applied to real biological datasets of hospital-acquired pathogen incidence, animal health time series and the cumulative number of prion disease cases in mule deer.
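A minimal sketch of the alive particle filter at the core of this approach, for a toy low-count Markov chain that is easy to simulate forward but whose likelihood we treat as intractable. At each time step, particles are propagated until a fixed number of them exactly match the observed count, and the likelihood increment is estimated by the unbiased ratio (N − 1)/(n_t − 1), where n_t is the total number of propagations required; the model and data are illustrative, not those used in the paper.

```python
# An alive particle filter log-likelihood estimate for a toy immigration-death
# counting process. Exact matching of simulated and observed counts is feasible
# because the counts are low; the model and data are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

def propagate(x, birth_rate=1.0, death_prob=0.3):
    # One step of a toy immigration-death counting process.
    return x + rng.poisson(birth_rate) - rng.binomial(x, death_prob)

def alive_filter_loglik(observations, n_particles=100, max_tries=100_000):
    particles = np.full(n_particles, observations[0])
    loglik = 0.0
    for y in observations[1:]:
        alive, tries = [], 0
        # Keep propagating resampled particles until N exact matches are found.
        while len(alive) < n_particles:
            x_new = propagate(rng.choice(particles))
            tries += 1
            if x_new == y:
                alive.append(x_new)
            if tries >= max_tries:
                return -np.inf          # observation effectively unreachable
        # Unbiased estimate of the matching probability: (N - 1) / (n_t - 1).
        loglik += np.log((n_particles - 1) / (tries - 1))
        particles = np.array(alive)
    return loglik

data = [3, 4, 3, 3, 5, 4, 3]            # illustrative low-count series
print("estimated log-likelihood:", alive_filter_loglik(data))
```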