904 results for risk theory


Relevance:

20.00%

Publisher:

Abstract:

Over recent years there has been an increase in the literature examining youth with Autism Spectrum Disorders (ASD). The growth in this area of research has highlighted a significant gap in our understanding of suitable interventions for people with ASD and of the treatment of co-occurring psychiatric disorders [1-3]. Children with ASD are at increased risk of experiencing depressive symptoms and of developing depression; however, with very few proven interventions available for preventing and treating depression in children with ASD, further research in this area is needed.

Relevance:

20.00%

Publisher:

Abstract:

The focus of governments on increasing active travel has motivated renewed interest in cycling safety. Bicyclists are up to 20 times more likely than drivers to be involved in serious injury crashes, so understanding the relationships among factors contributing to bicyclist crash risk is critically important for identifying effective policy tools, informing bicycle infrastructure investments, and identifying high-risk bicycling contexts. This study aims to better understand the complex relationships between bicyclist self-reported injuries resulting from crashes (e.g. hitting a car) and non-crashes (e.g. spraining an ankle) and the perceived risk of cycling as a function of cyclist exposure, rider conspicuity, riding environment, rider risk aversion, and rider ability. Self-reported data from 2,500 Queensland cyclists are used to estimate a series of seemingly unrelated regressions to examine the relationships among factors. The major findings suggest that perceived risk does not appear to influence injury rates, nor do injury rates influence the perceived risk of cycling. Riders who perceive cycling as risky tend not to be commuters, do not engage in group riding, tend to always wear mandatory helmets and front lights, and lower their perception of risk by increasing the number of days per week they ride and by increasing the proportion of riding done on bicycle paths. Riders who always wear helmets have lower crash injury risk. Increasing the number of days per week of riding tends to decrease both crash injury risk and non-crash injury risk (e.g. a sprain). Further work is needed to replicate some of the findings in this study.
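
A seemingly unrelated regression (SUR) system of this kind can be estimated by feasible generalised least squares: fit each equation by OLS, estimate the cross-equation error covariance from the residuals, then re-estimate the stacked system by GLS. The sketch below is a minimal, illustrative implementation in Python, not the authors' code; the inputs (e.g. an injury-count equation and a perceived-risk equation with exposure and conspicuity regressors) are hypothetical.

```python
import numpy as np
from scipy.linalg import block_diag

def sur_fgls(ys, Xs):
    """Feasible GLS estimator for a seemingly unrelated regression system.

    ys: list of (n,) response vectors (e.g. injury count, perceived risk score);
    Xs: list of (n, k_i) design matrices (exposure, conspicuity, environment, ...).
    """
    n = ys[0].shape[0]
    # Stage 1: equation-by-equation OLS to obtain residuals.
    betas = [np.linalg.lstsq(X, y, rcond=None)[0] for X, y in zip(Xs, ys)]
    resid = np.column_stack([y - X @ b for y, X, b in zip(ys, Xs, betas)])
    sigma = resid.T @ resid / n                     # cross-equation error covariance
    # Stage 2: stack the equations and apply GLS with Omega = Sigma kron I_n.
    X_big = block_diag(*Xs)
    y_big = np.concatenate(ys)
    omega_inv = np.kron(np.linalg.inv(sigma), np.eye(n))
    XtOi = X_big.T @ omega_inv
    beta_fgls = np.linalg.solve(XtOi @ X_big, XtOi @ y_big)
    return beta_fgls, sigma
```

The efficiency gain of SUR over separate OLS fits comes entirely from the cross-equation residual correlation captured in sigma; a packaged SUR estimator would additionally supply standard errors for the coefficients.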

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES: Ecological studies have suggested an inverse relationship between latitude and risks of some cancers. However, associations between solar ultraviolet radiation (UVR) exposure and esophageal cancer risk have not been fully explored. We therefore investigated the association between nevi, freckles, and measures of ambient UVR over the life-course with risks of esophageal cancers. METHODS: We compared estimated lifetime residential ambient UVR among Australian patients with esophageal cancer (330 esophageal adenocarcinoma (EAC), 386 esophago-gastric junction adenocarcinoma (EGJAC), and 279 esophageal squamous cell carcinoma (ESCC)), and 1471 population controls. We asked participants where they had lived at different periods of their lives, and assigned ambient UVR to each location based on measurements from NASA's Total Ozone Mapping Spectrometer database. Freckling and nevus burden were self-reported. We used multivariable logistic regression models to estimate the magnitude of associations between phenotype, ambient UVR, and esophageal cancer risk. RESULTS: Compared with population controls, patients with EAC and EGJAC were less likely to have high levels of estimated cumulative lifetime ambient UVR (EAC odds ratio (OR) 0.59, 95% confidence interval (CI) 0.35-0.99, EGJAC OR 0.55, 0.34-0.90). We found no association between UVR and risk of ESCC (OR 0.91, 0.51-1.64). The associations were independent of age, sex, body mass index, education, state of recruitment, frequency of reflux, smoking status, alcohol consumption, and H. pylori serostatus. Cases with EAC were also significantly less likely to report high levels of nevi than controls. CONCLUSIONS: These data show an inverse association between ambient solar UVR at residential locations and risk of EAC and EGJAC, but not ESCC.
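
For readers unfamiliar with the analysis, odds ratios and confidence intervals of this kind come from exponentiating the coefficients of a multivariable logistic regression fitted to the case-control data. The sketch below shows the general pattern in Python with statsmodels; the synthetic data, column names, and effect sizes are hypothetical stand-ins, not the study's variables or results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for a case-control data set: 'case' is 1 for patients and 0 for
# population controls; 'uvr_high' flags the top tertile of estimated cumulative
# lifetime ambient UVR; the remaining columns play the role of confounders.
rng = np.random.default_rng(0)
n = 1800
df = pd.DataFrame({"uvr_high": rng.integers(0, 2, n),
                   "age": rng.normal(62, 9, n),
                   "sex": rng.integers(0, 2, n),
                   "bmi": rng.normal(27, 4, n),
                   "smoker": rng.integers(0, 2, n)})
logit_p = -1.0 - 0.5 * df["uvr_high"] + 0.02 * (df["age"] - 62)
df["case"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(df[["uvr_high", "age", "sex", "bmi", "smoker"]].astype(float))
fit = sm.Logit(df["case"], X).fit(disp=0)
ci = fit.conf_int()
or_table = pd.DataFrame({"OR": np.exp(fit.params),
                         "CI_low": np.exp(ci[0]),
                         "CI_high": np.exp(ci[1])})
print(or_table)   # an OR below 1 for 'uvr_high' indicates an inverse association
```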

Relevance:

20.00%

Publisher:

Abstract:

Associations between single nucleotide polymorphisms (SNPs) at 5p15 and multiple cancer types have been reported. We have previously shown evidence for a strong association between prostate cancer (PrCa) risk and rs2242652 at 5p15, an intronic variant of the gene encoding telomerase reverse transcriptase (TERT). To comprehensively evaluate the association between genetic variation across this region and PrCa, we performed a fine-mapping analysis by genotyping 134 SNPs using a custom Illumina iSelect array or Sequenom MassArray iPlex, followed by imputation of 1,094 SNPs in 22,301 PrCa cases and 22,320 controls in the PRACTICAL consortium. Multiple stepwise logistic regression analysis identified four signals in the promoter or intronic regions of TERT that were independently associated with PrCa risk. Gene expression analysis of normal prostate tissue showed evidence that SNPs within one of these regions were also associated with TERT expression, providing a potential mechanism for predisposition to disease.
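
The stepwise logistic regression used to find independent signals can be sketched as greedy forward selection: repeatedly add the SNP with the smallest p-value conditional on the SNPs already selected, stopping when no remaining SNP passes the entry threshold. The Python sketch below illustrates that logic with statsmodels; the threshold, data structures, and dosage encoding are hypothetical, and this is not the consortium's analysis code.

```python
import numpy as np
import statsmodels.api as sm

def forward_select_snps(y, geno, p_enter=1e-4):
    """Greedy forward selection of independently associated SNPs.

    y: 0/1 case-control status, shape (n,);
    geno: dict mapping SNP name -> allele dosage vector, shape (n,).
    """
    selected, remaining = [], set(geno)
    while remaining:
        best_snp, best_p = None, 1.0
        for snp in remaining:
            X = sm.add_constant(np.column_stack([geno[s] for s in selected + [snp]]))
            res = sm.Logit(y, X).fit(disp=0)
            p = res.pvalues[-1]               # p-value of the candidate SNP
            if p < best_p:
                best_snp, best_p = snp, p
        if best_p >= p_enter:                 # no remaining SNP adds an independent signal
            break
        selected.append(best_snp)
        remaining.remove(best_snp)
    return selected
```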

Relevance:

20.00%

Publisher:

Abstract:

The advanced programmatic risk analysis and management model (APRAM) is one of the recently developed methods that can be used for risk analysis and management purposes, considering schedule, cost, and quality risks simultaneously. However, this model considers only those failure risks that occur over the design and construction phases of a project's life cycle. While this can be sufficient for projects in which the cost required during the operating life is much less than the budget required over the construction period, the model should be modified for infrastructure projects, where the costs incurred over the operating life cycle are significant. In this paper, a modified APRAM is proposed that can consider potential risks occurring over the entire life cycle of the project, including technical and managerial failure risks. The modified model can therefore be used as an efficient decision-support tool for construction managers in the housing industry, where various alternatives may be technically available. The modified method is demonstrated on a real building project, and this demonstration shows that it can be employed efficiently by construction managers. The Delphi method was applied to identify the failure events and their associated probabilities. The results show that although the initial cost of a cold-formed steel structural system is higher than that of a conventional construction system, the former's failure cost is much lower than the latter's.
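
The core comparison behind that conclusion is an expected-cost calculation: each alternative's life-cycle cost is its initial cost plus the probability-weighted cost of the failure events elicited through the Delphi process. The event lists and figures below are hypothetical, purely to illustrate the arithmetic; they are not the paper's data, and discounting of future costs is omitted for brevity.

```python
# Hypothetical Delphi-elicited failure events for two structural alternatives.
# Each failure event is (annual probability of occurrence, cost if it occurs).
alternatives = {
    "cold_formed_steel": {"initial_cost": 1_200_000,
                          "failures": [(0.002, 400_000), (0.005, 150_000)]},
    "conventional":      {"initial_cost": 1_000_000,
                          "failures": [(0.010, 600_000), (0.015, 250_000)]},
}
operating_years = 50   # assumed operating life of the building

for name, alt in alternatives.items():
    expected_failure = operating_years * sum(p * c for p, c in alt["failures"])
    total = alt["initial_cost"] + expected_failure
    print(f"{name}: initial={alt['initial_cost']:,} "
          f"expected failure={expected_failure:,.0f} total={total:,.0f}")
```

With these illustrative numbers the cold-formed steel option has the higher initial cost but the lower expected failure cost, and so the lower life-cycle total, mirroring the trade-off described in the abstract.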

Relevance:

20.00%

Publisher:

Abstract:

What are the information practices of teen content creators? In the United States, over two thirds of teens have participated in creating and sharing content in online communities developed for the purpose of allowing users to be producers of content. This study investigates how teens participating in digital participatory communities find and use information, as well as how they experience that information. From this investigation emerged a model of their information practices while creating and sharing content, such as film-making, visual artwork, storytelling, music, programming, and website design, in digital participatory communities. The research uses grounded theory methodology within a social constructionist framework to investigate the research problem: what are the information practices of teen content creators? Data were gathered through semi-structured interviews and observation of teens' digital communities. Analysis occurred concurrently with data collection, and the principle of constant comparison was applied throughout. As findings were constructed from the data, additional data were collected until a substantive theory was constructed and no new information emerged from data collection. The theory constructed from the data describes five information practices of teen content creators: learning community, negotiating aesthetic, negotiating control, negotiating capacity, and representing knowledge. Describing the five information practices requires three descriptive components: the community of practice, the experiences of information, and the information actions. The experiences of information include information as participation, inspiration, collaboration, process, and artifact. Information actions include activities in the categories of gathering, thinking, and creating. The experiences of information and the information actions intersect in the information practices, which are situated within a specific community of practice, such as a digital participatory community. Finally, the information practices interact with and build upon one another, and this is represented in a graphic model and accompanying explanation.

Relevance:

20.00%

Publisher:

Abstract:

Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The methodology is grounded in a critique of the dominant contemporary approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from the generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful for investigating complex issues and behaviours not previously addressed, and for investigating concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis processes include open, axial, and selective coding levels. The purpose of this article was to explore the grounded theory research process and provide an initial understanding of this methodology.

Relevance:

20.00%

Publisher:

Abstract:

Power system restoration after a large-area outage involves many factors, and the procedure is usually very complicated. A decision-making support system could therefore be developed to find the optimal black-start strategy. In order to evaluate candidate black-start strategies, some indices, usually both qualitative and quantitative, are employed. However, it may not be possible to directly synthesize these indices, and different degrees of interaction may exist among them. In the existing black-start decision-making methods, qualitative and quantitative indices cannot be well synthesized, and the interactions among different indices are not taken into account. The vague set, an extended version of the well-developed fuzzy set, can be employed to deal with decision-making problems with interacting attributes. Given this background, the vague set is first employed in this work to represent the indices so as to facilitate comparisons among them. Then, the concept of a vague-valued fuzzy measure is presented, and on that basis a mathematical model for black-start decision-making is developed. Compared with the existing methods, the proposed method can deal with the interactions among indices and represent the fuzzy information more reasonably. Finally, an actual power system is used to demonstrate the basic features of the developed model and method.
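
As a quick illustration of the underlying idea (not the paper's actual model), a vague set assigns each element an interval membership [t, 1 - f], where t is the degree of support and f the degree of opposition, with t + f <= 1. Candidate strategies scored on several indices can then be compared through their vague values. The naive averaging in the Python sketch below deliberately ignores the index interactions that the paper's vague-valued fuzzy measure is designed to capture, and all numbers are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class VagueValue:
    t: float   # degree of support (truth membership)
    f: float   # degree of opposition (false membership); requires t + f <= 1

    def interval(self):
        return (self.t, 1.0 - self.f)   # the vague membership interval [t, 1 - f]

    def score(self):
        return self.t - self.f          # a simple score function for ranking

# Hypothetical vague-valued indices (e.g. restoration time, risk, generator
# capability) for two candidate black-start strategies.
strategies = {
    "strategy_A": [VagueValue(0.7, 0.1), VagueValue(0.5, 0.3), VagueValue(0.6, 0.2)],
    "strategy_B": [VagueValue(0.6, 0.2), VagueValue(0.8, 0.1), VagueValue(0.5, 0.4)],
}

for name, indices in strategies.items():
    avg_score = sum(v.score() for v in indices) / len(indices)
    print(name, round(avg_score, 3))
```

A fuzzy-measure-based aggregation (e.g. a Choquet-style integral over the indices) would replace the plain average so that synergies or redundancies between indices change the ranking.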

Relevance:

20.00%

Publisher:

Abstract:

Lankes and Silverstein (2006) introduced the “participatory library” and suggested that the nature and form of the library should be explored. In the last several years, some attempts have been made to develop contemporary library models, often known as Library 2.0. However, little of this research has been based on empirical data, and such models have had a strong focus on technical aspects but less focus on participation. The research presented in this paper fills this gap. A grounded theory approach was adopted for this study, and six librarians were involved in in-depth individual interviews. As a preliminary result, five main factors of the participatory library emerged: technological, human, educational, socio-economic, and environmental. Five factors influencing participation in libraries were also identified: finance, technology, education, awareness, and policy. The study’s findings provide a fresh perspective on the contemporary library and create a basis for further studies in this area.

Relevance:

20.00%

Publisher:

Abstract:

We investigate the claims of superiority of the fundamental indexation strategy over capitalisation-weighted indexation using data for Australian Securities Exchange (ASX) listed stocks. Whilst our results are in line with the outperformance observed in other geographical markets, we find that the excess returns from fundamental indexation in the Australian market are much higher. On a rolling 5-year basis, the fundamental index always outperforms the capitalisation-weighted index. Our results suggest that the superior performance of fundamental indexation could not be entirely attributed to value, size, or momentum effects. The outperformance persists even after adjusting for the slightly higher transaction costs related to turnover.
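
For context, a fundamental index weights each stock by accounting measures of firm size (e.g. book value, sales, cash flow, dividends) rather than by market capitalisation. The sketch below contrasts the two weighting schemes on a toy three-stock universe; the tickers, measures, and figures are hypothetical and do not reflect the paper's ASX data.

```python
import pandas as pd

# Hypothetical three-stock universe with market capitalisation and fundamental measures.
df = pd.DataFrame({
    "stock":      ["AAA", "BBB", "CCC"],
    "market_cap": [50e9, 30e9, 20e9],
    "book_value": [10e9, 12e9, 6e9],
    "sales":      [20e9, 25e9, 10e9],
    "cash_flow":  [4e9, 5e9, 2e9],
    "dividends":  [1.5e9, 2.0e9, 0.5e9],
})

# Capitalisation weight: each stock's share of total market value.
df["cap_weight"] = df["market_cap"] / df["market_cap"].sum()

# Fundamental weight: average of the weights implied by each fundamental measure.
measures = ["book_value", "sales", "cash_flow", "dividends"]
df["fund_weight"] = sum(df[m] / df[m].sum() for m in measures) / len(measures)

print(df[["stock", "cap_weight", "fund_weight"]])
```

Because the fundamental weights tilt away from stocks whose price is high relative to their fundamentals, a natural question, which the abstract addresses, is whether the excess return is simply a repackaged value, size, or momentum effect.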

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a concrete approach for the automatic mitigation of risks that are detected during process enactment. Given a process model exposed to risks, e.g. a financial process exposed to the risk of approval fraud, we enact this process and as soon as the likelihood of the associated risk(s) is no longer tolerable, we generate a set of possible mitigation actions to reduce the risks' likelihood, ideally annulling the risks altogether. A mitigation action is a sequence of controlled changes applied to the running process instance, taking into account a snapshot of the process resources and data, and the current status of the system in which the process is executed. These actions are proposed as recommendations to help process administrators mitigate process-related risks as soon as they arise. The approach has been implemented in the YAWL environment and its performance evaluated. The results show that it is possible to mitigate process-related risks within a few minutes.
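
To make the idea concrete, here is a minimal, self-contained sketch in Python (not the YAWL implementation described in the paper) of the monitor-and-recommend loop: when a risk's estimated likelihood exceeds its tolerance, candidate changes to the running instance are generated, evaluated against a preview of the instance state, and ranked by how far they reduce the likelihood. The fraud rule, threshold, and state variables are hypothetical.

```python
# A running process instance is modelled here as a plain dict of case variables.

def fraud_likelihood(state):
    # Hypothetical risk rule: approval fraud is likely when the same user
    # both requests and approves the payment.
    return 0.9 if state["requester"] == state["approver"] else 0.1

def candidate_actions(state):
    # Each mitigation action is a sequence of controlled changes; here,
    # reassigning the approval task to another available resource.
    return [[("approver", user)] for user in state["available_users"]
            if user != state["requester"]]

def recommend(state, tolerance=0.3):
    if fraud_likelihood(state) <= tolerance:
        return []                              # risk still tolerable, do nothing
    ranked = []
    for action in candidate_actions(state):
        preview = dict(state)
        preview.update(action)                 # apply changes to a snapshot only
        ranked.append((action, fraud_likelihood(preview)))
    return sorted(ranked, key=lambda pair: pair[1])   # lowest residual likelihood first

state = {"requester": "alice", "approver": "alice",
         "available_users": ["alice", "bob", "carol"]}
print(recommend(state))   # recommends reassigning approval to bob or carol
```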

Relevance:

20.00%

Publisher:

Abstract:

Whole-body computer control interfaces present new opportunities to engage children with games for learning. Stomp is a suite of educational games that use such a technology, allowing young children to use their whole body to interact with a digital environment projected on the floor. To maximise the effectiveness of this technology, tenets of self-determination theory (SDT) are applied to the design of Stomp experiences. By meeting user needs for competence, autonomy, and relatedness, our aim is to increase children's engagement with the Stomp learning platform. Analysis of Stomp's design suggests that these tenets are met. Observations from a case study of Stomp being used by young children show that they were highly engaged and motivated by Stomp. This analysis demonstrates that continued application of SDT to Stomp will further enhance user engagement. It is also suggested that SDT, when applied more widely to other whole-body multi-user interfaces, could produce similar positive effects.

Relevance:

20.00%

Publisher:

Abstract:

Recent research has proposed Neo-Piagetian theory as a useful way of describing the cognitive development of novice programmers. Neo-Piagetian theory may also be a useful way to classify materials used in learning and assessment. If Neo-Piagetian coding of learning resources is to be useful, it is important that practitioners can learn it and apply it reliably. We describe the design of an interactive web-based tutorial for Neo-Piagetian categorization of assessment tasks. We also report an evaluation of the tutorial's effectiveness, in which twenty computer science educators participated. The participants' average classification accuracies on the three Neo-Piagetian stages were 85%, 71%, and 78%. Participants also rated their agreement with the expert classifications, indicating high agreement (91%, 83%, and 91% across the three Neo-Piagetian stages). Self-rated confidence in applying Neo-Piagetian theory to classifying programming questions was 29% before the tutorial and 75% after. Our key contribution is demonstrating the feasibility of the Neo-Piagetian approach to classifying assessment materials, by showing that it is learnable and can be applied reliably by a group of educators. Our tutorial is freely available as a community resource.

Relevance:

20.00%

Publisher:

Abstract:

Recent work on the numerical solution of stochastic differential equations (SDEs) has focused on the development of numerical methods with good stability and order properties. These numerical implementations have used fixed stepsizes, but there are many situations in which a fixed stepsize is not appropriate. In the numerical solution of ordinary differential equations, much work has been carried out on developing robust implementation techniques using variable stepsize. It has been necessary, in the deterministic case, to consider the "best" choice for an initial stepsize, as well as to develop effective strategies for stepsize control; the same, of course, must be done in the stochastic case. In this paper, proportional-integral (PI) control is applied to a variable stepsize implementation of an embedded pair of stochastic Runge-Kutta methods used to obtain numerical solutions of nonstiff SDEs. For stiff SDEs, the embedded pair of the balanced Milstein and balanced implicit methods is implemented in variable stepsize mode using a predictive controller for the stepsize change. The extension of these stepsize controllers, from a digital filter theory point of view, via PI with derivative (PID) control is also implemented. The implementations show the improvement in efficiency that can be attained when using these control theory approaches, compared with the regular stepsize change strategy.
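
As a rough illustration of the control theory idea, a PI stepsize controller sets the next stepsize from the current and previous local error estimates of the embedded pair, damping the oscillatory step changes that a plain integral-only rule can produce. The Python sketch below uses illustrative gains and a generic scaling of the error estimate; it is not the tuned controller or embedded pairs used in the paper.

```python
import numpy as np

def pi_stepsize(h, err, err_prev, tol, order,
                k_i=0.3, k_p=0.4, safety=0.9, fac_min=0.2, fac_max=2.0):
    """PI stepsize update for an embedded pair.

    h: current stepsize; err, err_prev: current and previous local error
    estimates (scaled so that err <= tol means the step is acceptable);
    order: order of the lower-order method of the pair. The gains k_i and
    k_p are illustrative defaults, not the paper's tuned values.
    """
    expo = 1.0 / (order + 1)
    ratio = (tol / err) ** (k_i * expo) * (err_prev / err) ** (k_p * expo)
    return h * float(np.clip(safety * ratio, fac_min, fac_max))

# Typical use inside the integration loop: if err <= tol the step is accepted and
# the solver advances with the new stepsize; otherwise the step is rejected and
# retried with the (smaller) stepsize returned by the controller.
```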