994 results for pedagogic model


Relevance:

20.00%

Publisher:

Abstract:

Since the identification of the gene family of kallikrein-related peptidases (KLKs), their function has been robustly studied at the biochemical level. In vitro biochemical studies have shown that KLK proteases are involved in a number of extracellular processes that initiate intracellular signaling pathways by hydrolysis, as reviewed in Chapters 8, 9, and 15, Volume 1. These events have been associated with more invasive phenotypes of ovarian, prostate, and other cancers. Concomitantly, aberrant expression of KLKs has been associated with poor prognosis of patients with ovarian and prostate cancer (Borgoño and Diamandis, 2004; Clements et al., 2004; Yousef and Diamandis, 2009), with prostate-specific antigen (PSA, KLK3) being a long-standing, clinically employed biomarker for prostate cancer (Lilja et al., 2008). Data generated from patient samples in clinical studies, along with biochemical activity, suggest that KLKs function in the development and progression of these diseases. To bridge the gap between their function at the molecular level and the clinical need for efficacious treatments and prognostic biomarkers, functional assessment at the in vitro cellular level, using various culture models, is increasing, particularly in a three-dimensional (3D) context (Abbott, 2003; Bissell and Radisky, 2001; Pampaloni et al., 2007; Yamada and Cukierman, 2007).

Relevance:

20.00%

Publisher:

Abstract:

Capacity probability models of generating units are commonly used in many power system reliability studies at hierarchical level one (HLI). Analytical modelling of a generating system with many units, or of generating units with many derated states, can result in an extensive number of states in the capacity model. Limitations on the available memory and computational time of present computer facilities can pose difficulties for the assessment of such systems in many studies. A clustering procedure using the nearest centroid sorting method was previously applied to the IEEE-RTS load model, and the application proved very effective in producing a highly similar model with substantially fewer states. This paper presents an extended application of the clustering method to include the capacity probability representation. A series of sensitivity studies is illustrated using the IEEE-RTS generating system and load models. The loss of load and loss of energy expectations (LOLE, LOEE) are used as indicators to evaluate the application.
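
As an illustrative sketch only, the reduction and evaluation steps described above might be combined along the following lines; the state values, probabilities, hourly loads, and function names (`cluster_capacity_states`, `lole_loee`) are placeholder assumptions, not the IEEE-RTS data or the authors' implementation.

```python
# Illustrative sketch: reduce a capacity-outage probability table to a few clustered
# states (nearest-centroid style), then compute LOLE/LOEE against hourly loads.
import numpy as np

def cluster_capacity_states(capacities, probs, n_clusters, n_iter=50):
    """Nearest-centroid (k-means style) reduction of a capacity probability model."""
    centroids = np.linspace(capacities.min(), capacities.max(), n_clusters)
    for _ in range(n_iter):
        # assign each exact state to its nearest cluster centroid
        labels = np.abs(capacities[:, None] - centroids[None, :]).argmin(axis=1)
        for k in range(n_clusters):
            mask = labels == k
            if mask.any():
                # probability-weighted centroid keeps the cluster's expected capacity
                centroids[k] = np.average(capacities[mask], weights=probs[mask])
    cluster_probs = np.array([probs[labels == k].sum() for k in range(n_clusters)])
    return centroids, cluster_probs

def lole_loee(centroids, cluster_probs, hourly_loads):
    """Loss of load / loss of energy expectations for the reduced capacity model."""
    lole = loee = 0.0
    for load in hourly_loads:
        shortfall = np.maximum(load - centroids, 0.0)   # MW not served in each state
        lole += cluster_probs[shortfall > 0].sum()       # expected hours of shortfall
        loee += float(cluster_probs @ shortfall)         # expected energy not served (MWh)
    return lole, loee

# toy example: exact model with many states, reduced to 5 clustered states
caps = np.arange(0, 3001, 50, dtype=float)               # available capacity states (MW)
p = np.random.dirichlet(np.ones(len(caps)))               # placeholder state probabilities
loads = np.random.uniform(1500, 2800, size=100)           # placeholder hourly loads (MW)
c, cp = cluster_capacity_states(caps, p, n_clusters=5)
print(lole_loee(c, cp, loads))
```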

Relevance:

20.00%

Publisher:

Abstract:

Self-efficacy has two cognitive components, efficacy expectations and outcome expectations, and their influence on behavior change is synergistic. Efficacy expectation is affected by four main sources of information provided by direct and indirect experiences: performance accomplishments, vicarious experience, verbal persuasion, and self-appraisal. How to measure self-efficacy and develop interventions is an important issue at present. This article analyzes the relationships between the variables of the self-efficacy model and explains the implementation of self-efficacy-enhancing interventions and the instruments used to test the model. By considering both the theory and its feasibility in clinical practice, it is expected that professional medical care personnel will first familiarize themselves with the self-efficacy model and its concepts, and then flexibly apply it in professional fields such as clinical practice, chronic disease care, and health promotion.

Relevance:

20.00%

Publisher:

Abstract:

Wound research is a complex, multidimensional activity most effectively conducted by interdisciplinary teams that connect studies in basic wound biology, devices, and biomaterials with clinical practice. These complexities have been recognised in a new initiative through the establishment of an interdisciplinary wound research centre in Australia: the Wound Management Innovation Cooperative Research Centre (WMI CRC). The centre is funded by the Australian Government's Cooperative Research Centre Program and a consortium of 22 participants, and has a resource of US$108 million over 8 years...

Relevance:

20.00%

Publisher:

Abstract:

The advanced programmatic risk analysis and management model (APRAM) is one of the recently developed methods that can be used for risk analysis and management purposes, considering schedule, cost, and quality risks simultaneously. However, this model considers only those failure risks that occur over the design and construction phases of a project's life cycle. While this can be sufficient for projects in which the cost required during the operating life is much less than the budget required over the construction period, it should be modified for infrastructure projects, because the associated costs during the operating life cycle are significant. In this paper, a modified APRAM is proposed that can consider potential risks occurring over the entire life cycle of the project, including technical and managerial failure risks. The modified model can therefore be used as an efficient decision-support tool for construction managers in the housing industry, in which various alternatives might be technically available. The modified method is demonstrated using a real building project, and this demonstration shows that it can be employed efficiently by construction managers. The Delphi method was applied to identify the failure events and their associated probabilities. The results show that although the initial cost of a cold-formed steel structural system is higher than that of a conventional construction system, the former's failure cost is much lower than the latter's.
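
The core life-cycle comparison can be illustrated with a minimal sketch; the event lists, probabilities, and costs below are hypothetical placeholders of the kind a Delphi panel might elicit, and `expected_life_cycle_cost` is not the paper's APRAM formulation.

```python
# Minimal sketch: compare two structural alternatives by initial cost plus expected
# failure cost accumulated over the whole operating life. All numbers are invented.

def expected_life_cycle_cost(initial_cost, failure_events):
    """failure_events: list of (annual_probability, consequence_cost, years_exposed)."""
    expected_failure_cost = sum(p * cost * years for p, cost, years in failure_events)
    return initial_cost + expected_failure_cost, expected_failure_cost

cold_formed_steel = expected_life_cycle_cost(
    initial_cost=1_200_000,
    failure_events=[(0.002, 300_000, 50),    # hypothetical connection failure
                    (0.001, 150_000, 50)],   # hypothetical serviceability defect
)
conventional = expected_life_cycle_cost(
    initial_cost=1_000_000,
    failure_events=[(0.004, 500_000, 50),
                    (0.003, 250_000, 50)],
)
print("cold-formed steel (total, failure):", cold_formed_steel)
print("conventional      (total, failure):", conventional)
```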

Relevance:

20.00%

Publisher:

Abstract:

All civil and private aircraft are required to comply with the airworthiness standards set by their national airworthiness authority and throughout their operational life must be in a condition of safe operation. Aviation accident data shows that over twenty percent of all fatal accidents in aviation are due to airworthiness issues, specifically aircraft mechanical failures. Ultimately it is the responsibility of each registered operator to ensure that their aircraft remain in a condition of safe operation, and this is done through both effective management of airworthiness activities and the effective program governance of safety outcomes. Typically, the projects within these airworthiness management programs are focused on acquiring, modifying and maintaining the aircraft as a capability supporting the business. Program governance provides the structure through which the goals and objectives of airworthiness programs are set along with the means of attaining them. Whilst the principal causes of failures in many programs can be traced to inadequate program governance, many of the failures in large scale projects can have their root causes in the organisational culture and more specifically in the organisational processes related to decision-making. This paper examines the primary theme of project and program based enterprises, and introduces a model for measuring organisational culture in airworthiness management programs using measures drawn from 211 respondents in Australian airline programs. The paper describes the theoretical perspectives applied to modifying an original model to specifically focus it on measuring the organisational culture of programs for managing airworthiness; identifying the most important factors needed to explain the relationship between the measures collected, and providing a description of the nature of these factors. The paper concludes by identifying a model that best describes the organisational culture data collected from seven airworthiness management programs.
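
One common way to identify latent factors that explain relationships among survey measures is exploratory factor analysis; the sketch below assumes that approach, with random placeholder responses and an assumed three-factor structure rather than the study's instrument, its 211 airline-program responses, or its actual model.

```python
# Illustrative sketch only: exploratory factor analysis of Likert-scale survey items
# to surface latent "culture" factors. Data and factor count are invented.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items = 211, 12
responses = rng.integers(1, 6, size=(n_respondents, n_items)).astype(float)  # Likert 1-5

fa = FactorAnalysis(n_components=3, random_state=0)   # assume 3 latent culture factors
fa.fit(responses)

# loadings: how strongly each survey item is associated with each latent factor
loadings = fa.components_.T                            # shape (n_items, n_factors)
for item, row in enumerate(loadings, start=1):
    dominant = int(np.argmax(np.abs(row)))
    print(f"item {item:2d} -> factor {dominant} (loading {row[dominant]:+.2f})")
```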

Relevance:

20.00%

Publisher:

Abstract:

An advanced rule-based Transit Signal Priority (TSP) control method is presented in this paper. An online transit travel time prediction model is the key component of the proposed method; it enables the selection of the most appropriate TSP plan for the prevailing traffic and transit conditions. The new method also adopts a priority plan re-development feature that allows the already implemented priority plan to be modified, or even switched, to accommodate changes in traffic conditions. The proposed method utilizes the conventional green extension and red truncation strategies as well as two new strategies: green truncation and queue clearance. The new method is evaluated in microsimulation against a typical active TSP strategy and against a base-case scenario with no TSP control. The evaluation results indicate that the proposed method can produce significant benefits in reducing bus delay time and improving service regularity, with negligible adverse impacts on non-transit street traffic.
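
A minimal sketch of how a rule-based selector might choose among the four strategies is given below; the thresholds, state names, and the `select_tsp_plan` function are illustrative assumptions, not the paper's control logic, and a real controller would re-evaluate the plan as the travel-time prediction is updated.

```python
# Hedged sketch of a rule-based priority-plan selector for one approaching bus.

def select_tsp_plan(predicted_arrival_s, signal_state, time_to_change_s,
                    queue_at_stopline, max_extension_s=10, max_truncation_s=10):
    """Return a priority plan for one approaching transit vehicle."""
    if queue_at_stopline:
        # clear the standing queue first so the bus is not blocked on arrival
        return "queue_clearance"
    if signal_state == "green":
        if predicted_arrival_s <= time_to_change_s:
            return "no_action"                        # bus arrives within current green
        if predicted_arrival_s - time_to_change_s <= max_extension_s:
            return "green_extension"
        return "green_truncation"                     # end green early, serve bus next cycle
    if signal_state == "red":
        if time_to_change_s - predicted_arrival_s <= max_truncation_s:
            return "red_truncation"                   # bring the green forward
    return "no_action"

# example: bus predicted 12 s away, green ends in 6 s, no queue -> green_extension
print(select_tsp_plan(12, "green", 6, queue_at_stopline=False))
```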

Relevance:

20.00%

Publisher:

Abstract:

The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operational downtime, and safety hazards. Predicting the survival time and the probability of failure at a future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to obtain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspension data. These data contain significant information about the state and health of an asset. Condition indicators reflect the level of degradation of an asset, while operating environment indicators accelerate or decelerate its lifetime. When these data are available, an alternative to traditional reliability analysis is to model condition indicators, operating environment indicators, and their failure-generating mechanisms using a covariate-based hazard model.

The literature review indicates that a number of covariate-based hazard models have been developed, all based on the principle of the Proportional Hazard Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics, and the prominence of PHM has, to some extent, stifled attempts at developing alternative models, although a number of alternatives to PHM have been suggested. The existing covariate-based hazard models do not fully utilise the three types of asset health information (failure event data, i.e. observed and/or suspended; condition data; and operating environment data) in a single model to produce more effective hazard and reliability predictions. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data: condition indicators act as response (dependent) variables, whereas operating environment indicators act as explanatory (independent) variables. Nevertheless, these non-homogeneous covariate data are modelled in the same way for hazard prediction in the existing covariate-based hazard models. The related and more imperative question is how both types of indicators should be effectively modelled and integrated into a covariate-based hazard model.

This work presents a new approach for addressing these challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three types of asset health information into the modelling of hazard and reliability predictions and derives the relationship between actual asset health and both condition measurements and operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and condition indicators. Condition indicators provide information about the health condition of an asset; they therefore update and reform the baseline hazard of EHM according to the health state of the asset at a given time t. Examples of condition indicators include the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component. Operating environment indicators in this model are failure accelerators and/or decelerators that are included in the covariate function of EHM and may increase or decrease the value of the hazard relative to the baseline hazard. These indicators arise from the environment in which an asset operates and are not explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effect of operating environment indicators may be nil in EHM, condition indicators always contribute, because they are observed and measured for as long as an asset remains operational.

EHM has several advantages over the existing covariate-based hazard models. First, the model utilises three different sources of asset health data (population characteristics, condition indicators, and operating environment indicators) to effectively predict hazard and reliability. Second, EHM explicitly investigates the relationship between the condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, is not required in EHM. Depending on the sample size of failure/suspension times, EHM is developed in two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) for the baseline hazard. However, in many industrial applications failure event data are sparse, and the analysis of such data often involves complex distributional shapes about which little is known. Therefore, to avoid the semi-parametric EHM's restrictive assumption of a specified lifetime distribution for failure event histories, a non-parametric, distribution-free EHM has also been developed. The development of EHM in these two forms is a further merit of the model.

A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models is appraised by comparing their estimated results with those of other existing covariate-based hazard models. The comparison demonstrates that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, including a new parameter estimation method for the case of time-dependent covariate effects and missing data, application of EHM to both repairable and non-repairable systems using field data, and a decision-support model linked to the estimated reliability results.
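
As a rough illustration of the idea that condition indicators enter the baseline hazard while operating environment indicators act through the covariate function, the sketch below uses an assumed Weibull-type baseline and an exponential covariate term; the functional forms, parameter values, and data are placeholders, not the published EHM or its fitted parameters.

```python
# Illustrative sketch only: a covariate-based hazard in which the condition indicator
# reshapes the baseline and operating environment indicators scale it multiplicatively.
import numpy as np

def hazard(t, condition, environment, beta=2.0, eta=1000.0, gamma=0.8, delta=None):
    """
    t           : operating time (hours)
    condition   : condition indicator at time t (e.g. vibration level), shifts the baseline
    environment : vector of operating environment indicators (e.g. load, temperature)
    """
    delta = np.zeros_like(environment, dtype=float) if delta is None else delta
    baseline = (beta / eta) * (t / eta) ** (beta - 1) * np.exp(gamma * condition)
    covariate_term = np.exp(float(np.dot(delta, environment)))   # accelerates/decelerates
    return baseline * covariate_term

def reliability(times, condition_series, environment_series, **params):
    """Numerical survival curve R(t) = exp(-cumulative hazard)."""
    h = np.array([hazard(t, c, e, **params)
                  for t, c, e in zip(times, condition_series, environment_series)])
    cum_hazard = np.cumsum(h * np.gradient(times))
    return np.exp(-cum_hazard)

times = np.linspace(1, 2000, 200)
condition = np.linspace(0.1, 1.5, 200)                 # placeholder degradation trend
environment = np.column_stack([np.full(200, 0.6)])     # placeholder constant load factor
R = reliability(times, condition, environment, delta=np.array([0.5]))
print(R[::50])
```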

Relevance:

20.00%

Publisher:

Abstract:

The field of Arts-Health practice and research has grown exponentially in the past 30 years. While researchers are using applied arts as the subject of investigation in research, the evaluation of practice and participant benefits has had a limited, general focus. In recent years, the field has witnessed a growing concentration on the evaluation of health outcomes, outputs and tangential benefits for participants engaging in Arts-Health practice. The wide range of methodological approaches that applied arts practitioners implement makes the field difficult to define. This article introduces the term Arts-Health intersections as a model of practice and a framework to promote consistency in the design, implementation and evaluative processes of applied arts programmes promoting health outcomes. The article challenges the current trend of solely evaluating health outcomes in the field, and promotes a concurrent, multidisciplinary methodological approach that can be adopted to promote evaluation, consistency and best practice in the field of Arts-Health intersections. The article provides a theoretical overview of Arts-Health intersections, then builds on this theoretical platform to detail a best-practice model for developing Arts-Health intersections and presents it as a guide.

Relevance:

20.00%

Publisher:

Abstract:

Adolescent idiopathic scoliosis is a complex three-dimensional deformity affecting 2-3% of the general population. The resulting spinal deformity consists of coronal curvature, hypokyphosis of the thoracic spine and vertebral rotation in the axial plane, with the posterior elements turned into the curve concavity. The potential for curve progression is heightened during the adolescent growth spurt. Success of scoliosis deformity correction depends on solid bony fusion between adjacent vertebrae after the intervertebral (IV) discs have been surgically cleared and the disc spaces filled with graft material. Recently, a bioactive and resorbable scaffold fabricated from medical grade polycaprolactone (PCL) has been developed for bone regeneration at load-bearing sites. Combined with rhBMP-2, this has been shown to act successfully as a bone graft substitute in a porcine lumbar interbody fusion model when compared to autologous bone graft alone. The study aimed to establish a large-animal thoracic spine interbody fusion model, to develop biodegradable spine scaffolds (PCL) in combination with biologics (rhBMP-2), and to establish a platform for research into spinal tissue engineering constructs. Preliminary results demonstrate higher grades of radiologically evident bony fusion across all levels when comparing fusion scores between the 3- and 6-month post-operative groups at the CaP-coated PCL scaffold level, which is observed to be of a similar grade to autograft, while no fusion is seen at the scaffold-only level. Results to date suggest that the combination of rhBMP-2 and scaffold engineering actively promotes bone formation, laying the basis for viable tissue-engineered constructs.

Relevance:

20.00%

Publisher:

Abstract:

Adolescent idiopathic scoliosis is a complex three-dimensional deformity affecting 2-3% of the general population. Resulting spinal deformities include progressive coronal curvature, hypokyphosis or frank lordosis in the thoracic spine, and vertebral rotation in the axial plane with the posterior elements turned into the curve concavity. The potential for curve progression is heightened during the adolescent growth spurt. Success of scoliosis deformity correction depends on solid bony fusion between adjacent vertebrae after the intervertebral discs have been surgically cleared and the disc spaces filled with graft material. Problems with bone graft harvest site morbidity as well as limited bone availability have led to the search for bone graft substitutes. Recently, a bioactive and resorbable scaffold fabricated from medical grade polycaprolactone (PCL) has been developed for bone regeneration at load-bearing sites. Combined with recombinant human bone morphogenetic protein-2 (rhBMP-2), this has been shown to act successfully as a bone graft substitute in a porcine lumbar interbody fusion model when compared to autologous bone graft. This in vivo sheep study intends to evaluate the suitability of a custom-designed medical grade PCL scaffold in combination with rhBMP-2 as a bone graft substitute in the setting of mini-thoracotomy surgery, as a platform for ongoing research to benefit patients with adolescent idiopathic scoliosis.

Relevance:

20.00%

Publisher:

Abstract:

Chatrooms, for example Internet Relay Chat, are generally multi-user, multi-channel and multi-server chat systems which run over the Internet and provide a protocol for real-time, text-based conferencing between users all over the world. While a well-trained human observer is able to understand who is chatting with whom, there are no efficient and accurate automated tools to determine the groups of users conversing with each other. A precursor to analysing evolving cyber-social phenomena is to first determine what the conversations are and which groups of chatters are involved in each conversation. We consider this problem in this paper. We propose an algorithm to discover all groups of users that are engaged in conversation. The algorithm is based on a statistical model of a chatroom that is founded on our experience with real chatrooms. Our approach does not require any semantic analysis of the conversations; rather, it is based purely on the statistical information contained in the sequence of posts. We improve the accuracy by applying graph algorithms to clean the statistical information. We present experimental results which indicate that one can automatically determine the conversing groups in a chatroom purely on the basis of statistical analysis.
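
A hedged sketch of a purely statistical grouping step is shown below: posts that are close in time add weight between two users, weak edges are pruned, and connected components give candidate conversing groups. The window, threshold, and the `conversing_groups` function are illustrative assumptions, not the paper's algorithm.

```python
# Illustrative sketch: group chatters from post timing alone, with no semantic analysis.
from collections import defaultdict

def conversing_groups(posts, window=30.0, min_weight=2):
    """posts: list of (timestamp_seconds, user) in chronological order."""
    weight = defaultdict(float)
    for i, (t_i, u_i) in enumerate(posts):
        for t_j, u_j in posts[i + 1:]:
            if t_j - t_i > window:
                break
            if u_i != u_j:
                weight[frozenset((u_i, u_j))] += 1.0   # co-occurrence within the window
    # keep only edges with enough statistical support, then take connected components
    adj = defaultdict(set)
    for pair, w in weight.items():
        if w >= min_weight:
            a, b = tuple(pair)
            adj[a].add(b)
            adj[b].add(a)
    groups, seen = [], set()
    for user in adj:
        if user in seen:
            continue
        stack, group = [user], set()
        while stack:
            u = stack.pop()
            if u not in group:
                group.add(u)
                stack.extend(adj[u] - group)
        seen |= group
        groups.append(group)
    return groups

posts = [(0, "ann"), (5, "bob"), (9, "ann"), (12, "bob"),
         (100, "cat"), (104, "dan"), (108, "cat"), (111, "dan")]
print(conversing_groups(posts))   # two groups: {ann, bob} and {cat, dan}
```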

Relevance:

20.00%

Publisher:

Abstract:

This paper describes the use of property graphs for mapping data between AEC software tools that are not linked by common data formats or other interoperability measures. The intention of introducing this in practice, education and research is to facilitate the use of diverse, non-integrated design and analysis applications by a variety of users who need to create customised digital workflows, including those who are not expert programmers. Data model types are examined to support the choice of directed, attributed, multi-relational graphs for such data transformation tasks. A brief exemplar design scenario is also presented to illustrate the concepts and methods proposed, and conclusions are drawn regarding the feasibility of this approach and directions for further research.
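
A minimal sketch of such a directed, attributed, multi-relational graph, here built with networkx, is shown below; the node labels, attribute names, and relation types are hypothetical and do not represent a published AEC schema or the paper's exemplar scenario.

```python
# Illustrative sketch: a property graph as a neutral intermediate form between two AEC tools.
import networkx as nx

g = nx.MultiDiGraph()

# nodes carry arbitrary key-value attributes (the "property" part of a property graph)
g.add_node("wall_01", element="Wall", height_mm=2700, material="concrete", source="ToolA")
g.add_node("level_1", element="Level", elevation_mm=0)
g.add_node("space_12", element="Space", usage="office")

# multiple typed (labelled) relations may exist between the same pair of nodes
g.add_edge("wall_01", "level_1", key="hostedBy")
g.add_edge("wall_01", "space_12", key="bounds")
g.add_edge("wall_01", "space_12", key="adjacentTo")

def export_walls(graph):
    """Project the graph into the flat records another tool expects, renaming attributes."""
    records = []
    for node, attrs in graph.nodes(data=True):
        if attrs.get("element") == "Wall":
            records.append({
                "id": node,
                "Height": attrs["height_mm"] / 1000.0,          # mm -> m for the target tool
                "Material": attrs["material"],
                "BoundsSpaces": [v for _, v, k in graph.out_edges(node, keys=True)
                                 if k == "bounds"],
            })
    return records

print(export_walls(g))
```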

Relevance:

20.00%

Publisher:

Abstract:

Educators are faced with many challenging questions in designing an effective curriculum. What prerequisite knowledge do students have before commencing a new subject? At what level of mastery? What is the spread of capabilities between bare-passing students vs. the top performing group? How does the intended learning specification compare to student performance at the end of a subject? In this paper we present a conceptual model that helps in answering some of these questions. It has the following main capabilities: capturing the learning specification in terms of syllabus topics and outcomes; capturing mastery levels to model progression; capturing the minimal vs. aspirational learning design; capturing confidence and reliability metrics for each of these mappings; and finally, comparing and reflecting on the learning specification against actual student performance. We present a web-based implementation of the model, and validate it by mapping the final exams from four programming subjects against the ACM/IEEE CS2013 topics and outcomes, using Bloom's Taxonomy as the mastery scale. We then import the itemised exam grades from 632 students across the four subjects and compare the demonstrated student performance against the expected learning for each of these. Key contributions of this work are the validated conceptual model for capturing and comparing expected learning vs. demonstrated performance, and a web-based implementation of this model, which is made freely available online as a community resource.
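
The kind of mapping and comparison the model captures can be sketched as follows, using hypothetical topics, Bloom levels, thresholds, and grades rather than the ACM/IEEE CS2013 data, the real exams, or the web tool's actual schema.

```python
# Minimal sketch: map exam items to topics at a mastery level, then compare the
# cohort's demonstrated level with the minimal vs. aspirational learning design.
BLOOM = ["Remember", "Understand", "Apply", "Analyse", "Evaluate", "Create"]

# learning specification: topic -> (minimal level, aspirational level)
spec = {"Loops": ("Apply", "Analyse"), "Recursion": ("Understand", "Apply")}

# exam items: (topic, Bloom level assessed, max marks)
exam = [("Loops", "Apply", 10), ("Loops", "Analyse", 5), ("Recursion", "Understand", 10)]

# itemised grades per student (same order as exam items); placeholder numbers
grades = [[9, 2, 8], [6, 0, 10], [10, 5, 4]]

def demonstrated_level(topic, threshold=0.5):
    """Highest Bloom level at which the cohort averaged >= threshold of the marks."""
    levels = []
    for idx, (t, level, max_marks) in enumerate(exam):
        if t != topic:
            continue
        avg = sum(s[idx] for s in grades) / (len(grades) * max_marks)
        if avg >= threshold:
            levels.append(BLOOM.index(level))
    return BLOOM[max(levels)] if levels else None

for topic, (minimal, aspirational) in spec.items():
    shown = demonstrated_level(topic)
    print(f"{topic}: minimal={minimal}, aspirational={aspirational}, demonstrated={shown}")
```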

Relevance:

20.00%

Publisher:

Abstract:

Load modeling plays an important role in power system dynamic stability assessment. One of the widely used methods for assessing the impact of load models on system dynamic response is parametric sensitivity analysis, and load ranking provides an effective measure of such impact. Traditionally, load ranking is based on either a static or a dynamic load model alone. In this paper, a load ranking framework based on a composite load model is proposed. It enables comprehensive investigation of load modeling impacts on system stability, considering the dynamic interactions between load and system dynamics. The impact of load composition on the overall sensitivity, and therefore on the ranking of the load, is also investigated. Dynamic simulations are performed to further elucidate the results obtained through the sensitivity-based load ranking approach.
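
A hedged sketch of sensitivity-based load ranking is given below; `stability_index` is a made-up analytic placeholder standing in for a full dynamic simulation (e.g. a damping ratio or voltage-dip measure from a time-domain run), and the bus names and composite-load parameters are hypothetical.

```python
# Illustrative sketch: rank buses by finite-difference sensitivity of a stability
# measure to their composite load model parameters.
import numpy as np

def stability_index(load_params):
    """Placeholder for a simulated system stability measure (higher is better)."""
    motor_fraction, static_exponent = load_params
    return 1.0 - 0.6 * motor_fraction**2 + 0.1 * np.log(1.0 + static_exponent)

def parametric_sensitivity(params, eps=1e-4):
    """Finite-difference sensitivity of the index to each composite-load parameter."""
    base = stability_index(params)
    sens = []
    for i in range(len(params)):
        perturbed = list(params)
        perturbed[i] += eps
        sens.append((stability_index(perturbed) - base) / eps)
    return np.array(sens)

# composite load model parameters per bus: (induction motor fraction, static load exponent)
buses = {"bus_3": (0.55, 1.2), "bus_7": (0.20, 2.0), "bus_9": (0.40, 0.8)}

ranking = sorted(buses,
                 key=lambda b: np.abs(parametric_sensitivity(buses[b])).sum(),
                 reverse=True)
print("load ranking (most influential first):", ranking)
```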