316 results for component model of criteria systems
Abstract:
Background: The purpose of this study was to identify candidate metastasis suppressor genes from a mouse allograft model of prostate cancer (NE-10). This allograft model originally developed metastases by twelve weeks after implantation in male athymic nude mice, but lost the ability to metastasize after a number of in vivo passages. We performed high-resolution array comparative genomic hybridization on the metastasizing and non-metastasizing allografts to identify chromosome imbalances that differed between the two groups of tumors. Results: This analysis uncovered a deletion on chromosome 2 that differed between the metastasizing and non-metastasizing tumors. Bioinformatics filters were employed to mine this region of the genome for candidate metastasis suppressor genes. Of the 146 known genes that reside within the region of interest on mouse chromosome 2, four candidate metastasis suppressor genes (Slc27a2, Mall, Snrpb, and Rassf2) were identified. Quantitative expression analysis confirmed decreased expression of these genes in the metastasizing compared to non-metastasizing tumors. Conclusion: This study presents combined genomics and bioinformatics approaches for identifying potential metastasis suppressor genes. The genes identified here are candidates for further studies to determine their functional role in inhibiting metastases in the NE-10 allograft model and human prostate cancer.
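As a toy illustration of the kind of positional and expression filtering this abstract describes, the following pandas sketch keeps genes that fall inside a deleted interval and show reduced expression in the metastasizing tumors. All coordinates, expression values, and the fold-change threshold are invented for the demo; only the four gene names come from the abstract.

```python
# Toy sketch of a positional + expression filter: retain genes inside a
# hypothetical deleted interval on chromosome 2 whose expression drops in
# metastasizing vs. non-metastasizing tumors. All numbers are illustrative.
import pandas as pd

genes = pd.DataFrame({
    "gene":   ["Slc27a2", "Mall", "Snrpb", "Rassf2", "GeneX"],
    "chrom":  ["chr2"] * 5,
    "start":  [120_000_000, 121_500_000, 122_000_000, 123_000_000, 90_000_000],
    "expr_met":    [2.1, 1.0, 3.2, 0.8, 5.0],   # metastasizing tumors
    "expr_nonmet": [8.4, 4.1, 9.9, 3.5, 5.1],   # non-metastasizing tumors
})

region_start, region_end = 119_000_000, 124_000_000   # hypothetical deletion
in_region = genes["start"].between(region_start, region_end)
downregulated = genes["expr_met"] < 0.5 * genes["expr_nonmet"]  # >2-fold drop
candidates = genes[genes["chrom"].eq("chr2") & in_region & downregulated]
print(candidates["gene"].tolist())   # ['Slc27a2', 'Mall', 'Snrpb', 'Rassf2']
```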
Abstract:
This work is focussed on developing a commissioning procedure so that a Monte Carlo model, which uses BEAMnrc’s standard VARMLC component module, can be adapted to match a specific BrainLAB m3 micro-multileaf collimator (μMLC). A set of measurements is recommended for use as a reference against which the model can be tested and optimised. These include radiochromic film measurements of dose from small and offset fields, as well as measurements of μMLC transmission and interleaf leakage. Simulations and measurements to obtain μMLC scatter factors are shown to be insensitive to the relevant model parameters and are therefore not recommended unless the output of the linear accelerator model is in doubt. Ultimately, this note provides detailed instructions for those intending to optimise a VARMLC model to match the dose delivered by their local BrainLAB m3 μMLC device.
Abstract:
Information Systems researchers have employed a diversity of sometimes inconsistent measures of IS success, seldom explicating the rationale, thereby complicating the choice for future researchers. In response to these and other issues, Gable, Sedera and Chan introduced the IS-Impact measurement model. This model represents “the stream of net benefits from the Information System (IS), to date and anticipated, as perceived by all key-user-groups”. Although the IS-Impact model was rigorously validated in previous research, there is a need to further generalise and validate it in different contexts. This paper reports the findings of an IS-Impact model revalidation study at four state governments in Malaysia, involving 232 users of a financial system that is currently in use at eleven state governments in Malaysia. Data were analysed following the guidelines for formative measurement validation, using SmartPLS. The PLS results supported the IS-Impact dimensions and measures, confirming the validity of the IS-Impact model in Malaysia. This indicates that the IS-Impact model is robust and can be used across different contexts.
Abstract:
The field of literacy studies has always been challenged by the changing technologies that humans have used to express, represent and communicate their feelings, ideas, understandings and knowledge. However, while the written word has remained central to literacy processes over a long period, it is generally accepted that there have been significant changes to what constitutes ‘literate’ practice. In particular, the status of the printed word has been challenged by the increasing dominance of the image, along with the multimodal meaning-making systems facilitated by digital media. For example, Gunther Kress and other members of the New London Group have argued that the second half of the twentieth century saw a significant cultural shift from the linguistic to the visual as the dominant semiotic mode. This, in turn, they suggest, was accompanied by a cultural shift ‘from page to screen’ as a dominant space of representation (e.g. Cope & Kalantzis, 2000; Kress, 2003; New London Group, 1996). In a similar vein, Bill Green has noted that we have witnessed a shift from the regime of the print apparatus to a regime of the digital electronic apparatus (Lankshear, Snyder and Green, 2000). For these reasons, the field of literacy education has been challenged to find new ways to conceptualise what is meant by ‘literacy’ in the twenty-first century and to rethink the conditions under which children might best be taught to be fully literate so that they can operate with agency in today’s world.
Abstract:
Micro-finance, which includes micro-credit as one of its core services, has become an important component of a range of business models – from those that operate on a strictly economic basis to those that come from a philanthropic base through Non-Government Organisations (NGOs). Its success is often measured by the number of loans issued, their size, and the repayment rates. This paper has a dual purpose: to identify whether the models currently used to deliver micro-credit services to the poor are socially responsible, and to suggest a new model of delivery that addresses some of the social responsibility issues while supporting community development. The proposed model is currently being implemented in Beira, the second-largest city in Mozambique. Mozambique exhibits many of the characteristics found in other African countries, so the model, if successful, may have implications for other poor African nations as well as other developing economies.
Abstract:
A model to predict the buildup of mainly traffic-generated volatile organic compounds or VOCs (toluene, ethylbenzene, ortho-xylene, meta-xylene, and para-xylene) on urban road surfaces is presented. The model requires three traffic parameters, namely average daily traffic (ADT), volume-to-capacity ratio (V/C), and surface texture depth (STD), and two chemical parameters, namely total suspended solids (TSS) and total organic carbon (TOC), as predictor variables. Principal component analysis and two-phase factor analysis were performed to characterize the model calibration parameters. Traffic congestion was found to be the underlying cause of traffic-related VOC buildup on urban roads. The model calibration was optimized using orthogonal experimental design. Partial least squares regression was used for model prediction. It was found that a better optimized orthogonal design could be achieved by including the latent factors of the data matrix in the design. The model performed fairly accurately for three different land uses as well as five different particle size fractions. The relative prediction errors were 10–40% for the different size fractions and 28–40% for the different land uses, while the coefficients of variation of the predicted intersite VOC concentrations were in the range of 25–45% for the different size fractions. Considering the sizes of the data matrices, these coefficients of variation were within the acceptable interlaboratory range for analytes at ppb concentration levels.
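To make the prediction step concrete, here is a minimal partial least squares sketch using scikit-learn. The five predictor names follow the abstract, but the data are synthetic stand-ins and the two-component choice is arbitrary; this is an illustration of the technique, not the study's calibrated model.

```python
# Minimal sketch of a PLS-based VOC buildup predictor. Predictors mirror
# the abstract (ADT, V/C, STD, TSS, TOC); all data here are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_sites = 60
X = rng.uniform(size=(n_sites, 5))   # columns: ADT, V/C, STD, TSS, TOC
# Synthetic VOC buildup response with a little noise
y = X @ np.array([0.8, 1.5, 0.3, 0.6, 1.1]) + 0.1 * rng.normal(size=n_sites)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=2)  # two latent factors (arbitrary choice)
pls.fit(X_train, y_train)
print("R^2 on held-out sites:", pls.score(X_test, y_test))
```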
Abstract:
A basic tenet of ecological economics is that economic growth and development are ultimately constrained by environmental carrying capacities. It is from this basis that notions of a sustainable economy and of sustainable economic development emerge to undergird the “standard model” of ecological economics. However, the belief in “hard” environmental constraints may be obscuring the important role of the entrepreneur in the co-evolution of economic and environmental relations, and hence limiting or distorting the analytic focus of ecological economics and the range of policy options that are considered for sustainable economic development. This paper outlines a co-evolutionary model of the dynamics of economic and ecological systems as connected by entrepreneurial behaviour. We then discuss some of the key analytic and policy implications.
Abstract:
In this study we propose a virtual index for measuring the relative innovativeness of countries. Using a multistage virtual benchmarking process, the best rational benchmark is extracted for each inefficient IS. Furthermore, Tobit and Ordinary Least Squares (OLS) regression models are used to investigate the likelihood of changes in inefficiency by examining country-specific factors. The empirical results of the virtual benchmarking process suggest that the OLS regression model better explains changes in the performance of innovation-inefficient countries.
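For concreteness, the following is a minimal sketch of the kind of second-stage regression described here: inefficiency scores regressed on country-specific factors via OLS with statsmodels. The factor names and data are invented for illustration, and the Tobit stage is omitted.

```python
# Sketch of a second-stage OLS regression: inefficiency scores against
# country-specific factors. Data and factor meanings are illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_countries = 40
# e.g., hypothetical R&D intensity, tertiary education rate, trade openness
factors = rng.uniform(size=(n_countries, 3))
inefficiency = (0.5 - factors @ np.array([0.2, 0.15, 0.1])
                + 0.05 * rng.normal(size=n_countries))

X = sm.add_constant(factors)           # add intercept term
ols = sm.OLS(inefficiency, X).fit()
print(ols.summary())                   # which factors track inefficiency?
```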
Abstract:
Biochemical reactions underlying genetic regulation are often modelled as a continuous-time, discrete-state Markov process, and the evolution of the associated probability density is described by the so-called chemical master equation (CME). However, the CME is typically difficult to solve, since the state-space involved can be very large or even countably infinite. Recently, a finite state projection (FSP) method that truncates the state-space was suggested and shown to be effective on an example model of the Pap-pili epigenetic switch. In this example, however, both the model and the final time at which the solution was computed were relatively small. Presented here is a Krylov FSP algorithm based on a combination of state-space truncation and inexact matrix-vector product routines. This allows larger-scale models to be studied, and solutions for larger final times to be computed, in a realistic execution time. Additionally, the new method computes the solution at intermediate times at virtually no extra cost, since it is derived from Krylov-type methods for computing matrix exponentials. For the purpose of comparison, the new algorithm is applied to the model of the Pap-pili epigenetic switch on which the original FSP was first demonstrated, as well as to a more sophisticated model of regulated transcription. Numerical results indicate that the new approach is significantly faster and extendable to larger biological models.
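By way of illustration, here is a minimal FSP-style sketch for a toy birth-death process: truncate the state-space, assemble the CME generator A, and propagate p(t) = exp(tA) p(0) with SciPy's Krylov-type expm_multiply routine. The rates, truncation level, and final time are made up, and this is a generic demonstration of the idea, not the authors' algorithm.

```python
# FSP sketch: truncate the birth-death state space to {0, ..., N}, build
# the CME generator A, and compute p(t) = exp(tA) p(0) via a Krylov-type
# matrix-exponential action. All rates and sizes are illustrative.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import expm_multiply

N = 200                        # truncation level (the FSP window)
k_birth, k_death = 10.0, 0.5   # synthetic production/degradation rates
states = np.arange(N + 1)

birth = np.full(N, k_birth)               # rate of n -> n+1 (zero at n = N)
death = k_death * states[1:]              # rate of n -> n-1
diagonal = -(np.append(birth, 0.0) + k_death * states)
A = diags([death, diagonal, birth], offsets=[1, 0, -1], format="csc")

p0 = np.zeros(N + 1)
p0[0] = 1.0                               # start with zero molecules
p5 = expm_multiply(5.0 * A, p0)           # p(t = 5) = exp(5A) p(0)
print(p5.sum(), states @ p5)              # mass ~ 1, mean copy number
```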
Abstract:
Society faces an unprecedented global education challenge to equip professionals with the knowledge and skills to address emerging 21st Century challenges, spanning climate change mitigation through to adaptation measures to deal with issues such as temperature and sea level rise, and diminishing fresh water and fossil fuel reserves. This paper discusses the potential for systemic and synergistic integration of curriculum with campus operations to accelerate curriculum renewal towards ESD, drawing on the authors' experiences within engineering education. The paper begins by providing a brief overview of the need for timely curriculum renewal towards ESD in tertiary education. The paper then highlights some examples of academic barriers that need to be overcome for integration efforts to be successful, and opportunities for promoting the benefits of such integration. The paper concludes by discussing the rationale for planning green campus initiatives within a larger system of curriculum renewal considerations, including awareness raising and developing a common understanding, identifying and mapping graduate attributes, curriculum auditing, content development and strategic renewal, and bridging and outreach.
Abstract:
This chapter proposes a conceptual model for optimal development of the capabilities needed for the contemporary knowledge economy. We commence by outlining key capability requirements of the 21st century knowledge economy, distinguishing these from those suited to the earlier stages of the knowledge economy. We then discuss the extent to which higher education currently caters to these requirements, before putting forward a new model for effective knowledge economy capability learning. The core of this model is the development of an adaptive and adaptable career identity, which is created through a reflective process of career self-management, drawing upon data from the self and the world of work. In turn, career identity drives the individual’s process of skill and knowledge acquisition, including deep disciplinary knowledge. The professional capability learning thus acquired includes disciplinary skill and knowledge sets, generic skills, and also skills for the knowledge economy, including disciplinary agility, social network capability, and enterprise skills. In the final part of this chapter, we envision higher education systems that embrace the model, and suggest steps that could be taken toward making the development of knowledge economy capabilities an integral part of the university experience.
Abstract:
The authors present a Cause-Effect fault diagnosis model that utilises the Root Cause Analysis approach and takes into account the technical features of a digital substation. Dempster–Shafer evidence theory is used to integrate different types of fault information in the diagnosis model, so as to implement a hierarchical, systematic and comprehensive diagnosis based on the logical relationships between parent and child nodes (such as transformer/circuit-breaker/transmission-line) and between root and child causes. A real fault scenario is investigated in the case study to demonstrate the developed approach in diagnosing malfunctions of protective relays and/or circuit breakers, missed or false alarms, and other commonly encountered faults at a modern digital substation.
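The evidence-fusion step named here is Dempster's rule of combination. Below is a minimal sketch of that rule applied to two hypothetical evidence sources about a faulty component; the hypotheses and mass assignments are invented for illustration and are not taken from the case study.

```python
# Minimal sketch of Dempster's rule of combination over fault hypotheses.
# Focal elements are frozensets; masses below are hypothetical.
from itertools import product

def combine(m1, m2):
    """Fuse two mass functions; discard and renormalise conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    k = 1.0 - conflict                     # normalisation constant
    return {h: v / k for h, v in combined.items()}

# Two evidence sources about the faulty component (illustrative masses):
T, B, L = "transformer", "breaker", "line"
m_relay = {frozenset({T}): 0.6, frozenset({T, B}): 0.3, frozenset({T, B, L}): 0.1}
m_alarm = {frozenset({T}): 0.5, frozenset({B}): 0.2, frozenset({T, B, L}): 0.3}
print(combine(m_relay, m_alarm))           # fused belief over hypotheses
```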
Abstract:
Adopting a model of job enrichment, we report on a longitudinal case investigating the perceived impact of an Enterprise Resource Planning (ERP) system on user job design characteristics. Our results indicated that, in the context of an ERP geared towards centralisation and standardisation, the extent to which users perceived an increase or decrease in job enrichment was associated with aspects such as formal authority and the nature of their work role. Experienced operational employees proficient in the original legacy system perceived ERP system protocols to constrain their actions, limit training and increase dependence on others in the workflow. Conversely, managerial users reported a number of benefits relating to report availability, improved organisational transparency and increased overall job enrichment. These results supported our argument that ERPs with a standardisation intent yield positive job enrichment outcomes for managerial users and negative job-related outcomes for operational users.
Abstract:
Business process management (BPM) is becoming the dominant management paradigm. Business process modelling is central to BPM, and the resultant business process model is the core artefact guiding subsequent process change. Thus, model quality is at the centre, mediating between the modelling effort and the related, growing investment in ultimate process improvements. Nonetheless, though research interest in the properties that differentiate high-quality process models is longstanding, there have been no past reports of a valid, operationalised, holistic measure of business process model quality. In response to this gap, this paper reports the validation of a Business Process Model Quality measurement model, conceptualised as a single-order, formative index. Such a measurement model has value as the dependent variable in rigorous research into the drivers of model quality; as an antecedent of ultimate process improvements; and potentially as an economical comparator and diagnostic for practice.