990 results for Meta-modelling
Abstract:
All over the world, organizations are becoming more and more complex, and there is a need to capture that complexity. This is where the DEMO methodology (Design and Engineering Methodology for Organizations), created and developed by Jan L. G. Dietz, reaches its potential: capturing the structure of business processes in a coherent and consistent set of diagrams with their respective grammatical rules. The creation of the WAMM (Wiki Aided Meta Modeling) platform was the main focus of this thesis; its principal precursor was the idea of creating a Meta-Editor that supports semantic data and uses MediaWiki. This prototype Meta-Editor uses MediaWiki as a data repository and builds on the ideas of the Universal Enterprise Adaptive Object Model and the concept of the Semantic Web to create a platform that suits our needs, through Semantic MediaWiki, which helps the computer interconnect information and people in a more comprehensive way, giving meaning to the content of the pages. The proposed Meta-Modeling platform allows the specification of the abstract syntax (i.e., the grammar) and the concrete syntax (e.g., symbols and connectors) of any language, as well as its model types and diagram types. We use the DEMO language as a proof-of-concept and example. All such specifications are done in a coherent and formal way by the creation of semantic wiki pages and semantic properties connecting them.
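As a rough illustration of the mechanism the abstract describes, a semantic wiki page for a language element might carry Semantic MediaWiki property annotations connecting it to other pages (the page, category and property names below are invented for illustration, not taken from the WAMM platform):

```wikitext
[[Category:Meta Model Element]]
This element is part of the [[Part of language::DEMO]] abstract syntax.
Instances of it appear in an
[[Appears in diagram type::Organization Construction Diagram]]
and are drawn with the [[Has symbol::File:TransactionKindSymbol.svg]] shape.
```

Each `[[Property::value]]` annotation both renders as a link and stores a machine-readable triple, which is what lets the platform query and cross-check a language specification spread over many pages.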
Abstract:
BACKGROUND: Published individual-based, dynamic sexual network modelling studies reach different conclusions about the population impact of screening for Chlamydia trachomatis. The objective of this study was to conduct a direct comparison of the effect of organised chlamydia screening in different models. METHODS: Three models simulating population-level sexual behaviour, chlamydia transmission, screening and partner notification were used. Parameters describing a hypothetical annual opportunistic screening programme in 16-24 year olds were standardised, whereas other parameters from the three original studies were retained. Model predictions of the change in chlamydia prevalence were compared under a range of scenarios. RESULTS: Initial overall chlamydia prevalence rates were similar in women but not in men, and there were age- and sex-specific differences between models. The number of screening tests carried out was comparable in all models, but there were large differences in the predicted impact of screening. After 10 years of screening, the predicted reduction in chlamydia prevalence in women aged 16-44 years ranged from 4% to 85%. Screening men and women had a greater impact than screening women alone in all models. There were marked differences between models in assumptions about treatment seeking and sexual behaviour before the start of the screening intervention. CONCLUSIONS: Future models of chlamydia transmission should be fitted to both incidence and prevalence data. This meta-modelling study provides essential information for explaining differences between published studies and increasing the utility of individual-based chlamydia transmission models for policy making.
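The models compared above are individual-based, but the qualitative effect of adding screening to a transmission model can be illustrated with a minimal deterministic SIS sketch, in which screening simply adds an extra per-capita clearance rate (all rates below are hypothetical, not drawn from the compared studies):

```python
def sis_prevalence(beta=2.0, recovery=1.0, screening=0.0, years=10, dt=0.01):
    """Deterministic SIS model: infection at rate beta*S*I, natural clearance
    at `recovery`, and screening adding an extra treatment rate. Returns the
    infected fraction after `years` of simulated time (Euler integration)."""
    i = 0.05  # initial infected fraction
    for _ in range(int(years / dt)):
        di = beta * (1 - i) * i - (recovery + screening) * i
        i += di * dt
    return i

baseline = sis_prevalence(screening=0.0)
screened = sis_prevalence(screening=0.5)
print(f"prevalence without screening: {baseline:.3f}")
print(f"prevalence with screening:    {screened:.3f}")
```

Even in this toy version, the predicted impact of a screening programme depends strongly on the assumed transmission and clearance rates, which is exactly the sensitivity the meta-modelling study probes.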
Abstract:
Software engineering research has long sought to better understand the software development process, minimally in order to reproduce its good practices, and ideally in order to mechanize it. Two major approaches to characterizing the process can be identified. The first, the transformational approach, views the process as a sequence of transformations that preserve certain properties of the input data. This idea was recently taken up by the OMG's model-driven architecture. The second approach consists of cataloguing and codifying proven solutions to recurring problems. Research on architectural styles, design patterns, and application frameworks falls within this approach. Our research recognizes the complementarity of the two approaches, in particular for the design stage: within model-driven development, we view design as the application of solution patterns to the input models. Design is customarily divided into architectural design and detailed design. Architectural design is concerned with organizing a software system into components that meet a set of non-functional requirements, whereas detailed design is concerned, in a sense, with the content of those components. Architectural design relies on architectural styles, organizing principles that optimize certain qualities, whereas detailed design relies on design patterns to assign responsibilities to classes. Architectural styles and design patterns are artifacts that codify proven solutions to recurring design problems. Although these artifacts are well documented, the decision to apply them remains essentially manual.
Moreover, existing tools do not offer adequate support for applying them to existing models. In this thesis we tackle detailed design, and more specifically the transformation of models by the application of design patterns, partly because design patterns are less complex, and partly because implementing architectural styles often proceeds through design patterns. We therefore propose an approach for representing and applying design patterns. Our approach is based on an explicit representation of the problems these patterns solve. Indeed, explicitly representing the problem a pattern solves makes it possible to: (1) better understand the pattern; (2) recognize the opportunity to apply the pattern by detecting an instance of the problem representation in the models of the system under consideration; and (3) automate the application of the pattern by representing it, declaratively, as a transformation of a problem instance into a solution instance. To verify and validate our approach, we used it to represent and apply various design patterns, and we ran practical tests on models generated from open-source software.
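The detect-then-transform idea in steps (2) and (3) can be sketched as a rule over a toy class-model representation (the model encoding, class names and rule below are illustrative, not the thesis's formalism):

```python
# Toy class model: class name -> set of classes it depends on.
model = {
    "ReportGenerator": {"PdfExporter", "CsvExporter", "XmlExporter"},
    "PdfExporter": set(),
    "CsvExporter": set(),
    "XmlExporter": set(),
}

def find_problem(model, min_variants=2):
    """Problem pattern: a client coupled to several concrete leaf variants."""
    for client, deps in model.items():
        concrete = {d for d in deps if not model[d]}  # leaf classes
        if len(concrete) >= min_variants:
            yield client, concrete

def apply_pattern(model, client, variants, interface="Exporter"):
    """Solution: route the client through an abstraction (Strategy-like)."""
    model[interface] = set()
    model[client] = (model[client] - variants) | {interface}
    for v in variants:
        model[v] = model[v] | {interface}  # each variant realizes the interface
    return model

# Detect every problem instance first, then rewrite each into the solution.
for client, variants in list(find_problem(model)):
    apply_pattern(model, client, variants)
print(model["ReportGenerator"])  # the client now depends only on "Exporter"
```

Separating the problem query from the rewrite mirrors the declarative "problem instance to solution instance" transformation the thesis advocates.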
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Ecological problems are typically multi-faceted and need to be addressed from both a scientific and a management perspective. There is a wealth of modelling and simulation software available, each designed to address a particular aspect of the issue of concern. Choosing the appropriate tool, making sense of the disparate outputs, and taking decisions when little or no empirical data are available are everyday challenges facing the ecologist and environmental manager. Bayesian Networks (BNs) provide a statistical modelling framework that enables analysis and integration of information in its own right, as well as integration of a variety of models addressing different aspects of a common overall problem. There has been increased interest in the use of BNs to model environmental systems and issues of concern. However, the development of more sophisticated BNs, utilising dynamic and object-oriented (OO) features, is still at the frontier of ecological research. Such features are particularly appealing in an ecological context, since the underlying facts are often spatial and temporal in nature. This thesis focuses on an integrated BN approach which facilitates OO modelling. Our research devises a new heuristic method, the Iterative Bayesian Network Development Cycle (IBNDC), for the development of BN models within a multi-field and multi-expert context. Expert elicitation is a popular method used to quantify BNs when data are sparse but expert knowledge is abundant. The resulting BNs need to be substantiated and validated taking this uncertainty into account. Our research demonstrates the application of the IBNDC approach to support these aspects of BN modelling. The complex nature of environmental issues makes them ideal case studies for the proposed integrated approach to modelling.
Moreover, they lend themselves to a series of integrated sub-networks describing different scientific components, combining scientific and management perspectives, or pooling similar contributions developed in different locations by different research groups. In southern Africa the two largest free-ranging cheetah (Acinonyx jubatus) populations are in Namibia and Botswana, where the majority of cheetahs are located outside protected areas. Consequently, cheetah conservation in these two countries is focussed primarily on the free-ranging populations as well as the mitigation of conflict between humans and cheetahs. In contrast, in neighbouring South Africa, the majority of cheetahs are found in fenced reserves. Nonetheless, conflict between humans and cheetahs remains an issue here. Conservation effort in South Africa is also focussed on managing the geographically isolated cheetah populations as one large meta-population. Relocation is one option among a suite of tools used to resolve human-cheetah conflict in southern Africa. Successfully relocating captured problem cheetahs, and maintaining a viable free-ranging cheetah population, are two environmental issues in cheetah conservation forming the first case study in this thesis. The second case study involves the initiation of blooms of Lyngbya majuscula, a blue-green alga, in Deception Bay, Australia. L. majuscula is toxic, and its blooms have severe health, ecological and economic impacts on communities located in their vicinity. Deception Bay is an important tourist destination, given its proximity to Brisbane, Australia's third largest city. Lyngbya is one of several algae considered to form Harmful Algal Blooms (HABs). This group of algae includes other widespread blooms such as red tides. The occurrence of Lyngbya blooms is not a local phenomenon: blooms of this toxic weed occur in coastal waters worldwide.
With the increase in frequency and extent of these HABs, it is important to gain a better understanding of the underlying factors contributing to the initiation and sustenance of these blooms. This knowledge will contribute to better management practices and the identification of those management actions which could prevent or diminish the severity of these blooms.
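The core mechanics of a discrete BN — conditional probability tables combined by enumeration — can be sketched in a few lines. The structure and numbers below are invented for illustration (loosely themed on the Lyngbya case study), not drawn from the thesis's networks:

```python
from itertools import product

# Toy CPTs: P(Nutrients), P(Light), and P(Bloom | Nutrients, Light).
p_nutrients = {True: 0.3, False: 0.7}
p_light = {True: 0.6, False: 0.4}
p_bloom = {  # (nutrients, light) -> P(bloom = True)
    (True, True): 0.8, (True, False): 0.3,
    (False, True): 0.2, (False, False): 0.05,
}

def prob_bloom(evidence=None):
    """P(Bloom = True | evidence) by brute-force enumeration over all states."""
    evidence = evidence or {}
    num = den = 0.0
    for n, l, b in product([True, False], repeat=3):
        # Skip any joint state inconsistent with the observed evidence.
        if any(evidence.get(k) not in (None, v)
               for k, v in [("nutrients", n), ("light", l), ("bloom", b)]):
            continue
        p = (p_nutrients[n] * p_light[l]
             * (p_bloom[(n, l)] if b else 1 - p_bloom[(n, l)]))
        den += p
        num += p if b else 0.0
    return num / den

print(round(prob_bloom(), 3))                    # prior bloom probability: 0.278
print(round(prob_bloom({"nutrients": True}), 3)) # after observing nutrients: 0.6
```

Observing evidence updates the bloom probability, which is the reasoning step expert-elicited BNs support when empirical data are sparse; OO and dynamic BNs wrap exactly this machinery in reusable, time-indexed sub-networks.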
Abstract:
The paper introduces the underlying principles and the general features of a meta-method (the MAP method) developed as part of, and used in, various research, education and professional development programmes at ESC Lille. This method aims at providing an effective and efficient structure and process for acting and learning in various complex, uncertain and ambiguous managerial situations (projects, programmes, portfolios). The paper is developed in three main parts. First, I suggest revisiting the dominant vision of the project management knowledge field, based on the assumptions that it does not adequately address current business and management contexts and situations, and that competencies in the management of entrepreneurial activities are sources of value creation for organisations. Then, grounded in these developments, I introduce the underlying concepts supporting the MAP method, seen as a 'convention generator', and show how this meta-method inextricably links learning and practice in addressing managerial situations. Finally, I briefly describe an example of application, illustrating with a case study how the method integrates Project Management Governance, and give a few examples of use in Management Education and Professional Development.
Abstract:
This paper presents an approach to modelling the resilience of a generic (potable) water supply system. The system is contextualized as a meta-system consisting of three subsystems representing the natural catchment, the water treatment plant and the water distribution infrastructure for urban use. An abstract mathematical model of the meta-system is disaggregated progressively to form a cascade of equations constituting a relational matrix of models. This allows the investigation of commonly implicit relationships between various operational components within the meta-system, an in-depth understanding of specific system components and influential factors, and the incorporation of explicit disturbances to explore system behaviour. Consequently, this will facilitate long-term decision making to achieve sustainable solutions for issues such as meeting a growing demand or managing supply-side influences in the meta-system under diverse water availability regimes. The approach is based on the hypothesis that resilient supply of water may be better managed by modelling the effects of changes at specific levels that have a direct, or in some cases indirect, impact on higher-order outcomes. Additionally, the proposed strategy allows: the definition of approaches for combining disparate data sets to synthesise previously missing or incomplete higher-order information; a scientifically robust means to define and carry out meta-analyses using knowledge from diverse yet relatable disciplines relevant to different levels of the system; and an enhanced understanding of dependencies and inter-dependencies of variable factors at various levels across the meta-system. The proposed concept introduces an approach for modelling a complex infrastructure system as a meta-system consisting of a combination of bio-ecological, technical and socio-technical subsystems.
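A minimal sketch of such a disaggregated cascade: catchment yield feeds the treatment plant, treatment output feeds the distribution network, and delivered supply is the composition of the three subsystem models. All functional forms, units and coefficients below are purely illustrative placeholders, not the paper's equations:

```python
def catchment_yield(rainfall_mm, runoff_coeff=0.4):
    """Bio-ecological subsystem: raw water available (toy linear model)."""
    return runoff_coeff * rainfall_mm

def treatment_output(raw, capacity=100.0, loss_frac=0.05):
    """Technical subsystem: capped by plant capacity, minus process losses."""
    return min(raw, capacity) * (1 - loss_frac)

def distributed_supply(treated, leakage_frac=0.1):
    """Socio-technical subsystem: network leakage before the tap."""
    return treated * (1 - leakage_frac)

def meta_system(rainfall_mm):
    """Delivered supply as the composition of the three subsystem models."""
    return distributed_supply(treatment_output(catchment_yield(rainfall_mm)))

# Explore system behaviour under an explicit disturbance (a dry spell):
for rain in (300, 150, 50):
    print(rain, round(meta_system(rain), 1))
```

Because each level is an explicit function, a change at one level (e.g. a lower runoff coefficient or tighter plant capacity) propagates visibly to the higher-order outcome, which is the kind of cross-level analysis the meta-system framing is meant to enable.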
Abstract:
Background: Although the detrimental impact of major depressive disorder (MDD) at the individual level has been described, its global epidemiology remains unclear given limitations in the data. Here we present the modelled epidemiological profile of MDD, dealing with heterogeneity in the data, enforcing internal consistency between epidemiological parameters and making estimates for world regions with no empirical data. These estimates were used to quantify the burden of MDD for the Global Burden of Disease Study 2010 (GBD 2010). Method: Analyses drew on data from our existing literature review of the epidemiology of MDD. DisMod-MR, the latest version of the generic disease modelling system redesigned as a Bayesian meta-regression tool, derived prevalence by age, year and sex for 21 regions. Prior epidemiological knowledge and study- and country-level covariates adjusted sub-optimal raw data. Results: There were over 298 million cases of MDD globally at any point in time in 2010, with the highest proportion of cases occurring between 25 and 34 years. Global point prevalence was very similar across time (4.4% (95% uncertainty: 4.2-4.7%) in 1990; 4.4% (4.1-4.7%) in 2005 and 2010), but higher in females (5.5%, 5.0-6.0%) than in males (3.2%, 3.0-3.6%) in 2010. Regions in conflict had higher prevalence than those with no conflict. The annual incidence of an episode of MDD followed a similar age and regional pattern to prevalence but was about one and a half times higher, consistent with an average duration of 37.7 weeks. Conclusion: We were able to integrate available data, including those from high-quality surveys and sub-optimal studies, into a model adjusting for known methodological sources of heterogeneity. We were also able to estimate the epidemiology of MDD in regions with no available data. This informed GBD 2010 and the public health field with a clearer understanding of the global distribution of MDD.
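The internal consistency the model enforces can be checked with the standard identity point prevalence ≈ incidence × average episode duration, a back-of-the-envelope calculation using the figures quoted above:

```python
prevalence = 0.044        # global point prevalence, 2010 (4.4%)
duration_weeks = 37.7     # average episode duration
duration_years = duration_weeks / 52.0

# Rearranged identity: incidence = prevalence / duration.
implied_incidence = prevalence / duration_years
ratio = implied_incidence / prevalence

print(f"implied annual incidence: {implied_incidence:.3f}")  # about 0.061
print(f"ratio to prevalence:      {ratio:.2f}")              # about 1.38
```

The implied incidence comes out roughly 1.4 times the point prevalence, matching the abstract's "about one and a half times higher" and illustrating the consistency constraint DisMod-MR imposes between parameters.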
Abstract:
This paper describes students’ developing meta-representational competence, drawn from the second phase of a longitudinal study, Transforming Children’s Mathematical and Scientific Development. A group of 21 highly able Grade 1 students was engaged in mathematics/science investigations as part of a data modelling program. A pedagogical approach focused on students’ interpretation of categorical and continuous data was implemented through researcher-directed weekly sessions over a 2-year period. Fine-grained analysis of the developmental features and explanations of their graphs showed that explicit pedagogical attention to conceptual differences between categorical and continuous data was critical to development of inferential reasoning.
Abstract:
Introduction: Automated weaning systems may improve adaptation of mechanical support for a patient's ventilatory needs and facilitate systematic and early recognition of their ability to breathe spontaneously and the potential for discontinuation of ventilation. Our objective was to compare mechanical ventilator weaning duration for critically ill adults and children when managed with automated systems versus non-automated strategies. Secondary objectives were to determine differences in duration of ventilation, intensive care unit (ICU) and hospital length of stay (LOS), mortality, and adverse events. Methods: Electronic databases were searched to 30 September 2013 without language restrictions. We also searched conference proceedings, trial registration websites, and article reference lists. Two authors independently extracted data and assessed risk of bias. We combined data using random-effects modelling. Results: We identified 21 eligible trials totalling 1,676 participants. Pooled data from 16 trials indicated that automated systems reduced the geometric mean weaning duration by 30% (95% confidence interval (CI) 13% to 45%), with substantial heterogeneity (I2 = 87%, P < 0.00001). Reduced weaning duration was found with mixed or medical ICU populations (42%, 95% CI 10% to 63%) and Smartcare/PS™ (28%, 95% CI 7% to 49%) but not with surgical populations or other systems. Automated systems reduced ventilation duration with no heterogeneity (10%, 95% CI 3% to 16%) and ICU LOS (8%, 95% CI 0% to 15%). There was no strong evidence of an effect on mortality, hospital LOS, reintubation, self-extubation or non-invasive ventilation following extubation. Automated systems reduced prolonged mechanical ventilation and tracheostomy. Overall quality of evidence was high. Conclusions: Automated systems may reduce weaning and ventilation duration and ICU stay. Due to substantial trial heterogeneity an adequately powered, high-quality, multi-centre randomized controlled trial is needed.
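A percentage reduction in a geometric mean corresponds to pooling log-ratios of durations and back-transforming. The mechanics of the random-effects pooling can be sketched with a minimal DerSimonian-Laird estimator on made-up trial data (the log-ratios and variances below are invented, not the review's data):

```python
import math

# Hypothetical trials: log(ratio of geometric mean weaning durations,
# automated / non-automated) and the variance of each log-ratio.
log_ratios = [-0.5, -0.2, -0.6, -0.1]
variances = [0.04, 0.05, 0.03, 0.06]

def dersimonian_laird(y, v):
    """Random-effects pooled estimate of a set of log-ratios."""
    w = [1 / vi for vi in v]
    fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)  # between-trial variance
    w_re = [1 / (vi + tau2) for vi in v]
    return sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)

pooled = dersimonian_laird(log_ratios, variances)
reduction = 1 - math.exp(pooled)  # back-transform to a percent reduction
print(f"pooled reduction in geometric mean duration: {reduction:.0%}")
```

The between-trial variance `tau2` is what widens the weights when trials disagree, which is how the substantial heterogeneity reported above (I2 = 87%) feeds into the pooled estimate and its interval.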
Abstract:
One of the main purposes of building a battery model is monitoring and control during battery charging/discharging, as well as estimating key battery quantities, such as the state of charge, for electric vehicles. However, models based on the electrochemical reactions within the battery are highly complex and difficult to compute using conventional approaches. Radial basis function (RBF) neural networks have been widely used to model complex systems for estimation and control purposes, but the optimization of both the linear and non-linear parameters in the RBF model remains a key issue. A recently proposed meta-heuristic algorithm named Teaching-Learning-Based Optimization (TLBO) requires no preset algorithm parameters and performs well in non-linear optimization. In this paper, a novel self-learning TLBO-based RBF model is proposed for modelling electric vehicle batteries. The modelling approach has been applied to two battery testing data sets and compared with other RBF-based battery models; the training and validation results confirm the efficacy of the proposed method.
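The split between the RBF model's linear and non-linear parameters can be seen in a minimal sketch: with Gaussian centres and widths fixed (these are the non-linear parameters a method like TLBO would tune), the output weights are a linear least-squares problem. The toy discharge-curve data below are illustrative only:

```python
import numpy as np

def rbf_design(x, centers, width):
    """Gaussian RBF feature matrix for 1-D inputs."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

# Toy data: state of charge vs. terminal voltage (illustrative shape only).
rng = np.random.default_rng(0)
soc = np.linspace(0.1, 1.0, 30)
voltage = 3.0 + 1.2 * soc - 0.4 * soc ** 2 + 0.01 * rng.standard_normal(30)

# Non-linear parameters (centres, width): fixed here; TLBO-style training
# would search over these. Linear parameters (weights): solved exactly.
centers = np.linspace(0.1, 1.0, 6)
width = 0.2
phi = rbf_design(soc, centers, width)
weights, *_ = np.linalg.lstsq(phi, voltage, rcond=None)

pred = phi @ weights
rmse = float(np.sqrt(np.mean((pred - voltage) ** 2)))
print(f"training RMSE: {rmse:.4f} V")
```

Wrapping the inner least-squares solve inside an outer search over `centers` and `width` is the general shape of the hybrid optimization the paper describes.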
Abstract:
BACKGROUND: Despite vaccines and improved medical intensive care, clinicians must remain vigilant for possible Meningococcal Disease in children. The objective was to establish whether the procalcitonin test was a cost-effective adjunct for detecting prodromal Meningococcal Disease in children presenting at the emergency department with fever without source.
METHODS AND FINDINGS: Data to evaluate the procalcitonin, C-reactive protein and white cell count tests as indicators of Meningococcal Disease were collected from six independent studies identified through a systematic literature search applying PRISMA guidelines. The data included 881 children with fever without source in developed countries. The optimal cut-off value for each of the procalcitonin, C-reactive protein and white cell count tests as an indicator of Meningococcal Disease was determined. Summary Receiver Operating Characteristic (SROC) analysis determined the overall diagnostic performance of each test with 95% confidence intervals. A decision analytic model was designed to reflect realistic clinical pathways for a child presenting with fever without source by comparing two diagnostic strategies: standard testing using combined C-reactive protein and white cell count tests, versus standard testing plus the procalcitonin test. The costs of each of the four diagnosis groups (true positive, false negative, true negative and false positive) were assessed from a National Health Service payer perspective. The procalcitonin test was more accurate (sensitivity = 0.89, 95% CI = 0.76-0.96; specificity = 0.74, 95% CI = 0.4-0.92) for early Meningococcal Disease than standard testing alone (sensitivity = 0.47, 95% CI = 0.32-0.62; specificity = 0.8, 95% CI = 0.64-0.9). Decision analytic model outcomes indicated that the incremental cost-effectiveness ratio for the base case was -£8,137.25 (-US$13,371.94) per correctly treated patient.
CONCLUSIONS: Procalcitonin plus the standard recommended tests improved the discriminatory ability for fatal Meningococcal Disease and was more cost-effective; it was also a superior biomarker in infants. Further research is recommended on point-of-care procalcitonin testing and on Markov modelling to incorporate cost per QALY with a lifetime model.
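The decision model's arithmetic follows directly from the reported sensitivity and specificity: each strategy partitions patients into the four diagnosis groups at a given disease prevalence, and the incremental cost-effectiveness ratio divides the cost difference by the difference in correctly treated patients. The sensitivities/specificities below are the abstract's figures; the prevalence and per-group costs are placeholders, not the paper's inputs:

```python
def strategy_outcomes(sens, spec, prevalence, n=1000):
    """Expected TP/FN/TN/FP counts per n children at the given prevalence."""
    diseased = prevalence * n
    healthy = n - diseased
    return {
        "TP": sens * diseased, "FN": (1 - sens) * diseased,
        "TN": spec * healthy, "FP": (1 - spec) * healthy,
    }

def expected_cost(outcomes, costs):
    return sum(outcomes[k] * costs[k] for k in outcomes)

# Hypothetical per-patient costs by diagnosis group (GBP); a missed case
# (FN) is assumed far more costly than a false alarm (FP).
costs_std = {"TP": 5000, "FN": 40000, "TN": 50, "FP": 2000}
costs_pct = {k: v + 10 for k, v in costs_std.items()}  # add a test cost

std = strategy_outcomes(sens=0.47, spec=0.80, prevalence=0.02)
pct = strategy_outcomes(sens=0.89, spec=0.74, prevalence=0.02)

delta_cost = expected_cost(pct, costs_pct) - expected_cost(std, costs_std)
delta_tp = pct["TP"] - std["TP"]
icer = delta_cost / delta_tp
print(f"ICER: {icer:,.0f} GBP per extra correctly treated patient")
```

With these placeholder costs the ICER comes out negative: the added test costs less than the missed cases it prevents, which is the direction of the paper's reported negative base-case ICER.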