41 results for conceptual data modelling
Abstract:
This article reconceptualizes shared rule and uses novel data to measure it, thereby addressing two shortcomings of the federal literature. First, while most studies focus on self-rule, a largely neglected question is how lower-level governments can influence politics at a higher level in the absence of “second” chambers. The answer is through shared rule. Second, even when addressing this question, scholars concentrate on constitutional-administrative aspects of vertical intergovernmentalism, neglecting more informal, “political” dynamics. Comparing the twenty-six Swiss cantons yields two lessons for federal studies: that shared rule is multifaceted and complex, and that studying informal territorial actors as well as direct political processes is indispensable for understanding how power is actually distributed in federal political systems.
Abstract:
BACKGROUND Pathogenic bacteria are often carried asymptomatically in the nasopharynx. Bacterial carriage can be reduced by vaccination and has been used as an alternative endpoint to clinical disease in randomised controlled trials (RCTs). Vaccine efficacy (VE) is usually calculated as 1 minus a measure of effect. Estimates of VE from cross-sectional carriage data collected in RCTs are usually based on prevalence odds ratios (PORs) and prevalence ratios (PRs), but it is unclear when these should be measured. METHODS We developed dynamic compartmental transmission models simulating RCTs of a vaccine against a carried pathogen to investigate how VE can best be estimated from cross-sectional carriage data, when carriage should optimally be assessed, and to which factors this timing is most sensitive. In the models, the vaccine could change carriage acquisition and clearance rates (leaky vaccine); values for these effects were explicitly defined (f_acq, 1/f_dur). The POR and PR were calculated from model outputs. Models differed in the infection source: other participants or external sources unaffected by the trial. Simulations using multiple vaccine doses were compared to empirical data. RESULTS The combined VE against acquisition and duration calculated using the POR, VE^_acq.dur = (1 − POR) × 100, best estimates the true VE, VE_acq.dur = (1 − f_acq × f_dur) × 100, for leaky vaccines in most scenarios. The mean duration of carriage was the most important factor determining the time until VE^_acq.dur first approximates VE_acq.dur: if the mean duration of carriage is 1-1.5 months, up to 4 months are needed; if the mean duration is 2-3 months, up to 8 months are needed. Minor differences were seen between models with different infection sources. In RCTs with shorter intervals between vaccine doses, it takes longer after the last dose until VE^_acq.dur approximates VE_acq.dur. CONCLUSION The timing of sample collection should be considered when interpreting vaccine efficacy against bacterial carriage measured in RCTs.
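To make the estimator concrete, the following is a minimal sketch, not the authors' model: a two-arm trial with an external infection source and a leaky vaccine multiplying the acquisition rate by f_acq and the carriage duration by f_dur (all parameter values are assumed for illustration). It shows VE^_acq.dur = (1 − POR) × 100 approaching the true VE_acq.dur only after several mean carriage durations have elapsed.

```python
# Minimal two-arm carriage-trial sketch (illustrative assumptions, not the
# authors' model): external force of infection, leaky vaccine acting on
# acquisition (f_acq) and carriage duration (f_dur).
import numpy as np
from scipy.integrate import solve_ivp

lam = 0.1                  # force of infection per month (assumed)
dur = 1.5                  # mean carriage duration in months (assumed)
gamma = 1.0 / dur          # clearance rate
f_acq, f_dur = 0.7, 0.8    # assumed vaccine effects (multipliers < 1)

def deriv(t, y):
    c_ctrl, c_vacc = y     # carriage prevalence in control / vaccine arm
    dc = lam * (1 - c_ctrl) - gamma * c_ctrl
    dv = f_acq * lam * (1 - c_vacc) - (gamma / f_dur) * c_vacc
    return [dc, dv]

t = np.linspace(0.01, 12, 400)                 # months since vaccination
sol = solve_ivp(deriv, (t[0], t[-1]), [0.0, 0.0], t_eval=t)
c, v = sol.y
por = (v / (1 - v)) / (c / (1 - c))            # prevalence odds ratio
ve_hat = (1 - por) * 100                       # VE^_acq.dur
ve_true = (1 - f_acq * f_dur) * 100            # VE_acq.dur

for month in (1, 2, 4, 8):
    i = int(np.argmin(np.abs(t - month)))
    print(f"month {month}: VE^_acq.dur = {ve_hat[i]:4.1f}%  (true: {ve_true:.1f}%)")
```

At times close to vaccination the odds ratio reflects only the acquisition effect (POR ≈ f_acq), so sampling too early understates the duration component; this is the timing sensitivity the abstract describes.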
Abstract:
This study investigated the empirical differentiation of prospective memory, executive functions, and metacognition and their structural relationships in 119 elementary school children (M = 95 months, SD = 4.8 months). These cognitive abilities share many characteristics on the theoretical level and are all highly relevant in many everyday contexts when intentions must be executed. Nevertheless, their empirical relationships have not been examined on the latent level, although an empirical approach would contribute to our knowledge concerning the differentiation of cognitive abilities during childhood. We administered a computerized event-based prospective memory task, three executive function tasks (updating, inhibition, shifting), and a metacognitive control task in the context of spelling. Confirmatory factor analysis revealed that the three cognitive abilities are already empirically differentiable in young elementary school children. At the same time, prospective memory and executive functions were found to be strongly related, and there was also a close link between prospective memory and metacognitive control. Furthermore, executive functions and metacognitive control were marginally significantly related. The findings are discussed within a framework of developmental differentiation and conceptual similarities and differences.
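As a hedged illustration of what testing differentiation "on the latent level" involves, a three-factor confirmatory factor model could be specified as below. The indicator names and the data file are hypothetical, and the semopy package (which accepts lavaan-style model syntax) is an assumption of this sketch, not the authors' toolchain.

```python
# Hypothetical three-factor CFA sketch: prospective memory (PM), executive
# functions (EF), and metacognitive control (MC) as correlated latent factors.
import pandas as pd
from semopy import Model  # assumed SEM package with lavaan-style syntax

desc = """
PM =~ pm_acc1 + pm_acc2 + pm_acc3
EF =~ updating + inhibition + shifting
MC =~ mc_control1 + mc_control2
PM ~~ EF
PM ~~ MC
EF ~~ MC
"""

data = pd.read_csv("children_scores.csv")  # hypothetical per-child task scores
model = Model(desc)
model.fit(data)
print(model.inspect())  # factor loadings and latent correlations
```

A good fit of such a model relative to one- or two-factor alternatives is what supports the conclusion that the three abilities are already empirically differentiable at this age.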
Abstract:
OBJECTIVES Many paediatric antiretroviral therapy (ART) programmes in Southern Africa rely on CD4⁺ cell counts to monitor ART. We assessed the benefit of replacing CD4⁺ with viral load monitoring. DESIGN A mathematical modelling study. METHODS A simulation model of HIV progression over 5 years in children on ART, parameterized by data from seven South African cohorts. We simulated treatment programmes with 6-monthly CD4⁺ monitoring or 6- or 12-monthly viral load monitoring, and compared mortality, second-line ART use, immunological failure and time spent on failing ART. In further analyses, we varied the rate of virological failure and assumed that this rate is higher with CD4⁺ than with viral load monitoring. RESULTS About 7% of children were predicted to die within 5 years, independent of the monitoring strategy. Compared with CD4⁺ monitoring, 12-monthly viral load monitoring reduced the 5-year risk of immunological failure from 1.6% to 1.0% and the mean time spent on failing ART from 6.6 to 3.6 months; 1% of children with CD4⁺ monitoring compared with 12% with viral load monitoring switched to second-line ART. Differences became larger when assuming higher rates of virological failure. When assuming higher virological failure rates with CD4⁺ than with viral load monitoring, up to 4.2% of children with CD4⁺ compared with 1.5% with viral load monitoring experienced immunological failure; the mean time spent on failing ART was 27.3 months with CD4⁺ monitoring and 6.0 months with viral load monitoring. CONCLUSION Viral load monitoring did not affect 5-year mortality, but reduced the time spent on failing ART, improved immunological response and increased switching to second-line ART.
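A toy Monte Carlo sketch of the monitoring logic follows (the failure rate, visit schedules, and the assumed two-year immunological lag are invented for illustration; the study used a parameterised simulation model of HIV progression): viral load detects virological failure at the next scheduled visit, while CD4⁺ monitoring reacts only once immunological failure has developed, so more time accumulates on failing ART.

```python
# Toy comparison of monitoring strategies (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(1)
n, years = 10_000, 5.0
fail_rate = 0.05                             # assumed failures per child-year

t_fail = rng.exponential(1 / fail_rate, n)   # time of virological failure (years)
t_fail[t_fail > years] = np.inf              # no failure within the 5-year horizon

def mean_months_on_failing_art(visit_interval, detection_lag):
    """Failing ART runs from failure until the first visit that can detect it."""
    detected = np.ceil((t_fail + detection_lag) / visit_interval) * visit_interval
    end = np.minimum(detected, years)
    months = np.where(np.isfinite(t_fail), np.maximum(end - t_fail, 0.0), 0.0) * 12
    return months.mean()

for label, interval, lag in [("viral load, 6-monthly ", 0.5, 0.0),
                             ("viral load, 12-monthly", 1.0, 0.0),
                             ("CD4, 6-monthly        ", 0.5, 2.0)]:
    print(f"{label}: {mean_months_on_failing_art(interval, lag):4.1f} "
          "months on failing ART")
```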
Abstract:
This paper presents a rule-based expert system that calculates the temporal variability of the release of wet snow avalanches, under the assumption of avalanche triggering without the loading of new snow. The knowledge base of the model is derived from investigations of the system behaviour of wet snow avalanches in the Italian Ortles Alps and is represented as a fuzzy logic rule base. Input parameters of the expert system are numerical and linguistic variables: measurable meteorological and topographical factors and observable characteristics of the snow cover. The output of the inference method is the quantified release disposition for wet snow avalanches. By combining topographical parameters with the spatial interpolation of the calculated release disposition, a hazard index map is dynamically generated. Furthermore, the spatial and temporal variability of the damage potential on roads exposed to wet snow avalanches can be quantified, expressed as the number of persons at risk. The application of the rule base to the available data in the study area generated plausible results. The study demonstrates the potential of expert systems and fuzzy logic in the field of natural hazard monitoring and risk management.
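The flavour of such a rule base can be sketched in a few lines of a Mamdani-style toy (the membership functions, rules, and class scores below are invented; the paper's knowledge base is built from field investigations in the Ortles Alps):

```python
# Toy fuzzy inference for wet snow avalanche release disposition.
def tri(x, a, b, c):
    """Triangular membership function peaking at b on the interval (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def release_disposition(air_temp_c, wetness_index):
    """Max-min inference over a hand-made rule base, then weighted-average
    defuzzification to a single disposition value in [0, 1]."""
    warm = tri(air_temp_c, 0, 6, 12)
    mild = tri(air_temp_c, -4, 0, 4)
    wet = tri(wetness_index, 0.4, 0.8, 1.2)      # snow-cover wetness, 0..1
    moist = tri(wetness_index, 0.1, 0.4, 0.7)

    rules = {
        "high": min(warm, wet),                  # warm air AND wet snowpack
        "medium": max(min(warm, moist), min(mild, wet)),
        "low": min(mild, moist),
    }
    score = {"low": 0.2, "medium": 0.5, "high": 0.9}
    total = sum(rules.values())
    return sum(m * score[k] for k, m in rules.items()) / total if total else 0.0

print(f"release disposition: {release_disposition(7.0, 0.75):.2f}")
```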
Abstract:
The fatality risk caused by avalanches on road networks can be analysed with a long-term approach, resulting in a mean value of risk, or with emphasis on short-term fluctuations due to the temporal variability of both the hazard potential and the damage potential. In this study, the approach for analysing the long-term fatality risk was adapted to model the highly variable short-term risk, with emphasis on the temporal variability of the damage potential and the related risk peaks. For defined hazard scenarios resulting from classified amounts of snow accumulation, the fatality risk was calculated by modelling the hazard potential and observing the traffic volume. The avalanche occurrence probability was calculated using a statistical relationship between new snow height and observed avalanche releases, and the number of persons at risk was determined from the recorded traffic density. The method yields a value for the fatality risk within the observed time frame for the studied road segment. Both the long-term and the short-term fatality risk due to snow avalanches were compared with the average fatality risk due to traffic accidents. The application of the method showed that the long-term avalanche risk is lower than the fatality risk due to traffic accidents, whereas the analysis of short-term avalanche-induced fatality risk revealed risk peaks 50 times higher than the statistical accident risk. Apart from situations with a high hazard level and high traffic density, risk peaks result both from a high hazard level combined with a low traffic density and from a high traffic density combined with a low hazard level. This provides evidence for the importance of the temporal variability of the damage potential for risk simulations on road networks. The assumed dependence of the risk calculation on the three-day sum of precipitation is a simplified model, so further research is needed for an improved determination of the diurnal avalanche probability. Nevertheless, the presented approach may serve as a conceptual step towards risk-based decision-making in risk management.
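The underlying risk arithmetic can be sketched as follows (every number is invented for illustration; the study derives occurrence probabilities from new-snow statistics and persons at risk from recorded traffic densities). The two example calls reproduce the qualitative point that risk peaks can arise from either a high hazard level or a high traffic density:

```python
# Back-of-envelope avalanche fatality risk on a road segment (toy values).
def persons_at_risk(veh_per_hour, speed_kmh, exposed_km, occupants=1.6):
    """Mean number of persons inside the endangered road section."""
    vehicles_in_section = veh_per_hour * exposed_km / speed_kmh
    return vehicles_in_section * occupants

def fatality_risk(p_release_per_hour, veh_per_hour, speed_kmh,
                  exposed_km, lethality=0.3):
    """Expected fatalities per hour = hazard x exposure x vulnerability."""
    return (p_release_per_hour
            * persons_at_risk(veh_per_hour, speed_kmh, exposed_km)
            * lethality)

storm_quiet = fatality_risk(1e-3, 50, 60, 0.5)   # high hazard, little traffic
calm_busy = fatality_risk(1e-5, 800, 60, 0.5)    # low hazard, commuter peak
print(f"storm, quiet road: {storm_quiet:.2e} expected fatalities/h")
print(f"calm, busy road:   {calm_busy:.2e} expected fatalities/h")
```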
Abstract:
The adaptation potential of forests to rapid climatic changes can be assessed from vegetation dynamics during past climatic changes as preserved in fossil pollen data. However, pollen data reflect the integrated effects of climate and biotic processes, such as establishment, survival, competition, and migration. To disentangle these processes, we compared an annually laminated late Würm and Holocene pollen record from the Central Swiss Plateau with simulations of a dynamic forest patch model. All input data used in the simulations were largely independent of pollen data, i.e. the presented analysis is non-circular. Temperature and precipitation scenarios were based on reconstructions from pollen-independent sources, and the earliest arrival times of the species at the study site after the last glacial were inferred from pollen maps. We ran a series of simulations under different combinations of climate and immigration scenarios and, in addition, examined the sensitivity of the simulated presence/absence of four major species to changes in the climate scenario. The pattern of the pollen record could partly be explained by the climate scenario used, mostly by temperature. However, some features, in particular the absence of most species during the late Würm, could only be simulated if the winter temperature anomalies of the scenario were decreased considerably. Consequently, we had to assume in the simulations that most species immigrated during or after the Younger Dryas (12 000 years BP), Abies and Fagus even later. Given the timing of tree species immigration, the vegetation was in equilibrium with climate during long periods, but responded with lags on the time-scale of centuries to millennia, caused by secondary succession after rapid climatic changes such as at the end of the Younger Dryas, or by the immigration of dominant taxa. Climate influenced the tree taxa both directly and indirectly by changing inter-specific competition. We conclude that during the present rapid climatic change, too, species migration might be an important process, particularly where geographic barriers such as the Alps lie in the migration path.
Abstract:
Measuring the impact of releasing data to the public is challenging, since effects may not be directly linked to particular open data activities, or substantial impact may occur only several years after the data were published. This paper proposes a framework for assessing the impact of releasing open data by applying the Social Return on Investment (SROI) approach. SROI was developed for organizations that intend to generate social and environmental benefits, which fits the purpose of most open data initiatives. We link the four steps of SROI (input, output, outcome, impact) with the 14 high-value data categories of the G8 Open Data Charter to create a matrix of open data examples, activities, and impacts in each of the data categories. This Impact Monitoring Framework helps data providers navigate the impact space of open data and lays out the conceptual basis for further research.
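A minimal sketch of the SROI arithmetic underlying the framework follows (the figures, the discount rate, and the deadweight/attribution corrections are illustrative assumptions, not values from the paper):

```python
# Toy SROI: discounted, corrected outcome value divided by the investment.
def sroi(investment, yearly_benefits, discount_rate=0.035,
         deadweight=0.3, attribution=0.5):
    """yearly_benefits: monetised outcomes per year; deadweight removes what
    would have happened anyway, attribution keeps only the share caused by
    the open data release."""
    present_value = sum(
        b * (1 - deadweight) * attribution / (1 + discount_rate) ** (t + 1)
        for t, b in enumerate(yearly_benefits)
    )
    return present_value / investment

# e.g. publishing a transport dataset (one of the G8 high-value categories):
ratio = sroi(investment=200_000, yearly_benefits=[80_000, 120_000, 150_000])
print(f"SROI ratio: {ratio:.2f}")
```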
Abstract:
This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems and is increasingly being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics that cannot be directly measured, for example, wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases, and CFD modelling is well suited to minimally invasive patient assessment. Patient-specific modelling (incorporating data unique to the individual) and multi-scale modelling (combining models of different length- and time-scales) enable individualised risk prediction and virtual treatment planning, representing a significant departure from the traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations, which, when combined with population-scale numerical models, have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine; while it is potentially highly beneficial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges.
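As a hedged aside, even a one-line analytical relation conveys why flow modelling yields metrics that imaging alone cannot: for fully developed Poiseuille flow, wall shear stress follows directly from flow rate and lumen radius. The values below are illustrative; real cardiovascular CFD resolves three-dimensional, pulsatile, patient-specific fields.

```python
# Wall shear stress in an idealised vessel: tau_w = 4*mu*Q / (pi * R^3).
import math

mu = 3.5e-3   # blood dynamic viscosity, Pa*s (Newtonian approximation)
Q = 1.0e-6    # volumetric flow rate, m^3/s (~60 mL/min, illustrative)
R = 1.5e-3    # lumen radius, m

tau_w = 4 * mu * Q / (math.pi * R ** 3)
print(f"wall shear stress ~ {tau_w:.2f} Pa")  # ~1.3 Pa, a physiological order
```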
Abstract:
Project justification is regarded as one of the major methodological deficits in Data Warehousing practice. The special nature of Data Warehousing benefits and the large share of infrastructure-related activities are cited as reasons for applying inappropriate methods, performing incomplete evaluations, or even omitting justifications entirely. In this paper, the economic justification of Data Warehousing projects is analyzed, and first results are presented from a large academia-industry collaboration project on non-technical issues of Data Warehousing. As conceptual foundations, the role of the Data Warehouse system in corporate application architectures is analyzed and the specific properties of Data Warehousing projects are discussed. Based on an analysis of the applicability of traditional approaches to economic IT project justification, basic steps and responsibilities for the justification of Data Warehousing projects are derived.
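To make concrete one traditional approach whose applicability the paper analyses, here is a plain net-present-value sketch (all cash flows invented): because the infrastructure investment is front-loaded while benefits accrue to later application projects, an isolated project-level justification can easily come out negative.

```python
# Toy NPV of a Data Warehousing project (illustrative figures only).
def npv(rate, cashflows):
    """cashflows[0] occurs now; cashflows[t] at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

cashflows = [-1_500_000,   # year 0: DWH infrastructure build
             400_000,      # years 1-3: benefits realised by applications
             600_000,
             700_000]
print(f"NPV at 8%: {npv(0.08, cashflows):,.0f}")  # negative despite real benefits
```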
Abstract:
Synopsis: Sport organisations are facing multiple challenges originating from an increasingly complex and dynamic environment in general, and from internal changes in particular. Our study seeks to reveal and analyse the causes of professionalization processes in international sport federations, the forms resulting from them, and the related consequences.
Abstract:
AIM OF ABSTRACT/PAPER - RESEARCH QUESTION
Sport organisations are facing multiple challenges originating from an increasingly complex and dynamic environment in general, and from internal changes in particular. In this context, professionalization seems to have been adopted by sport organisations as an appropriate strategy to respond to pressures such as becoming more “business-like”. The ongoing study seeks to reveal and analyse the internal and external causes of professionalization processes in international sport federations, the forms resulting from them (e.g. organisational, managerial, economic), and the related consequences for objectives, values, governance methods, performance management and rationalisation.
THEORETICAL BACKGROUND/LITERATURE REVIEW
Studies on sport as a specific non-profit sector mainly focus on the “professionalization of individuals” (Thibault, Slack & Hinings, 1991), often within sport clubs (Thiel, Meier & Cachay, 2006) and national sport federations (Seippel, 2002), or on organisational change (Girginov & Sandanski, 2008; Slack & Hinings, 1987, 1992; Slack, 1985, 2001), leaving broader analyses of governance, management and professionalization in sport organisations an unaccomplished task. To further current research on these topics, we analyse the causes, forms and consequences of professionalization processes in international sport federations. The social theory of action (Coleman, 1986; Esser, 1993) serves as the theoretical framework, from which a multi-level framework for the analysis of sport organisations is derived (Nagel, 2007). Within this framework, sport federations are conceptualised as corporative actors whose objectives are defined and implemented with regard to the interests of member organisations (Heinemann, 2004) and/or other pressure groups. To understand the social action and social structures (Giddens, 1984) of sport federations, two levels are the focus of our analysis: the macro level, examining the environment at large (political, social and economic systems, etc.), and the meso level (Esser, 1999), examining organisational structures, actions and decisions of the federation's headquarters as well as of member organisations.
METHODOLOGY, RESEARCH DESIGN AND DATA ANALYSIS
The multi-level framework is used to gather and analyse information on the causes, forms and consequences of professionalization processes in sport federations. It is applied in a twofold approach: first, an exploratory study based on nine semi-structured interviews with experts from umbrella sport organisations (IOC, WADA, ASOIF, AIOWF, etc.), together with the analysis of related documents, relevant reports (the 2000 IOC report on governance reform, Agenda 2020, etc.) and important moments of change in the Olympic Movement (Olympic revenue share, IOC evaluation criteria, etc.); and second, several case studies. Whereas the exploratory study targets the causes of professionalization at the external, internal and headquarters levels as depicted in the literature, the case studies focus on forms and consequences. Applying our conceptual framework, the analysis of forms is built around three dimensions: (1) individuals (persons and positions); (2) processes and structures (formalisation, specialisation); and (3) activities (strategic planning). With regard to consequences, we centre our attention on expectations of and relationships with stakeholders (e.g. cooperation with business partners), on structure, culture and processes (e.g. governance models, performance), and on expectations of and relationships with member organisations (e.g. centralisation vs. regionalisation). For the case studies, a mixed-method approach is applied to collect the relevant data: questionnaires for more quantitative data, interviews for more qualitative data, and document analysis and observation.
RESULTS, DISCUSSION AND IMPLICATIONS/CONCLUSIONS
With regard to the causes of professionalization processes, we analyse three levels: (1) the external level, where the main pressure derives from financial resources (stakeholders, benefactors) and important turning points (scandals, media pressure, IOC requirements for Olympic sports); (2) the internal level, where pressure from member organisations turned out to be less decisive than assumed (little involvement of member organisations in decision-making); and (3) the headquarters level, where specific economic models (World Cups, other international circuits, World Championships) and organisational structures (decision-making procedures, values, leadership) trigger or hinder a federation's professionalization process. Based on our first analysis, an outline for an economic model is suggested, distinguishing four categories of IFs: “money-generating IFs”, based mainly on commercialisation and strategic alliances; “classical Olympic IFs”, rather reactive and dependent on Olympic revenue; “classical non-Olympic IFs”, largely independent of the Olympic Movement; and “money-receiving IFs”, dependent on benefactors and marked by strong traditions and values. The results regarding forms and consequences will be outlined in the presentation. The first results from the two pilot studies will allow us to refine our conceptual framework for the subsequent case studies, thus extending our data collection and developing fundamental conclusions.
References:
Bayle, E., & Robinson, L. (2007). A framework for understanding the performance of national governing bodies of sport. European Sport Management Quarterly, 7, 249–268.
Chantelat, P. (2001). La professionnalisation des organisations sportives: Nouveaux débats, nouveaux enjeux [Professionalisation of sport organisations]. Paris: L'Harmattan.
Dowling, M., Edwards, J., & Washington, M. (2014). Understanding the concept of professionalization in sport management research. Sport Management Review. Advance online publication. doi:10.1016/j.smr.2014.02.003
Ferkins, L., & Shilbury, D. (2012). Good boards are strategic: What does that mean for sport governance? Journal of Sport Management, 26, 67-80.
Thibault, L., Slack, T., & Hinings, B. (1991). Professionalism, structures and systems: The impact of professional staff on voluntary sport organizations. International Review for the Sociology of Sport, 26, 83-97.