23 results for model complexity

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

100.00%

Publisher:

Abstract:

This study examines the business model complexity of Irish credit unions using a latent class approach to measure structural performance over the period 2002 to 2013. The latent class approach allows the endogenous identification of a multi-class framework for business models based on credit-union-specific characteristics. The analysis finds a three-class system to be appropriate, with the multi-class model dependent on three financial viability characteristics. This finding is consistent with the deliberations of the Irish Commission on Credit Unions (2012), which identified complexity and diversity in the business models of Irish credit unions and recommended that such complexity and diversity could not be accommodated within a one-size-fits-all regulatory framework. The analysis also highlights that two of the classes are subject to diseconomies of scale. This may suggest that credit unions would benefit from a reduction in scale, or perhaps that there is an imbalance in the present change process. Finally, relative performance differences are identified for each class in terms of technical efficiency. This suggests that there is an opportunity for credit unions to improve their performance by adopting within-class best practice or, alternatively, by switching to another class.
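
A hedged sketch of the endogenous class-identification idea: a Gaussian mixture model stands in for the study's latent class estimator, fitted to invented data with three hypothetical financial viability characteristics, and BIC is used to choose the number of classes. The data, feature definitions and estimator are illustrative assumptions, not the study's.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Invented data: each row is a "credit union" described by three hypothetical
# financial viability characteristics; three underlying profiles are planted.
rng = np.random.default_rng(0)
n = 300
features = np.vstack([
    rng.normal([0.05, 0.10, 0.60], 0.02, (n, 3)),   # profile A
    rng.normal([0.02, 0.25, 0.40], 0.02, (n, 3)),   # profile B
    rng.normal([0.08, 0.05, 0.75], 0.02, (n, 3)),   # profile C
])

# Fit mixtures with 1..5 classes and compare BIC; the minimum indicates how
# many classes the data support endogenously (here, three by construction).
bics = {}
for k in range(1, 6):
    gm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(features)
    bics[k] = round(gm.bic(features), 1)
print(bics)
```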

Relevance:

70.00%

Publisher:

Abstract:

Coastal and estuarine landforms provide a physical template that not only accommodates diverse ecosystem functions and human activities, but also mediates flood and erosion risks that are expected to increase with climate change. In this paper, we explore some of the issues associated with the conceptualisation and modelling of coastal morphological change at time and space scales relevant to managers and policy makers. Firstly, we revisit the question of how to define the most appropriate scales at which to seek quantitative predictions of landform change within an age defined by human interference with natural sediment systems and by the prospect of significant changes in climate and ocean forcing. Secondly, we consider the theoretical bases and conceptual frameworks for determining which processes are most important at a given scale of interest and the related problem of how to translate this understanding into models that are computationally feasible, retain a sound physical basis and demonstrate useful predictive skill. In particular, we explore the limitations of a primary scale approach and the extent to which these can be resolved with reference to the concept of the coastal tract and application of systems theory. Thirdly, we consider the importance of different styles of landform change and the need to resolve not only incremental evolution of morphology but also changes in the qualitative dynamics of a system and/or its gross morphological configuration. The extreme complexity and spatially distributed nature of landform systems means that quantitative prediction of future changes must necessarily be approached through mechanistic modelling of some form or another. Geomorphology has increasingly embraced so-called ‘reduced complexity’ models as a means of moving from an essentially reductionist focus on the mechanics of sediment transport towards a more synthesist view of landform evolution. However, there is little consensus on exactly what constitutes a reduced complexity model and the term itself is both misleading and, arguably, unhelpful. Accordingly, we synthesise a set of requirements for what might be termed ‘appropriate complexity modelling’ of quantitative coastal morphological change at scales commensurate with contemporary management and policy-making requirements: 1) the system being studied must be bounded with reference to the time and space scales at which behaviours of interest emerge and/or scientific or management problems arise; 2) model complexity and comprehensiveness must be appropriate to the problem at hand; 3) modellers should seek a priori insights into what kind of behaviours are likely to be evident at the scale of interest and the extent to which the behavioural validity of a model may be constrained by its underlying assumptions and its comprehensiveness; 4) informed by qualitative insights into likely dynamic behaviour, models should then be formulated with a view to resolving critical state changes; and 5) meso-scale modelling of coastal morphological change should reflect critically on the role of modelling and its relation to the observable world.

Relevance:

70.00%

Publisher:

Abstract:

In this work we explore optimising parameters of a physical circuit model relative to input/output measurements, using the Dallas Rangemaster Treble Booster as a case study. A hybrid metaheuristic/gradient descent algorithm is implemented, where the initial parameter sets for the optimisation are informed by nominal values from schematics and datasheets. Sensitivity analysis is used to screen parameters, which informs a study of the optimisation algorithm against model complexity by fixing parameters. The results of the optimisation show a significant increase in the accuracy of model behaviour, but also highlight several key issues regarding the recovery of parameters.
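
A minimal sketch of the hybrid scheme described above, assuming a toy soft-clipping stage in place of the Rangemaster circuit model: nominal parameter values seed the metaheuristic population, and a gradient-based polish refines the best candidate. The simulate function, test signal and settings are illustrative assumptions, not the study's implementation.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical stand-in for a circuit simulation: a soft-clipping stage whose
# "component values" set gain, bias and output level.
def simulate(params, x):
    gain, bias, level = params
    return level * np.tanh(gain * x + bias)

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 2048)                       # input test signal
true_params = np.array([7.5, 0.3, 0.8])            # pretend "measured" device
y_meas = simulate(true_params, x) + 0.01 * rng.standard_normal(x.size)

def loss(params):
    # mean squared error between model output and "measured" output
    return np.mean((simulate(params, x) - y_meas) ** 2)

# Assumed nominal values (as if read from schematics/datasheets) seed the
# initial population, jittered by a tolerance-like spread.
nominal = np.array([5.0, 0.0, 1.0])
bounds = [(0.1, 20.0), (-1.0, 1.0), (0.1, 2.0)]
pop = nominal * (1 + 0.2 * rng.standard_normal((30, 3)))
pop = np.clip(pop, [b[0] for b in bounds], [b[1] for b in bounds])

# Metaheuristic global search; polish=True finishes with a gradient-based
# (L-BFGS-B) local refinement of the best candidate.
result = differential_evolution(loss, bounds, init=pop, seed=1, polish=True)
print(result.x, result.fun)
```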

Relevance:

60.00%

Publisher:

Abstract:

In the identification of complex dynamic systems using fuzzy neural networks, one of the main issues is the curse of dimensionality, which makes it difficult to retain a large number of system inputs or to consider a large number of fuzzy sets. Moreover, due to correlations, not all possible network inputs or regression vectors are necessary, and adding them simply increases the model complexity and degrades the network's generalisation performance. In this paper, the problem is solved by first proposing a fast algorithm for the selection of network terms, and then introducing a refinement procedure to tackle the correlation issue. Simulation results show the efficacy of the method.
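
The sketch below illustrates the general term-selection idea in a linear-in-the-parameters setting (it is not the paper's algorithm): candidate terms, some correlated or irrelevant, are added greedily by training-error reduction, while a held-out validation error typically stops improving once the extra terms only add complexity.

```python
import numpy as np

rng = np.random.default_rng(1)

# Candidate "network terms": two are relevant, one is a correlated copy, the
# rest are irrelevant, so a full model over-parameterises the problem.
n = 200
x = rng.uniform(-1, 1, (n, 3))
y = 1.5 * x[:, 0] - 2.0 * x[:, 1] ** 2 + 0.1 * rng.standard_normal(n)
terms = np.column_stack([
    x[:, 0], x[:, 1] ** 2,                         # relevant terms
    x[:, 0] + 0.05 * rng.standard_normal(n),       # correlated copy
    x[:, 2], x[:, 2] ** 2, x[:, 0] * x[:, 2],      # irrelevant terms
])
tr, va = slice(0, 140), slice(140, n)

selected, remaining = [], list(range(terms.shape[1]))
for _ in range(terms.shape[1]):
    # greedily add the candidate giving the largest drop in training error
    errs = []
    for j in remaining:
        cols = selected + [j]
        theta, *_ = np.linalg.lstsq(terms[tr][:, cols], y[tr], rcond=None)
        errs.append(np.mean((terms[tr][:, cols] @ theta - y[tr]) ** 2))
    j_best = remaining[int(np.argmin(errs))]
    selected.append(j_best)
    remaining.remove(j_best)
    # validation error reveals when added terms no longer help generalisation
    theta, *_ = np.linalg.lstsq(terms[tr][:, selected], y[tr], rcond=None)
    val_err = np.mean((terms[va][:, selected] @ theta - y[va]) ** 2)
    print(f"terms {selected}: validation MSE = {val_err:.4f}")
```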

Relevance:

60.00%

Publisher:

Abstract:

Classification methods with embedded feature selection capability are very appealing for the analysis of complex processes since they allow the analysis of root causes even when the number of input variables is high. In this work, we investigate the performance of three techniques for classification within a Monte Carlo strategy with the aim of root cause analysis. We consider the naive Bayes classifier and the logistic regression model with two different implementations for controlling model complexity, namely a LASSO-like implementation with an L1 norm regularization and a fully Bayesian implementation of the logistic model, the so-called relevance vector machine. Several challenges can arise when estimating such models, mainly linked to the characteristics of the data: a large number of input variables, high correlation among subsets of variables, the situation where the number of variables is higher than the number of available data points, and the case of unbalanced datasets. Using an ecological and a semiconductor manufacturing dataset, we show the advantages and drawbacks of each method, highlighting the superior performance in terms of classification accuracy of the relevance vector machine with respect to the other classifiers. Moreover, we show how the combination of the proposed techniques and the Monte Carlo approach can be used to get more robust insights into the problem under analysis when faced with challenging modelling conditions.
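
A minimal sketch of one of the three techniques, L1-regularised (LASSO-like) logistic regression, run inside a simple Monte Carlo resampling loop that records how often each variable is selected. The synthetic unbalanced dataset and settings are assumptions, and the naive Bayes and relevance vector machine comparisons are omitted.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: many variables, few informative, some redundant
# (correlated), and an unbalanced class distribution.
X, y = make_classification(n_samples=120, n_features=40, n_informative=5,
                           n_redundant=10, weights=[0.8, 0.2], random_state=0)

n_runs = 200
selected = np.zeros(X.shape[1])
acc = []
for run in range(n_runs):
    # Monte Carlo resampling: repeated stratified train/test splits
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=run)
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    clf.fit(Xtr, ytr)
    acc.append(clf.score(Xte, yte))
    selected += (np.abs(clf.coef_[0]) > 1e-8)   # which variables survived L1

print("mean accuracy:", np.mean(acc))
print("selection frequency, first 10 variables:", selected[:10] / n_runs)
```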

Relevance:

40.00%

Publisher:

Abstract:

The influence of predation in structuring ecological communities can be informed by examining the shape and magnitude of the functional response of predators towards prey. We derived functional responses of the ubiquitous intertidal amphipod Echinogammarus marinus towards one of its preferred prey species, the isopod Jaera nordmanni. First, we examined the form of the functional response where prey were replaced following consumption, as compared to the usual experimental design where prey density in each replicate is allowed to deplete. E. marinus exhibited Type II functional responses, i.e. inversely density-dependent predation of J. nordmanni that increased linearly with prey availability at low densities, but decreased with further prey supply. In both prey replacement and non-replacement experiments, handling times and maximum feeding rates were similar. The non-replacement design underestimated attack rates compared to when prey were replaced. We then compared the use of Holling’s disc equation (assuming constant prey density) with the more appropriate Rogers’ random predator equation (accounting for prey depletion) using the prey non-replacement data. Rogers’ equation returned significantly greater attack rates but lower maximum feeding rates, indicating that model choice has significant implications for parameter estimates. We then manipulated habitat complexity and found significantly reduced predation by the amphipod in complex as opposed to simple habitat structure. Further, the functional response changed from a Type II response in simple habitats to a sigmoidal, density-dependent Type III response in complex habitats, which may impart stability to the predator–prey interaction. Enhanced habitat complexity returned significantly lower attack rates, higher handling times and lower maximum feeding rates. These findings illustrate the sensitivity of the functional response to variations in prey supply, model selection and habitat complexity and, further, that E. marinus could potentially determine the local exclusion and persistence of prey through habitat-mediated changes in its predatory functional responses.
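
For reference, the two models contrasted above can be written down directly: Holling's disc equation assumes constant prey density, while Rogers' random predator equation accounts for depletion during the trial and has a closed form via the Lambert W function. The parameter values below are illustrative, not the fitted estimates from the experiments.

```python
import numpy as np
from scipy.special import lambertw

def holling_type_ii(N0, a, h, T=1.0):
    """Holling disc equation: expected prey eaten assuming prey density stays
    constant (prey replaced as they are consumed)."""
    return a * N0 * T / (1.0 + a * h * N0)

def rogers_random_predator(N0, a, h, T=1.0):
    """Rogers' random predator equation: expected prey eaten when prey deplete
    during the trial; closed form via the Lambert W function."""
    w = lambertw(a * h * N0 * np.exp(-a * (T - h * N0))).real
    return N0 - w / (a * h)

densities = np.array([2, 4, 8, 16, 32], dtype=float)
a, h = 1.2, 0.05   # illustrative attack rate and handling time
print(holling_type_ii(densities, a, h))        # can exceed N0 (replacement assumed)
print(rogers_random_predator(densities, a, h)) # always below N0 (depletion)
```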

Relevance:

30.00%

Publisher:

Abstract:

Purpose: The aim of this paper is to explore the issues involved in developing and applying performance management approaches within a large UK public sector department using a multiple stakeholder perspective and an accompanying theoretical framework.

Design/methodology/approach: An initial short questionnaire was used to determine perceptions about the implementation and effectiveness of the new performance management system across the organisation. In total, 700 questionnaires were distributed. Running concurrently with an ethnographic approach, and informed by the questionnaire responses, was a series of semi-structured interviews and focus groups.

Findings: Staff at all levels had an understanding of the new system and perceived it as being beneficial. However, there were concerns that the approach was not continuously managed throughout the year and was in danger of becoming an annual event, rather than an ongoing process. Furthermore, the change process seemed to have advanced without corresponding changes to appraisal and reward and recognition systems. Thus, the business objectives were not aligned with motivating factors within the organisation.

Research limitations/implications: Additional research to test the validity and usefulness of the theoretical model, as discussed in this paper, would be beneficial.

Practical implications: The strategic integration of the stakeholder performance measures and scorecards was found to be essential to producing an overall stakeholder-driven strategy within the case study organisation.

Originality/value: This paper discusses in detail the approach adopted and the progress made by one large UK public sector organisation, as it attempts to develop better relationships with all of its stakeholders and hence improve its performance. This paper provides a concerted attempt to link theory with practice.

Relevance:

30.00%

Publisher:

Abstract:

The identification of nonlinear dynamic systems using linear-in-the-parameters models is studied. A fast recursive algorithm (FRA) is proposed both to select the model structure and to estimate the model parameters. Unlike the orthogonal least squares (OLS) method, FRA solves the least-squares problem recursively over the model order without requiring matrix decomposition. The computational complexity of both algorithms is analyzed, along with their numerical stability. The new method is shown to require much less computational effort and is also numerically more stable than OLS.

Relevance:

30.00%

Publisher:

Abstract:

Food webs are networks describing who is eating whom in an ecological community. By now it is clear that many aspects of food-web structure are reproducible across diverse habitats, yet little is known about the driving force behind this structure. Evolutionary and population dynamical mechanisms have been considered. We propose a model for the evolutionary dynamics of food-web topology and show that it accurately reproduces observed food-web characteristics in the steady state. It is based on the observation that most consumers are larger than their resource species and the hypothesis that speciation and extinction rates decrease with increasing body mass. Results give strong support to the evolutionary hypothesis.

Relevance:

30.00%

Publisher:

Abstract:

Relevant mouse models of E2a-PBX1-induced pre-B cell leukemia are still elusive. We now report the generation of a pre-B leukemia model using E2a-PBX1 transgenic mice, which lack mature and precursor T-cells as a result of engineered loss of CD3epsilon expression (CD3epsilon(-/-)). Using insertional mutagenesis and inverse-PCR, we show that B-cell leukemia development in the E2a-PBX1 x CD3epsilon(-/-) compound transgenic animals is significantly accelerated when compared to control littermates, and document several known and novel integrations in these tumors. Of all common integration sites, a small region of 19 kb in the Hoxa gene locus, mostly between Hoxa6 and Hoxa10, represented 18% of all integrations in the E2a-PBX1 B-cell leukemia and was targeted in 86% of these leukemias compared to 17% in control tumors. Q-PCR assessment of expression levels for most Hoxa cluster genes in these tumors revealed an unprecedented impact of the proviral integrations on Hoxa gene expression, with tumors having one to seven different Hoxa genes overexpressed at levels up to 6600-fold above control values. Together our studies set the stage for modeling E2a-PBX1-induced B-cell leukemia and shed new light on the complexity pertaining to Hox gene regulation. In addition, our results show that the Hoxa gene cluster is preferentially targeted in E2a-PBX1-induced tumors, thus suggesting functional collaboration between these oncogenes in pre-B-cell tumors.

Relevance:

30.00%

Publisher:

Abstract:

This paper is based on research into the transition of young people leaving public care in Romania. Using this specific country example, the paper aims to contribute to present understandings of the psycho-social transition of young people from care to independent living by introducing the use of Bridges (2002) to build on existing theories and literature. The research discussed involved a mixed-methods design and was implemented in three phases: semi-structured interviews with 34 care leavers, focus groups with 32 professionals, and a professional-service user working group. The overall findings confirmed that young people experience two different, but interconnected, transitions - social and psychological - which take place at different paces. A number of theoretical perspectives are explored to make sense of this transition, including attachment theory, focal theory and identity. In addition, a new model for understanding the complex process of transition was adapted from Bridges (2002) to capture the complexity of the psycho-social transition demonstrated by the findings. The paper concludes with messages for leaving and after care services, with an emphasis on managing the psycho-social transition from care to independent living.

Relevance:

30.00%

Publisher:

Abstract:

A new chemical model of the circumstellar envelope surrounding the carbon-rich star IRC+10216 has been developed. This model incorporates a variety of newly measured rapid neutral-neutral reactions between carbon atoms and hydrocarbons and between the radical CN and a variety of stable neutral molecules. In addition, other neutral-neutral reactions in the above two classes or involving atoms such as N or radicals such as C(2n)H have been included with large rate coefficients although they have not yet been studied in the laboratory. Unlike the interstellar case, where the inclusion of these neutral-neutral reactions destroys molecular complexity, our model results for IRC+10216 show that sufficient abundances of large hydrocarbon radicals and cyanopolyynes can be produced to explain observations. We also discuss the formation of H2CN and NH2CN, two potentially observable molecules in IRC+10216.

Relevance:

30.00%

Publisher:

Abstract:

Automated examination timetabling has been addressed by a wide variety of methodologies and techniques over the last ten years or so. Many of the methods in this broad range of approaches have been evaluated on a collection of benchmark instances provided at the University of Toronto in 1996. Whilst the existence of these datasets has provided an invaluable resource for research into examination timetabling, the instances have significant limitations in terms of their relevance to real-world examination timetabling in modern universities. This paper presents a detailed model which draws upon experiences of implementing examination timetabling systems in universities in Europe, Australasia and America. This model represents the problem that was presented in the 2nd International Timetabling Competition (ITC2007). In presenting this detailed new model, this paper describes the examination timetabling track introduced as part of the competition. In addition to the model, the datasets used in the competition are also based on current real-world instances introduced by EventMAP Limited. It is hoped that the interest generated as part of the competition will lead to the development, investigation and application of a host of novel and exciting techniques to address this important real-world search domain. Moreover, the motivating goal of this paper is to close the currently existing gap between theory and practice in examination timetabling by presenting the research community with a rigorous model which represents the complexity of the real-world situation. In this paper we describe the model and its motivations, followed by a full formal definition.
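
As a toy illustration of the kind of hard constraints such a model formalises (and emphatically not the ITC2007 formulation itself, which is far richer), the sketch below checks student clashes and room capacity for an invented assignment of exams to periods and rooms.

```python
from collections import defaultdict

# Invented toy instance: student enrolments, an exam-to-(period, room)
# assignment, and room capacities.
enrolments = {"s1": {"EX1", "EX2"}, "s2": {"EX2", "EX3"}, "s3": {"EX1", "EX3"}}
assignment = {"EX1": ("P1", "R1"), "EX2": ("P2", "R1"), "EX3": ("P1", "R2")}
room_capacity = {"R1": 2, "R2": 1}
exam_size = {e: sum(e in exams for exams in enrolments.values())
             for e in assignment}

def clashes(assignment, enrolments):
    """Students sitting two of their exams in the same period."""
    found = []
    for student, exams in enrolments.items():
        by_period = defaultdict(list)
        for e in exams:
            by_period[assignment[e][0]].append(e)
        found += [(student, es) for es in by_period.values() if len(es) > 1]
    return found

def capacity_violations(assignment, exam_size, room_capacity):
    """Exams whose enrolment exceeds the capacity of their assigned room."""
    return [e for e, (_, room) in assignment.items()
            if exam_size[e] > room_capacity[room]]

print(clashes(assignment, enrolments))                       # s3: EX1 and EX3 in P1
print(capacity_violations(assignment, exam_size, room_capacity))  # ['EX3']
```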

Relevance:

30.00%

Publisher:

Abstract:

Analyses regularly feature claims that European welfare states are in the process of creating an adult worker model. The theoretical and empirical basis of this argument is examined here by looking first at the conceptual foundations of the adult worker model formulation and then at the extent to which social policy reform in western Europe fits with the argument. It is suggested that the adult worker formulation is under-specified. A framework incorporating four dimensions—the treatment of individuals vis-à-vis their family role and status for the purposes of social rights, the treatment of care, the treatment of the family as a social institution, and the extent to which gender inequality is problematized—is developed and then applied. The empirical analysis reveals a strong move towards individualization as social policy promotes and valorizes individual agency and self-sufficiency and shifts some childcare from the family. Yet evidence is also found of continued (albeit changed) familism. Rather than an unequivocal move to an individualized worker model then, a dual earner, gender-specialized, family arrangement is being promoted. The latter is the middle way between the old dependencies and the new “independence.” This makes for complexity and even ambiguity in policy, a manifestation of which is that reform within countries involves concurrent moves in several directions.

Relevance:

30.00%

Publisher:

Abstract:

We define a multi-modal version of Computation Tree Logic (CTL) by extending the language with path quantifiers E_d and A_d, where d denotes one of finitely many dimensions, interpreted over Kripke structures with one total relation for each dimension. As expected, the logic is axiomatised by taking a copy of a CTL axiomatisation for each dimension. Completeness is proved by employing the completeness result for CTL to obtain a model along each dimension in turn. We also show that the logic is decidable and that its satisfiability problem is no harder than the corresponding problem for CTL. We then demonstrate how Normative Systems can be conceived as a natural interpretation of such a multi-dimensional CTL.
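
A small sketch of the semantic setting, assuming invented states and labels: a Kripke structure with one total transition relation per dimension, with EX_d computed directly and EF_d obtained as the usual least fixpoint along that dimension.

```python
# Multi-dimensional Kripke structure: one total relation per dimension.
# States, relations and the labelling are invented for illustration only.
states = {0, 1, 2, 3}
relations = {
    "d1": {0: {1}, 1: {2}, 2: {3}, 3: {3}},
    "d2": {0: {0, 2}, 1: {3}, 2: {1}, 3: {0}},
}
label = {"p": {3}}               # states satisfying atomic proposition p

def EX(d, phi_states):
    """States with some d-successor in phi_states."""
    return {s for s in states if relations[d][s] & phi_states}

def EF(d, phi_states):
    """Least fixpoint: some d-path eventually reaches phi_states."""
    result = set(phi_states)
    while True:
        new = result | EX(d, result)
        if new == result:
            return result
        result = new

print("EX_d1 p:", EX("d1", label["p"]))   # {2, 3}
print("EF_d2 p:", EF("d2", label["p"]))   # every state can d2-reach p here
```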