76 results for NETWORK DESIGN PROBLEMS
in CentAUR: Central Archive, University of Reading - UK
Abstract:
The next few years will see the need to replace a large amount of life-expired switchgear on the UK 11 kV distribution system. The latest technology and alternative equipment have made the choice of replacement a complex task. The authors present an expert system as an aid to the decision process for the design of the 11 kV power distribution network.
Abstract:
This report describes the analysis and development of novel tools for the global optimisation of relevant mission design problems. A taxonomy was created for mission design problems, and an empirical analysis of their optimisational complexity performed: it was demonstrated that the use of global optimisation was necessary for most classes, and this informed the selection of appropriate global algorithms. The selected algorithms were then applied to the different problem classes; Differential Evolution was found to be the most efficient. Considering the specific problem of multiple gravity assist trajectory design, a search space pruning algorithm was developed that displays both polynomial time and space complexity. Empirically, this was shown to typically achieve search space reductions of greater than six orders of magnitude, thus significantly reducing the complexity of the subsequent optimisation. The algorithm was fully implemented in a software package that allows simple visualisation of high-dimensional search spaces, and effective optimisation over the reduced search bounds.
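The Differential Evolution algorithm found most efficient above can be illustrated with a minimal sketch. This is the generic DE/rand/1/bin variant, not the mission-design implementation the report describes; the function name, parameter defaults and bound handling are illustrative.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, generations=100):
    """Minimal DE/rand/1/bin sketch for box-constrained minimisation of f."""
    dim = len(bounds)
    # Random initial population inside the box bounds.
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fitness = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals other than i.
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = random.randrange(dim)  # guarantees at least one mutated dimension
            trial = []
            for j in range(dim):
                if random.random() < CR or j == j_rand:
                    # Differential mutation, clipped back into the bounds.
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)
                else:
                    v = pop[i][j]
                trial.append(v)
            ft = f(trial)
            # Greedy one-to-one survivor selection.
            if ft <= fitness[i]:
                pop[i], fitness[i] = trial, ft
    best = min(range(pop_size), key=lambda i: fitness[i])
    return pop[best], fitness[best]
```

On a smooth test function such as the sphere, this sketch converges reliably; trajectory design problems are far more rugged, which is why the report combines the optimiser with search space pruning.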
Abstract:
The research explores views on inclusive design policy implementation and the learning strategies used in practice by Local Authorities' planning, building control and policy departments in England, and reports emerging research findings. The research aim was developed from an extensive literature review and informed by a pilot study with relevant Local Authority departments. The pilot study highlighted gaps within the process of policy implementation, a lack of awareness of the process, and flaws in the design guidance policy. This helped inform the development of a robust research design using both a survey and semi-structured interviews. The questionnaire targeted key employees within Local Authorities and was designed to establish how employees learn about inclusive design policy and to determine their views on the current approaches to inclusive design policy implementation adopted by their Local Authorities. The questionnaire produced 117 responses. Interestingly, approximately 9 of the 129 Local Authorities approached stated that they were unable to participate, either because an inclusive design policy had not been adopted or because a high workload prevented them from taking part. An emerging finding is a lack of understanding of inclusive design problems, which may lead to problems with inclusive design policy implementation and thus adversely affect how the built environment can be experienced. There is a strong indication from the survey respondents that they are most likely to learn about inclusive design from policy guides produced by their Local Authorities and from their colleagues.
Abstract:
Design patterns are a way of sharing evidence-based solutions to educational design problems. The design patterns presented in this paper were produced through a series of workshops, which aimed to identify Massive Open Online Course (MOOC) design principles from workshop participants' experiences of designing, teaching and learning on these courses. MOOCs present a challenge for the existing pedagogy of online learning, particularly as it relates to promoting peer interaction and discussion. MOOC cohort sizes, participation patterns and diversity of learners mean that discussions can remain superficial, become difficult to navigate, or never develop beyond isolated posts. In addition, MOOC platforms may not provide sufficient tools to support moderation. This paper draws on four case studies of designing and teaching a range of MOOCs, presenting seven design narratives relating to the experience of these MOOCs. Evidence presented in the narratives is abstracted in the form of three design patterns created through a collaborative process using techniques similar to those used in collective autoethnography. The patterns, "Special Interest Discussions", "Celebrity Touch" and "Look and Engage", draw together shared lessons and present possible solutions to the problem of creating, managing and facilitating meaningful discussion in MOOCs through the careful use of staged learning activities and facilitation strategies.
Abstract:
Very large scale scheduling and planning tasks cannot be effectively addressed by fully automated schedule optimisation systems, since many key factors which govern 'fitness' in such cases are unformalisable. This raises the question of an interactive (or collaborative) approach, where fitness is assigned by the expert user. Though well-researched in the domains of interactively evolved art and music, this method is as yet rarely used in logistics. This paper concerns a difficulty shared by all interactive evolutionary systems (IESs), but especially those used for logistics or design problems. The difficulty is that objective evaluation of IESs is severely hampered by the need for expert humans in the loop. This makes it effectively impossible to, for example, determine with statistical confidence any ranking among a decent number of configurations for the parameters and strategy choices. We make headway into this difficulty with an Automated Tester (AT) for such systems. The AT replaces the human in experiments, and has parameters controlling its decision-making accuracy (modelling human error) and a built-in notion of a target solution which may typically be at odds with the solution which is optimal in terms of formalisable fitness. Using the AT, plausible evaluations of alternative designs for the IES can be done, allowing for (and examining the effects of) different levels of user error. We describe such an AT for evaluating an IES for very large scale planning.
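The Automated Tester described above can be sketched as a preference oracle that stands in for the human expert: it holds a target solution, judges candidates by closeness to that target, and errs with a configurable probability to model human mistakes. This is an illustrative sketch; the class and parameter names are hypothetical, not the paper's implementation.

```python
import random

class AutomatedTester:
    """Sketch of an Automated Tester (AT) replacing the human in an
    interactive evolutionary system. Names are illustrative."""

    def __init__(self, target, error_rate=0.1, seed=None):
        self.target = target          # the tester's preferred solution, which may
                                      # differ from the formalisable optimum
        self.error_rate = error_rate  # probability of a mis-judgement (human error)
        self.rng = random.Random(seed)

    def distance(self, candidate):
        # Squared Euclidean distance to the target solution.
        return sum((c - t) ** 2 for c, t in zip(candidate, self.target))

    def prefer(self, a, b):
        """Return the candidate the simulated user picks between two
        alternatives; picks the worse one with probability error_rate."""
        better, worse = (a, b) if self.distance(a) <= self.distance(b) else (b, a)
        return worse if self.rng.random() < self.error_rate else better
```

Running the interactive system many times against such an oracle, at several error rates, is what makes statistically meaningful comparisons between IES configurations feasible without repeated human sessions.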
Abstract:
Studies on learning management systems (LMS) have largely been technical in nature, with an emphasis on the evaluation of the human-computer interaction (HCI) processes involved in using the LMS. This paper reports a study that evaluates the information interaction processes in an eLearning course used to teach applied Statistics. The eLearning course is used as a synonym for an information system. The study explores issues of missing context in information stored in information systems. Using the semiotic framework as a guide, the researchers evaluated an existing eLearning course with a view to proposing a model for designing improved eLearning courses for future eLearning programmes. In this exploratory study, a survey questionnaire was used to collect data from 160 participants in an eLearning course in Statistics in Applied Climatology. The views of the participants were analysed with a focus only on human information interaction issues. Using the semiotic framework as a guide, syntactic, semantic, pragmatic and social context gaps or problems were identified. The information interaction problems identified include ambiguous instructions, inadequate information, lack of sound, and interface design problems, among others. These problems affected the quality of new knowledge created by the participants. The researchers thus highlight the challenges of missing information context when data is stored in an information system. The study concludes by proposing a human information interaction model for improving information interaction quality in the design of eLearning courses on learning management platforms and other information systems.
Abstract:
From a construction innovation systems perspective, firms acquire knowledge from suppliers, clients, universities and the institutional environment. Building information modelling (BIM) involves these firms using new process standards. To understand the implications for interactive learning of using BIM process standards, a case study was conducted with the UK operations of a multinational construction firm. Data are drawn from: a) two workshops involving the firm and a wider industry group; b) observations of practice in the BIM core team and in three ongoing projects; c) 12 semi-structured interviews; and d) secondary publications. The firm uses a set of BIM process standards (IFC, PAS 1192, Uniclass, COBie) in its construction activities. It is also involved in a pilot to implement the COBie standard, supported by technical and management standards for BIM, such as Uniclass and PAS 1192. Analyses suggest that such BIM process standards unconsciously shape the firm's internal and external interactive learning processes. Internally, standards allow engineers to learn from each other by visualising 3D information and talking around designs with operatives to address problems during construction. Externally, the firm participates in trial and pilot projects involving other construction firms, government agencies, universities and suppliers to learn about the standards and access knowledge to solve its specific design problems. Through its BIM manager, the firm provides feedback to standards developers and information technology suppliers. The research contributes by articulating how BIM process standards unconsciously change interactive learning processes in construction practice. Further research could investigate these findings in the wider UK construction innovation system.
Abstract:
In recent years, the unpredictable growth of the Internet has further exposed the congestion problem, one of the problems that have historically affected the network. This paper deals with the design and the evaluation of a congestion control algorithm which adopts a Fuzzy Controller. The analogy between Proportional Integral (PI) regulators and Fuzzy controllers is discussed, and a method to determine the scaling factors of the Fuzzy controller is presented. It is shown that the Fuzzy controller outperforms the PI under traffic conditions which differ from those related to the operating point considered in the design.
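For reference, the discrete-time PI regulator that the fuzzy controller is compared against can be written in a few lines. This is a generic textbook sketch, not the controller from the paper; the gains, setpoint and the toy integrating plant in the test are illustrative.

```python
class PIController:
    """Discrete-time Proportional Integral regulator sketch. A fuzzy
    congestion controller plays an analogous role, with scaling factors
    applied to the error and its change instead of fixed kp/ki gains."""

    def __init__(self, kp, ki, setpoint):
        self.kp = kp              # proportional gain
        self.ki = ki              # integral gain
        self.setpoint = setpoint  # e.g. a target queue length
        self.integral = 0.0       # accumulated error

    def update(self, measurement, dt=1.0):
        # Control signal from the current error and its running integral.
        error = self.setpoint - measurement
        self.integral += error * dt
        return self.kp * error + self.ki * self.integral
```

Because the PI gains are tuned around a single operating point, its performance degrades when traffic departs from that point; the fuzzy controller's nonlinear rule surface is what gives it the advantage reported above.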
Abstract:
Building services are worth about 2% of GDP and are essential for the effective and efficient operation of a building. It is increasingly recognised that the value of a building is related to the way it supports the client organisation's ongoing business operations. Building services are central to the functional performance of buildings and provide the necessary conditions for the health, well-being, safety and security of the occupants. They frequently comprise several technologically distinct sub-systems, and their design and construction require the involvement of numerous disciplines and trades. Designers and contractors working on the same project are frequently employed by different companies. Materials and equipment are supplied by a diverse range of manufacturers. Facilities managers are responsible for the operation of the building services in use. The coordination between these participants is crucially important for achieving optimum performance, but is too often neglected, leaving room for serious faults. Effective integration is therefore essential. Modern technology offers increasing opportunities for integrated personal-control systems for lighting, ventilation and security, as well as interoperability between systems. Opportunities for a new mode of systems integration are provided by the emergence of PFI/PPP procurement frameworks. This paper attempts to establish how systems integration can be achieved in the process of designing, constructing and operating building services. The essence of the paper, therefore, is to envisage the emergent organisational responses to the realisation of building services as an interactive systems network.
Abstract:
A construction algorithm for multioutput radial basis function (RBF) network modelling is introduced by combining a locally regularised orthogonal least squares (LROLS) model selection with a D-optimality experimental design. The proposed algorithm aims to achieve maximised model robustness and sparsity via two effective and complementary approaches. The LROLS method alone is capable of producing a very parsimonious RBF network model with excellent generalisation performance. The D-optimality design criterion enhances the model efficiency and robustness. A further advantage of the combined approach is that the user only needs to specify a weighting for the D-optimality cost in the combined RBF model selecting criterion and the entire model construction procedure becomes automatic. The value of this weighting does not influence the model selection procedure critically and it can be chosen with ease from a wide range of values.
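The combined selection criterion can be sketched as a forward orthogonal least squares loop in which each candidate regressor's error-reduction ratio is augmented by a weighted log term from the D-optimality cost (for orthogonalised regressors the determinant of the information matrix factors into the individual regressor energies). This is a simplified illustration: the local regularisation of LROLS is omitted, and the exact form of the paper's combined criterion is not reproduced; `beta` plays the role of the single user-specified weighting.

```python
import numpy as np

def ols_doptimality(Phi, y, beta=1e-4, n_terms=None):
    """Forward OLS regressor selection with a D-optimality penalty (sketch).
    At each step, pick the candidate maximising its error-reduction ratio
    plus beta * log of its orthogonalised energy, which discourages
    near-dependent, ill-conditioned terms."""
    n, m = Phi.shape
    n_terms = n_terms or m
    Q = Phi.astype(float).copy()   # working copy, deflated (orthogonalised) in place
    selected = []
    yy = y @ y
    for _ in range(n_terms):
        best, best_score = None, -np.inf
        for k in range(m):
            if k in selected:
                continue
            w = Q[:, k]
            e = w @ w
            if e < 1e-12:          # candidate is (numerically) dependent: skip
                continue
            err = (w @ y) ** 2 / (e * yy)   # error-reduction ratio of candidate k
            score = err + beta * np.log(e)  # combined criterion
            if score > best_score:
                best, best_score = k, score
        if best is None:
            break
        selected.append(best)
        w = Q[:, best]
        e = w @ w
        for k in range(m):                   # deflate the remaining candidates
            if k not in selected:
                Q[:, k] -= (w @ Q[:, k]) / e * w
    return selected
```

As the abstract notes, the value of the weighting is not critical: here `beta` only needs to be small enough not to swamp the error-reduction term while still penalising near-zero-energy regressors.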
Abstract:
This paper introduces a new neurofuzzy model construction and parameter estimation algorithm from observed finite data sets, based on a Takagi and Sugeno (T-S) inference mechanism and a new extended Gram-Schmidt orthogonal decomposition algorithm, for the modeling of a priori unknown dynamical systems in the form of a set of fuzzy rules. The first contribution of the paper is the introduction of a one to one mapping between a fuzzy rule-base and a model matrix feature subspace using the T-S inference mechanism. This link enables the numerical properties associated with a rule-based matrix subspace, the relationships amongst these matrix subspaces, and the correlation between the output vector and a rule-base matrix subspace, to be investigated and extracted as rule-based knowledge to enhance model transparency. The matrix subspace spanned by a fuzzy rule is initially derived as the input regression matrix multiplied by a weighting matrix that consists of the corresponding fuzzy membership functions over the training data set. Model transparency is explored by the derivation of an equivalence between an A-optimality experimental design criterion of the weighting matrix and the average model output sensitivity to the fuzzy rule, so that rule-bases can be effectively measured by their identifiability via the A-optimality experimental design criterion. The A-optimality experimental design criterion of the weighting matrices of fuzzy rules is used to construct an initial model rule-base. An extended Gram-Schmidt algorithm is then developed to estimate the parameter vector for each rule. This new algorithm decomposes the model rule-bases via an orthogonal subspace decomposition approach, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. 
This new approach is computationally simpler than the conventional Gram-Schmidt algorithm for resolving high-dimensional regression problems, whereby it is computationally desirable to decompose complex models into a few submodels rather than a single model with a large number of input variables and the associated curse of dimensionality. Numerical examples are included to demonstrate the effectiveness of the proposed new algorithm.
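The conventional Gram-Schmidt procedure that the extended algorithm builds on can be sketched as follows: factor the regression matrix as Phi = W A, with W having orthogonal columns and A unit upper triangular, then recover the least-squares parameters by back-substitution. This is a textbook modified Gram-Schmidt sketch, not the paper's extended, subspace-decomposing algorithm.

```python
import numpy as np

def mgs_least_squares(Phi, y):
    """Modified Gram-Schmidt least squares (sketch): factor Phi = W A,
    project y onto the orthogonal columns of W, then back-substitute
    through A to obtain the parameter vector theta."""
    n, m = Phi.shape
    W = Phi.astype(float).copy()
    A = np.eye(m)                       # unit upper triangular factor
    for k in range(m):
        wk = W[:, k]
        e = wk @ wk                     # energy of the k-th orthogonal regressor
        for j in range(k + 1, m):
            A[k, j] = (wk @ W[:, j]) / e
            W[:, j] -= A[k, j] * wk     # deflate later columns against wk
    # Auxiliary coefficients: projections of y onto each orthogonal column.
    g = np.array([(W[:, k] @ y) / (W[:, k] @ W[:, k]) for k in range(m)])
    # Back-substitution: solve A theta = g.
    theta = np.zeros(m)
    for k in reversed(range(m)):
        theta[k] = g[k] - A[k, k + 1:] @ theta[k + 1:]
    return theta
```

The energies `W[:, k] @ W[:, k]` computed here are what the paper's rule-base interpretation builds on: in the extended algorithm, analogous quantities are aggregated per fuzzy-rule subspace rather than per column.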