983 results for parametric implicit vector equilibrium problems


Relevance:

30.00%

Publisher:

Abstract:

A number of two-dimensional staggered unstructured discretisation schemes for the solution of fluid flow and heat transfer problems have been developed. All schemes store and solve velocity vector components at cell faces, with scalar variables solved at cell centres. The velocity is resolved into face-normal and face-parallel components, and the various schemes investigated differ in the treatment of the parallel component. Steady-state and time-dependent fluid flow and thermal energy equations are solved, with the well-known pressure correction scheme SIMPLE employed to couple continuity and momentum. The numerical methods developed are tested on well-known benchmark cases: the Lid-Driven Cavity, Natural Convection in a Cavity and Melting of Gallium in a rectangular domain. The results obtained are shown to be comparable to the benchmarks, with accuracy dependent on the scheme selected.
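
The face-based velocity treatment described above amounts to a simple vector decomposition at each cell face. A minimal sketch of that step is given below, assuming a unit face-normal vector is available; the variable and function names are illustrative, not taken from the schemes themselves:

```python
import numpy as np

def decompose_face_velocity(u_face, n_face):
    """Split a face velocity vector into face-normal and face-parallel components.
    u_face : velocity vector at the face midpoint
    n_face : vector normal to the face (normalised below)"""
    n = np.asarray(n_face, dtype=float)
    n = n / np.linalg.norm(n)                 # guard against a non-unit normal
    u = np.asarray(u_face, dtype=float)
    u_normal = np.dot(u, n) * n               # component stored and solved on the face
    u_parallel = u - u_normal                 # component whose treatment varies by scheme
    return u_normal, u_parallel

# example: a face inclined at 45 degrees to the x-axis
un, up = decompose_face_velocity([1.0, 0.2], [1.0, 1.0])
```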

Relevance:

30.00%

Publisher:

Abstract:

The peace process in Northern Ireland has been hailed, variously, as the successful resolution of one of the world's most intractable conflicts, and as a failed attempt to reconcile the conflicting claims of the two main ethnonationalist communities. At both these points, and at every other point along the continuum, recognition is given to the centrality of education. This article looks at the role played by adult learning, and contrasts two fundamentally different approaches. In one, Enlightenment assumptions about the power of knowledge to dispel prejudice have run alongside attempts to create a world of shared values; in the other, a postmodern acceptance of different cultures has accompanied a peace process that builds upon ethnic distinctions. As with the Dayton Accord and with other peace agreements brokered with international assistance, the consociational model of governance has been chosen for Northern Ireland in order to create a political equilibrium between unionists and nationalists. Such a political framework reverses the direction of previous integrationist educational policies in favour of a celebration of difference, an approach that is fraught with difficulties.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the use of a peer research methodology to explore disaffected young people's views on alternative education. This model was adopted in order to try to ensure an equilibrium of power between interviewer and interviewee, allow marginalised young people's voices to be heard and help generate social action. The approach is examined from the perspectives of both the peer research and adult research teams. An experiential and honest account is given, including the problems and successes as well as the lessons learned. The paper concludes by considering the value of the model, whether it helps to reach those alienated from education, and whether there is any evidence that it provides an opportunity for them to have a stake in their future.

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a new hierarchical learning structure, namely the holistic triple learning (HTL), for extending the binary support vector machine (SVM) to multi-classification problems. For an N-class problem, an HTL constructs a decision tree up to a given depth. A leaf node of the decision tree is allowed to be placed with a holistic triple learning unit whose generalisation abilities are assessed and approved. Meanwhile, the remaining nodes in the decision tree each accommodate a standard binary SVM classifier. The holistic triple classifier is a regression model trained on three classes, whose training algorithm originates from a recently proposed implementation technique, the least-squares support vector machine (LS-SVM). A major novelty of the holistic triple classifier is the reduced number of support vectors in the solution. For the resultant HTL-SVM, an upper bound on the generalisation error can be obtained. The time complexity of training the HTL-SVM is analysed, and is shown to be comparable to that of training the one-versus-one (1-vs-1) SVM, particularly on small-scale datasets. Empirical studies show that the proposed HTL-SVM achieves competitive classification accuracy with a reduced number of support vectors compared to the popular 1-vs-1 alternative.
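
The holistic triple classifier's training is said to originate from the LS-SVM. For context, a minimal sketch of a standard binary LS-SVM training solve is given below; this is the generic Suykens-style formulation, not the authors' triple-class construction, and the function names and parameters are illustrative:

```python
import numpy as np

def rbf_kernel(A, B, g=1.0):
    """Gaussian RBF kernel matrix between row sets A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-g * d2)

def lssvm_train(X, y, C=10.0, g=1.0):
    """Solve the LS-SVM dual linear system for a binary problem (labels y in {-1, +1})."""
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, g)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / C
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # expansion coefficients alpha, bias b

def lssvm_predict(X_train, y_train, alpha, b, X_new, g=1.0):
    """Sign of the LS-SVM decision function for new points."""
    return np.sign(rbf_kernel(X_new, X_train, g) @ (alpha * y_train) + b)
```

Note that in this plain formulation every training point carries a non-zero coefficient; the reduced support vector set claimed for the holistic triple classifier is a property of the authors' construction rather than of the LS-SVM solve itself.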

Relevance:

30.00%

Publisher:

Abstract:

The motivation for this paper is to present an approach for rating the quality of the parameters in a computer-aided design model for use as optimization variables. Parametric Effectiveness is computed as the ratio of change in performance achieved by perturbing the parameters in the optimum way, to the change in performance that would be achieved by allowing the boundary of the model to move without the constraint on shape change enforced by the CAD parameterization. The approach is applied in this paper to optimization based on adjoint shape sensitivity analyses. The derivation of parametric effectiveness is presented for optimization both with and without the constraint of constant volume. In both cases, the movement of the boundary is normalized with respect to a small root mean squared movement of the boundary. The approach can be used to select an initial search direction in parameter space, or to select sets of model parameters which have the greatest ability to improve model performance. The approach is applied to a number of example 2D and 3D FEA and CFD problems.
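
The ratio that defines parametric effectiveness can be written compactly. A plausible formalisation is sketched below; the symbols (performance functional J, CAD parameters p, boundary position X, RMS magnitude epsilon) are assumptions for illustration rather than the paper's own notation:

```latex
% Parametric effectiveness as a ratio of attainable performance changes,
% both evaluated at the same small RMS boundary movement \varepsilon:
\[
  \eta \;=\;
  \frac{\displaystyle \max_{\delta p \,:\; \mathrm{RMS}\left(\delta X(\delta p)\right) = \varepsilon} \delta J\!\left(\delta X(\delta p)\right)}
       {\displaystyle \max_{\delta X \,:\; \mathrm{RMS}(\delta X) = \varepsilon} \delta J(\delta X)}
  \;\in\; [0, 1].
\]
% \eta close to 1: the CAD parameters can reproduce most of the improvement that an
% unconstrained boundary movement would achieve; \eta close to 0: the parameterisation
% is a poor set of optimisation variables for this performance measure.
```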

Relevance:

30.00%

Publisher:

Abstract:

Support vector machines (SVMs), though accurate, are not preferred in applications requiring high classification speed or when deployed on systems with limited computational resources, because of the large number of support vectors involved in the model. To overcome this problem we have devised a primal SVM method with the following properties: (1) it solves for the SVM representation without the need to invoke the representer theorem; (2) forward and backward selections are combined to approach the final globally optimal solution; and (3) a criterion is introduced for the identification of support vectors, leading to a much reduced support vector set. In addition to introducing this method, the paper analyzes the complexity of the algorithm and presents test results on three public benchmark problems and a human activity recognition application. These applications demonstrate the effectiveness and efficiency of the proposed algorithm.
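
A minimal sketch of the forward-selection idea is given below, with a regularised least-squares fit standing in for the SVM loss to keep the example short; the ridge surrogate and the function names are assumptions for illustration, not the paper's algorithm:

```python
import numpy as np

def rbf(A, B, g=0.5):
    """Gaussian kernel matrix between row sets A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-g * d2)

def forward_select(X, y, n_basis=10, lam=1e-2, g=0.5):
    """Greedily grow a small expansion set; refit the weights after each addition.
    X : (n, d) training inputs, y : (n,) labels in {-1.0, +1.0}."""
    n = len(y)
    basis, weights = [], None
    for _ in range(n_basis):
        best_err, best_j = np.inf, None
        for j in range(n):
            if j in basis:
                continue
            cand = basis + [j]
            K = rbf(X, X[cand], g)                              # n x |cand| design matrix
            w = np.linalg.solve(K.T @ K + lam * np.eye(len(cand)), K.T @ y)
            err = np.mean((K @ w - y) ** 2)
            if err < best_err:
                best_err, best_j, weights = err, j, w
        basis.append(best_j)
    return basis, weights

# prediction sketch: np.sign(rbf(X_new, X_train[basis], g) @ weights)
```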


Relevance:

30.00%

Publisher:

Abstract:

In recent years, gradient vector flow (GVF) based algorithms have been successfully used to segment a variety of 2-D and 3-D imagery. However, due to the compromise between internal and external energy forces within the resulting partial differential equations, these methods may lead to biased segmentation results. In this paper, we propose MSGVF, a mean shift based GVF segmentation algorithm that can successfully locate the correct borders. MSGVF is developed so that when the contour reaches equilibrium, the various forces resulting from the different energy terms are balanced. In addition, the smoothness constraint on image pixels is retained so that over- or under-segmentation can be reduced. Experimental results on publicly accessible datasets of dermoscopic and optic disc images demonstrate that the proposed method effectively detects the borders of the objects of interest.
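
For reference, a minimal sketch of the classic gradient vector flow computation that MSGVF builds on is given below; this is the original Xu-Prince GVF update, not the mean-shift-based variant proposed in the paper, and the parameter values are illustrative:

```python
import numpy as np

def gradient_vector_flow(edge_map, mu=0.2, iters=200, dt=0.5):
    """Classic GVF: diffuse the edge-map gradient into homogeneous regions by
    iterating u_t = mu * Lap(u) - (u - f_x) * (f_x^2 + f_y^2), and similarly for v."""
    fy, fx = np.gradient(edge_map.astype(float))
    u, v = fx.copy(), fy.copy()
    mag2 = fx**2 + fy**2

    def laplacian(a):
        return (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
                np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)

    for _ in range(iters):
        u += dt * (mu * laplacian(u) - mag2 * (u - fx))
        v += dt * (mu * laplacian(v) - mag2 * (v - fy))
    return u, v   # external force field that drives the active contour
```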

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the results of an investigation into the utility of remote sensing (RS), using meteorological satellite sensors, and spatial interpolation (SI) of data from meteorological stations, for the prediction of spatial variation in monthly climate across continental Africa in 1990. Information from the Advanced Very High Resolution Radiometer (AVHRR) of the National Oceanic and Atmospheric Administration's (NOAA) polar-orbiting meteorological satellites was used to estimate land surface temperature (LST) and atmospheric moisture. Cold cloud duration (CCD) data derived from the High Resolution Radiometer (HRR) onboard the European Meteorological Satellite programme's (EUMETSAT) Meteosat satellite series were also used as an RS proxy measurement of rainfall. Temperature, atmospheric moisture and rainfall surfaces were independently derived from SI of measurements from the World Meteorological Organization (WMO) member stations of Africa. These meteorological station data were then used to test the accuracy of each methodology, so that the appropriateness of the two techniques for epidemiological research could be compared. SI was a more accurate predictor of temperature, whereas RS provided a better surrogate for rainfall; both were equally accurate at predicting atmospheric moisture. The implications of these results for mapping short- and long-term climate change, and hence their potential for the study and control of disease vectors, are considered. Taking into account logistic and analytical problems, there were no clear conclusions regarding the optimality of either technique, but there was considerable potential for synergy.
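
As a point of reference for the spatial-interpolation side of the comparison, a minimal inverse-distance-weighting sketch is given below; the abstract does not state which interpolator was used, so this choice and the function names are purely illustrative:

```python
import numpy as np

def idw_interpolate(station_xy, station_values, grid_xy, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of station measurements onto grid points.
    station_xy : (k, 2) station coordinates, station_values : (k,) measurements,
    grid_xy    : (m, 2) target coordinates."""
    d = np.linalg.norm(grid_xy[:, None, :] - station_xy[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)              # nearby stations dominate the estimate
    return (w @ station_values) / w.sum(axis=1)
```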

Relevance:

30.00%

Publisher:

Abstract:

The Arc-Length Method is a solution procedure that enables a generic non-linear problem to pass limit points. Some examples are provided of solutions to mode-jumping problems obtained using a commercial finite element package, and further investigations are carried out on a simple structure for which the numerical solution can be compared with an analytical one. It is shown that the Arc-Length Method is not reliable when bifurcations are present in the primary equilibrium path; the presence of very sharp snap-backs or special boundary conditions may also cause convergence difficulty at limit points. An improvement to the predictor used in the incremental procedure is suggested, together with a reliable criterion for selecting either solution of the quadratic arc-length constraint. The gap that is sometimes observed between the experimental load level of mode-jumping and its arc-length prediction is explained through an example.
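
For context, the standard spherical (Crisfield-type) arc-length constraint behind the quadratic mentioned above is sketched below; the paper's improved predictor and its root-selection criterion are not reproduced here:

```latex
% Spherical arc-length constraint on the increment of displacements \Delta\mathbf{u}
% and load factor \Delta\lambda, with reference load vector \mathbf{q},
% scaling parameter \psi and prescribed arc length \Delta l:
\[
  \Delta\mathbf{u}^{\mathsf T}\Delta\mathbf{u}
  \;+\; \psi^{2}\,\Delta\lambda^{2}\,\mathbf{q}^{\mathsf T}\mathbf{q}
  \;=\; \Delta l^{2}.
\]
% Substituting the Newton correction for \Delta\mathbf{u} as a function of
% \delta\lambda turns this constraint into a quadratic equation in \delta\lambda;
% its two roots correspond to advancing or retreating along the equilibrium path,
% which is why a criterion for selecting between them is needed.
```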

Relevance:

30.00%

Publisher:

Abstract:

This research presents a fast algorithm for projected support vector machines (PSVM): a basis vector set (BVS) is selected for the kernel-induced feature space, and the training points are projected onto the subspace spanned by the selected BVS. A standard linear support vector machine (SVM) is then produced in the subspace with the projected training points. As the dimension of the subspace is determined by the size of the selected basis vector set, the size of the produced SVM expansion can be specified. A two-stage algorithm is derived which selects and refines the basis vector set, achieving a locally optimal model. The model expansion coefficients and bias are updated recursively as the basis set and support vector set grow or shrink. The condition for a point to be classed as lying outside the current basis vector set, and hence selected as a new basis vector, is derived and embedded in the recursive procedure. This guarantees the linear independence of the produced basis set. The proposed algorithm is tested and compared with an existing sparse primal SVM (SpSVM) and a standard SVM (LibSVM) on seven public benchmark classification problems. Our new algorithm is designed for human activity recognition using smart devices and embedded sensors, where sometimes limited memory and processing resources must be exploited to the full and where more robust and accurate classification leads to greater user satisfaction. Experimental results demonstrate the effectiveness and efficiency of the proposed algorithm. This work builds upon a previously published algorithm specifically created for activity recognition within mobile applications for the EU Haptimap project [1]. The algorithms detailed in this paper are more memory- and resource-efficient, making them suitable for use with bigger data sets and more easily trained SVMs.
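
A minimal sketch of the projection step is given below, assuming a basis set has already been chosen (here, trivially, the first few training points); the paper's two-stage basis selection and recursive coefficient updates are not reproduced, and the helper names are illustrative:

```python
import numpy as np

def rbf(A, B, g=0.5):
    """Gaussian kernel matrix between row sets A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-g * d2)

def project_onto_basis(X, X_basis, g=0.5, jitter=1e-8):
    """Map points into the subspace spanned by the basis vectors in feature space.
    A Cholesky factor of the basis Gram matrix is used so that inner products of the
    projected features reproduce K(x, B) K_BB^{-1} K(B, x')."""
    K_bb = rbf(X_basis, X_basis, g) + jitter * np.eye(len(X_basis))
    L = np.linalg.cholesky(K_bb)
    K_nb = rbf(X, X_basis, g)
    return np.linalg.solve(L, K_nb.T).T       # shape: (n_points, n_basis)

# usage sketch (basis chosen trivially as the first m training points):
#   from sklearn.svm import LinearSVC
#   Z = project_onto_basis(X_train, X_train[:m])
#   clf = LinearSVC().fit(Z, y_train)         # linear SVM in the projected subspace
```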

Relevance:

30.00%

Publisher:

Abstract:

We consider a convex problem of Semi-Infinite Programming (SIP) with a multidimensional index set. In the study of this problem we apply the approach suggested in [20] for convex SIP problems with one-dimensional index sets, which is based on the notions of immobile indices and their immobility orders. For the problem under consideration we formulate optimality conditions that are explicit and have the form of a criterion. We compare this criterion with other known optimality conditions for SIP and show its efficiency in the convex case.
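
For readers unfamiliar with the setting, the generic convex SIP problem with a multidimensional index set has the form sketched below; the notation is generic rather than taken from the paper:

```latex
% Convex semi-infinite programming problem with a multidimensional index set T:
\[
  \min_{x \in \mathbb{R}^{n}} \; c(x)
  \qquad \text{s.t.} \qquad
  f(x, t) \le 0 \ \ \text{for all } t \in T \subset \mathbb{R}^{m},
\]
% where c(\cdot) and f(\cdot, t) are convex in x and T is a compact (infinite) index set.
% In this line of work an index t \in T is called immobile if f(x, t) = 0 at every
% feasible point x, i.e. the corresponding constraint cannot be made inactive.
```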

Relevance:

30.00%

Publisher:

Abstract:

The relationship between epidemiology, mathematical modelling and computational tools makes it possible to build and test theories about the development and control of a disease. This thesis is motivated by the study of epidemiological models applied to infectious diseases from an Optimal Control perspective, with particular emphasis on Dengue. A tropical and subtropical mosquito-borne disease, it affects around 100 million people per year and is regarded by the World Health Organization as a major public health concern. The mathematical models developed and tested in this work are based on ordinary differential equations that describe the dynamics underlying the disease, namely the interaction between humans and mosquitoes. An analytical study of these models is carried out with respect to their equilibrium points, stability and basic reproduction number. The spread of Dengue can be attenuated through measures to control the transmitting vector, such as the use of specific insecticides and educational campaigns. As the development of a potential vaccine has been a recent worldwide commitment, models are proposed based on the simulation of a hypothetical vaccination process in a population. Based on Optimal Control theory, the optimal strategies for the use of these controls are analysed, together with their repercussions on the reduction/eradication of the disease during an outbreak in the population, under a bioeconomic approach. The formulated problems are solved numerically using direct and indirect methods. The former discretise the problem, reformulating it as a nonlinear optimisation problem. The indirect methods use Pontryagin's Maximum Principle as a necessary condition to find the optimal curve for the respective control. Several numerical software packages are used within these two strategies. Throughout this work there was always a compromise between the realism of the epidemiological models and their mathematical tractability.
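
To fix ideas, a generic host-vector SIR sketch of the kind of ODE system described above is given below; it is not the thesis's specific model, and all parameter values and names are placeholders:

```python
import numpy as np
from scipy.integrate import solve_ivp

def host_vector_sir(t, state, beta_hv, beta_vh, gamma, mu_v):
    """Generic host-vector SIR dynamics expressed as population fractions.
    S_h, I_h, R_h: human compartments; S_v, I_v: mosquito compartments."""
    S_h, I_h, R_h, S_v, I_v = state
    dS_h = -beta_hv * S_h * I_v                         # humans infected by mosquito bites
    dI_h = beta_hv * S_h * I_v - gamma * I_h            # human recovery at rate gamma
    dR_h = gamma * I_h
    dS_v = mu_v - beta_vh * S_v * I_h - mu_v * S_v      # mosquito recruitment and death
    dI_v = beta_vh * S_v * I_h - mu_v * I_v             # mosquitoes infected by biting humans
    return [dS_h, dI_h, dR_h, dS_v, dI_v]

# placeholder parameters and initial conditions (illustrative only)
sol = solve_ivp(host_vector_sir, (0, 120), [0.99, 0.01, 0.0, 0.95, 0.05],
                args=(0.4, 0.3, 0.14, 0.07), dense_output=True)
```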

Relevance:

30.00%

Publisher:

Abstract:

Doctoral thesis, Informatics (Informatics Engineering), Universidade de Lisboa, Faculdade de Ciências, 2015.

Relevance:

30.00%

Publisher:

Abstract:

Linear Algebra—Selected Problems is a unique book for senior undergraduate and graduate students who wish to review basic material in Linear Algebra quickly. Vector spaces are presented first, followed by a review of linear transformations; matrices and linear systems are then covered, and determinants and basic geometry are presented in the last two chapters. Solutions to the proposed exercises are provided for the reader's reference.

Relevance:

30.00%

Publisher:

Abstract:

Biological scaling analyses employing the widely used bivariate allometric model are beset by at least four interacting problems: (1) choice of an appropriate best-fit line with due attention to the influence of outliers; (2) objective recognition of divergent subsets in the data (allometric grades); (3) potential restrictions on statistical independence resulting from phylogenetic inertia; and (4) the need for extreme caution in inferring causation from correlation. A new non-parametric line-fitting technique has been developed that eliminates requirements for normality of distribution, greatly reduces the influence of outliers and permits objective recognition of grade shifts in substantial datasets. This technique is applied in scaling analyses of mammalian gestation periods and of neonatal body mass in primates. These analyses feed into a re-examination, conducted with partial correlation analysis, of the maternal energy hypothesis relating to mammalian brain evolution, which suggests links between body size and brain size in neonates and adults, gestation period and basal metabolic rate. Much has been made of the potential problem of phylogenetic inertia as a confounding factor in scaling analyses. However, this problem may be less severe than suspected earlier because nested analyses of variance conducted on residual variation (rather than on raw values) reveal that there is considerable variance at low taxonomic levels. In fact, limited divergence in body size between closely related species is one of the prime examples of phylogenetic inertia. One common approach to eliminating perceived problems of phylogenetic inertia in allometric analyses has been the calculation of 'independent contrast values'. It is demonstrated that the reasoning behind this approach is flawed in several ways. Calculation of contrast values for closely related species of similar body size is, in fact, highly questionable, particularly when there are major deviations from the best-fit line for the scaling relationship under scrutiny.
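
For reference, the bivariate allometric model referred to above is conventionally written as follows (standard notation, not specific to this paper):

```latex
% Bivariate allometric model and its log-transformed (line-fitting) form:
\[
  y = a\,x^{b}
  \qquad \Longleftrightarrow \qquad
  \log y = \log a + b\,\log x,
\]
% where x is body mass (or another size measure), y is the trait of interest,
% b is the allometric exponent and a the scaling coefficient; the best-fit line
% discussed in the text is fitted to the log-transformed data.
```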