989 results for parametric implicit vector equilibrium problems
Abstract:
Land cover classification is a key research field in remote sensing and land change science, as thematic maps derived from remotely sensed data have become the basis for analyzing many socio-ecological issues. However, land cover classification remains a difficult task, and it is especially challenging in heterogeneous tropical landscapes, where such maps are nonetheless of great importance. The present study aims to establish an efficient classification approach to accurately map all broad land cover classes in a large, heterogeneous tropical area of Bolivia, as a basis for further studies (e.g., land cover-land use change). Specifically, we compare the performance of parametric (maximum likelihood), non-parametric (k-nearest neighbour and four different support vector machines, SVM), and hybrid classifiers, using both hard and soft (fuzzy) accuracy assessments. In addition, we test whether the inclusion of a textural index (homogeneity) in the classifications improves their performance. We classified Landsat imagery for two dates corresponding to dry and wet seasons and found that non-parametric classifiers, and particularly SVM, outperformed both parametric and hybrid classifiers. We also found that the use of the homogeneity index along with reflectance bands significantly increased the overall accuracy of all the classifications, but particularly of the SVM algorithms. We observed that improvements in producer’s and user’s accuracies through the inclusion of the homogeneity index differed among land cover classes. Early-growth/degraded forests, pastures, grasslands and savanna were the classes most improved, especially with the SVM radial basis function and SVM sigmoid classifiers, though with both classifiers all land cover classes were mapped with producer’s and user’s accuracies of around 90%. Our approach seems very well suited to accurately mapping land cover in tropical regions, and thus has the potential to contribute to conservation initiatives, climate change mitigation schemes such as REDD+, and rural development policies.
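As a rough illustration of the kind of pipeline this abstract describes, the sketch below trains an RBF-kernel SVM on per-pixel feature vectors that stack reflectance bands with a homogeneity texture band. The array names, band count, dummy data and use of scikit-learn are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): RBF-kernel SVM on stacked
# reflectance + homogeneity features, assuming scikit-learn and NumPy.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import classification_report

# Hypothetical inputs: reflectance bands (n_pixels, n_bands), a homogeneity
# texture band (n_pixels, 1), and integer land cover labels for training pixels.
rng = np.random.default_rng(0)
n_pixels, n_bands = 1000, 6
reflectance = rng.random((n_pixels, n_bands))
homogeneity = rng.random((n_pixels, 1))
labels = rng.integers(0, 5, size=n_pixels)           # 5 dummy land cover classes

X = np.hstack([reflectance, homogeneity])             # reflectance + texture index
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X[:800], labels[:800])                        # train on labelled pixels
print(classification_report(labels[800:], clf.predict(X[800:])))
```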
Abstract:
The paper proposes an approach aimed at detecting optimal model parameter combinations to achieve the most representative description of uncertainty in the model performance. A classification problem is posed to find the regions of good-fitting models according to the values of a cost function. Support Vector Machine (SVM) classification in the parameter space is applied to decide whether a forward model simulation should be computed for a particular generated model. SVM is particularly designed to tackle classification problems in high-dimensional spaces in a non-parametric and non-linear way. SVM decision boundaries determine the regions that are subject to the largest uncertainty in the cost function classification and therefore provide guidelines for further iterative exploration of the model space. The proposed approach is illustrated by a synthetic example of fluid flow through porous media, which features a highly variable response due to the combination of parameter values.
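A minimal sketch of the idea, assuming scikit-learn: an SVM trained on already-evaluated parameter samples screens new candidates before the expensive forward simulation is run. The toy forward model, cost threshold and selection rule below are placeholders, not the paper's.

```python
# Minimal sketch (assumptions, not the paper's code): use an SVM trained on
# already-evaluated parameter samples to decide whether a new candidate is
# worth an expensive forward simulation.
import numpy as np
from sklearn.svm import SVC

def forward_model_cost(theta):
    # Placeholder for an expensive forward simulation plus cost function.
    return np.sum((theta - 0.3) ** 2)

rng = np.random.default_rng(1)
evaluated = rng.random((200, 4))                                     # past parameter samples
good = np.array([forward_model_cost(t) < 0.2 for t in evaluated])    # "good fit" labels

clf = SVC(kernel="rbf", probability=True).fit(evaluated, good)

candidates = rng.random((1000, 4))
p_good = clf.predict_proba(candidates)[:, 1]
# Run the forward model only where the classifier is uncertain or optimistic;
# the 0.4 threshold is an illustrative choice, not from the paper.
to_simulate = candidates[p_good > 0.4]
print(f"{len(to_simulate)} of {len(candidates)} candidates selected for simulation")
```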
Abstract:
Abstract. In this paper we study the relative equilibria and their stability for a system of three point particles moving under the action of a Lennard-Jones potential. A central configuration is a special position of the particles where the position and acceleration vectors of each particle are proportional, with the same constant of proportionality for all particles. Since the Lennard-Jones potential depends only on the mutual distances among the particles, it is invariant under rotations. In a rotating frame the orbits coming from central configurations become equilibrium points, the relative equilibria. Due to the form of the potential, the relative equilibria depend on the size of the system, that is, they depend strongly on the moment of inertia I. In this work we characterize the relative equilibria, find the bifurcation values of I at which the number of relative equilibria changes, and analyze the stability of the relative equilibria.
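For reference, one common normalization of the Lennard-Jones pair potential, together with the central configuration condition described in the abstract (center of mass taken at the origin); the specific normalization used in the paper may differ.

```latex
% Lennard-Jones pair potential (one common normalization) and the
% central configuration condition: acceleration proportional to position,
% with the same constant lambda for all three particles.
U(r_{ij}) = \varepsilon\left[\left(\frac{\sigma}{r_{ij}}\right)^{12}
            - 2\left(\frac{\sigma}{r_{ij}}\right)^{6}\right],
\qquad
\ddot{\mathbf{q}}_i = \lambda\,\mathbf{q}_i, \quad i = 1,2,3.
```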
Abstract:
Products developed at industries, institutes and research centers are expected to have a high level of quality and performance with minimum waste, which requires efficient and robust tools to numerically simulate stringent project conditions with great reliability. In this context, Computational Fluid Dynamics (CFD) plays an important role, and the present work shows two numerical algorithms that are used in the CFD community to solve the Euler and Navier-Stokes equations applied to typical aerospace and aeronautical problems. In particular, unstructured discretization of the spatial domain has gained special attention from the international community due to its ease in discretizing complex spatial domains. The main objective of this work is to illustrate some advantages and disadvantages of numerical algorithms using structured and unstructured spatial discretizations of the flow governing equations. The numerical methods use a finite volume formulation, and the Euler and Navier-Stokes equations are applied to solve a transonic nozzle problem, a low supersonic airfoil problem and a hypersonic inlet problem. In the structured context, these problems are solved using MacCormack's implicit algorithm with Steger and Warming's flux vector splitting technique, while in the unstructured context the explicit algorithm of Jameson and Mavriplis is used. Convergence acceleration is obtained using a spatially variable time stepping procedure.
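A minimal sketch of the spatially variable (local) time stepping idea mentioned at the end of the abstract, assuming a 1D CFL-limited step per cell; the function name, cell data and CFL number are illustrative, not taken from the paper.

```python
# Minimal sketch (an assumption, not the paper's code): spatially variable
# time stepping, where each cell advances with its own CFL-limited step.
import numpy as np

def local_time_steps(u, a, dx, cfl=0.8):
    """Per-cell time step dt_i = cfl * dx_i / (|u_i| + a_i) for a 1D flow,
    where u is the local velocity and a the local speed of sound."""
    return cfl * dx / (np.abs(u) + a)

# Hypothetical cell data for a quasi-1D nozzle-like flow.
u  = np.linspace(50.0, 600.0, 100)      # velocity per cell [m/s]
a  = np.full(100, 340.0)                # speed of sound per cell [m/s]
dx = np.full(100, 0.01)                 # cell size [m]

dt = local_time_steps(u, a, dx)
# In a steady-state calculation each cell is marched with its own dt,
# which accelerates convergence compared with a single global time step.
print(dt.min(), dt.max())
```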
Abstract:
The immersed boundary method is a versatile tool for the investigation of flow-structure interaction. In a large number of applications, the immersed boundaries or structures are very stiff, and strong tangential forces on these interfaces induce a well-known, severe time-step restriction for explicit discretizations. This excessive stability constraint can be removed with fully implicit or suitable semi-implicit schemes, but at a seemingly prohibitive computational cost. While economical alternatives have been proposed recently for some special cases, there is a practical need for a computationally efficient approach that can be applied more broadly. In this context, we revisit a robust semi-implicit discretization introduced by Peskin in the late 1970s which has received renewed attention recently. This discretization, in which the spreading and interpolation operators are lagged, leads to a linear system of equations for the interface configuration at the future time when the interfacial force is linear. However, this linear system is large and dense, and thus it is challenging to streamline its solution. Moreover, while the same linear system or one of similar structure could potentially be used in Newton-type iterations, nonlinear and highly stiff immersed structures pose additional challenges to iterative methods. In this work, we address these problems and propose cost-effective computational strategies for solving Peskin's lagged-operators type of discretization. We do this by first constructing a sufficiently accurate approximation to the system's matrix, and we obtain a rigorous estimate for this approximation. This matrix is expeditiously computed by using a combination of pre-calculated values and interpolation. The availability of a matrix allows for more efficient matrix-vector products and facilitates the design of effective iterative schemes. We propose efficient iterative approaches to deal with both linear and nonlinear interfacial forces and simple or complex immersed structures with tethered or untethered points. One of these iterative approaches employs a splitting in which we first solve a linear problem for the interfacial force and then use a nonlinear iteration to find the interface configuration corresponding to this force. We demonstrate that the proposed approach is several orders of magnitude more efficient than the standard explicit method. In addition to considering the standard elliptical drop test case, we show both the robustness and efficacy of the proposed methodology with a 2D model of a heart valve. (C) 2009 Elsevier Inc. All rights reserved.
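A toy sketch of the splitting mentioned in the abstract, under heavy assumptions: it is not Peskin's discretization or the authors' solver, and the dense matrix and cubic force law below are stand-ins chosen only to show the "linear solve for the force, then nonlinear iteration for the configuration" structure.

```python
# Toy sketch (assumptions, not the authors' method): solve a dense linear
# system for the interfacial force, then recover the interface configuration
# by a nonlinear (Newton) iteration.
import numpy as np

rng = np.random.default_rng(2)
n = 50
# Stand-in for the dense, pre-computed semi-implicit system matrix and RHS.
A = np.eye(n) + 0.01 * rng.random((n, n))
b = rng.random(n)

# Step 1: linear solve for the interfacial force F.
F = np.linalg.solve(A, b)

# Step 2: componentwise Newton iteration for the configuration X satisfying
# force(X) = F, with a made-up stiff cubic force law force(x) = k * x**3.
k = 100.0
X_exact = np.cbrt(F / k)                 # closed-form root, used only as a check
X = np.full(n, 0.5)                      # crude initial guess
for _ in range(60):
    X -= (k * X**3 - F) / (3.0 * k * X**2)

print(np.max(np.abs(X - X_exact)))       # Newton converges to the exact root
```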
Abstract:
A vector-valued impulsive control problem is considered whose dynamics, defined by a differential inclusion, are such that the vector fields associated with the singular term do not satisfy the so-called Frobenius condition. A concept of robust solution based on a new reparametrization procedure is adopted in order to derive necessary conditions of optimality. These conditions are obtained by taking a limit of those for an appropriate sequence of auxiliary standard optimal control problems approximating the original one. An example to illustrate the nature of the new optimality conditions is provided. © 2000 Elsevier Science B.V. All rights reserved.
Abstract:
Mathematical programming problems with equilibrium constraints (MPEC) are nonlinear programming problems where the constraints have a form that is analogous to first-order optimality conditions of constrained optimization. We prove that, under reasonable sufficient conditions, stationary points of the sum of squares of the constraints are feasible points of the MPEC. In usual formulations of MPEC all the feasible points are nonregular in the sense that they do not satisfy the Mangasarian-Fromovitz constraint qualification of nonlinear programming. Therefore, all the feasible points satisfy the classical Fritz-John necessary optimality conditions. In principle, this can cause serious difficulties for nonlinear programming algorithms applied to MPEC. However, we show that most feasible points do not satisfy a recently introduced stronger optimality condition for nonlinear programming. This is the reason why, in general, nonlinear programming algorithms are successful when applied to MPEC.
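A toy illustration of the "sum of squares of the constraints" idea: the small complementarity system, penalty function and solver below are assumptions for illustration, not the paper's formulation or conditions.

```python
# Toy sketch (assumptions, not the paper's MPEC): minimize the sum of squares
# of the violations of a small complementarity system
#   x >= 0,  y >= 0,  x*y = 0,
# and check that the computed stationary point is a feasible point.
import numpy as np
from scipy.optimize import minimize

def squared_infeasibility(v):
    x, y = v
    # Squared violations of x >= 0, y >= 0 and of the complementarity x*y = 0.
    return np.minimum(x, 0.0) ** 2 + np.minimum(y, 0.0) ** 2 + (x * y) ** 2

res = minimize(squared_infeasibility, x0=np.array([0.7, -0.3]), method="BFGS")
print(res.x, squared_infeasibility(res.x))
# At the computed point both variables are (numerically) nonnegative and the
# complementarity product is near zero, i.e. the point is feasible.
```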
Abstract:
The Implicit Association Test (IAT) had already gained the status of a prominent assessment procedure before its psychometric properties and underlying task structure were understood. The present critique addresses five major problems that arise when the IAT is used for diagnostic inferences: (1) the asymmetry of causal and diagnostic inferences; (2) the viability of the underlying association model; (3) the lack of a testable model underlying IAT-based inferences; (4) the difficulties of interpreting difference scores; and (5) the susceptibility of the IAT to deliberate faking and strategic processing. Based on a theoretical reflection of these issues, and a comprehensive survey of published IAT studies, it is concluded that a number of uncontrolled factors can produce (or reduce) significant IAT scores independently of the personality attribute that is supposed to be captured by the IAT procedure.
Abstract:
This is a set of physical chemistry (P. Chem.) problems posed at a slightly higher level than that of the usual textbook, for students continuing their study of the subject.
Abstract:
This paper studies the change-point problem for a general parametric, univariate or multivariate family of distributions. An information-theoretic procedure is developed which is based on general divergence measures for testing the hypothesis of the existence of a change. To compare the exact sizes of the new test statistic using the criterion proposed in Dale (J R Stat Soc B 48–59, 1986), a simulation study is performed for the special case of exponentially distributed random variables. A complete study of the powers of the test statistics and their corresponding relative local efficiencies is also considered.
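A small simulation in the spirit of the abstract's exponential special case. The likelihood-ratio scan below is a standard stand-in for a change-point statistic, not the paper's divergence-based procedure.

```python
# Illustrative simulation (not the paper's divergence-based statistic): scan a
# likelihood-ratio-type change-point statistic over exponentially distributed data.
import numpy as np

def exp_loglik(x):
    # Maximized exponential log-likelihood: n * log(n / sum(x)) - n.
    n = len(x)
    return n * np.log(n / x.sum()) - n

def changepoint_scan(x):
    n = len(x)
    full = exp_loglik(x)
    stats = [2.0 * (exp_loglik(x[:k]) + exp_loglik(x[k:]) - full) for k in range(2, n - 1)]
    k_hat = int(np.argmax(stats)) + 2
    return k_hat, max(stats)

rng = np.random.default_rng(3)
# Mean changes from 1.0 to 3.0 after observation 100.
x = np.concatenate([rng.exponential(1.0, 100), rng.exponential(3.0, 100)])
print(changepoint_scan(x))   # estimated change point and the maximal scan statistic
```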
Abstract:
The author briefly summarizes the main concepts and problems connected with the pricing of derivative products. The theory of derivative pricing exploits the redundancy among the products on the market to determine the relative prices of individual products. However, this can be done only on a complete market, and thus only under market completeness can the concept of utility functions be omitted from the theory and the practice built upon it; for this reason the principle of risk-neutral pricing is misleading. Put another way, the theory of derivatives can free itself from the concept of utility functions only at the price of imposing restrictions on the market structure that do not hold in reality. Emphasizing this is essential both in market practice and in teaching.
Abstract:
We consider a parametric semilinear Dirichlet problem driven by the Laplacian plus an indefinite, unbounded potential and with a reaction of superdiffusive type. Using variational and truncation techniques, we show that there exists a critical parameter value λ_{∗}>0 such that for all λ>λ_{∗} the problem has at least two positive solutions, for λ=λ_{∗} the problem has at least one positive solution, and no positive solutions exist when λ∈(0,λ_{∗}). We also show that for λ≥λ_{∗} the problem has a smallest positive solution.
Abstract:
We consider a parametric nonlinear Neumann problem driven by a nonlinear, nonhomogeneous differential operator and with a Caratheodory reaction $f(t,x)$ which is $p$-superlinear in $x$ without satisfying the Ambrosetti-Rabinowitz condition usually assumed in such cases. We prove a bifurcation-type result describing the dependence of the positive solutions on the parameter $\lambda>0$, we show the existence of a smallest positive solution $\overline{u}_{\lambda}$, and we investigate the properties of the map $\lambda\mapsto\overline{u}_{\lambda}$. Finally, we also show the existence of nodal solutions.