902 results for Unconstrained minimization
Abstract:
In this paper, we investigate the remanufacturing problem of pricing single-class used products (cores) in the face of random price-dependent returns and random demand. Specifically, we propose a dynamic pricing policy for the cores and then model the problem as a continuous-time Markov decision process. Our models are designed to address three objectives: finite horizon total cost minimization, infinite horizon discounted cost minimization, and average cost minimization. Besides proving uniqueness of the optimal policy and establishing monotonicity results for the infinite horizon problem, we also characterize the structures of the optimal policies, which can greatly simplify the computational procedure. Finally, we use computational examples to assess the impacts of specific parameters on the optimal price and reveal the benefits of a dynamic pricing policy. © 2013 Elsevier B.V. All rights reserved.
Abstract:
Masses and progenitor evolutionary states of Type II supernovae remain almost unconstrained by direct observations. Only one robust observation of a progenitor (SN 1987A) and one plausible observation (SN 1993J) are available. Neither matched theoretical predictions, and in this Letter we report limits on a third progenitor (SN 1999gi). The Hubble Space Telescope (HST) has imaged the site of the Type II-P supernova SN 1999gi with the Wide Field Planetary Camera 2 (WFPC2) in two filters (F606W and F300W) prior to explosion. The distance to the host galaxy (NGC 3184) of 7.9 Mpc means that the most luminous, massive stars are resolved as single objects in the archive images. The supernova occurred in a resolved, young OB association 2.3 kpc from the center of NGC 3184 with an association age of about 4 Myr. Follow-up images of SN 1999gi with WFPC2 taken 14 months after discovery determine the precise position of the supernova on the preexplosion frames. An upper limit on the absolute magnitude of the progenitor is estimated (M_V ≥ -5.1). By comparison with stellar evolutionary tracks, this can be interpreted as a stellar mass, and we determine an upper mass limit of 9 (+3/-2) M_⊙. We discuss the possibility of determining the masses or mass limits for numerous nearby core-collapse supernovae using the HST archive enhanced by our current SNAP program.
Abstract:
Morphometric study of modern ice masses is useful because many reconstructions of glaciers traditionally draw on their shape for guidance. Here we analyse data derived from the surface profiles of 200 modern ice masses (valley glaciers, icefields, ice caps and ice sheets) with length scales from 10⁰ to 10³ km, from different parts of the world. Four profile attributes are investigated: relief, span, and two parameters C* and C that result from using Nye's (1952) theoretical parabola as a profile descriptor. C* and C respectively measure each profile's aspect ratio and steepness, and are found to decrease in size and variability with span. This dependence quantifies the competing influences of unconstrained spreading behaviour of ice flow and bed topography on the profile shape of ice masses, which becomes more parabolic as span increases (with C* and C tending to low values of 2.5-3.3 m^(1/2)). The same data reveal coherent minimum bounds in C* and C for modern ice masses that we develop into two new methods of palaeo-glacier reconstruction. In the first method, glacial limits are known from moraines and the bounds are used to constrain the lowest palaeo ice surface consistent with modern profiles. We give an example of applying this method over a three-dimensional glacial landscape in Kamchatka. In the second method, we test the plausibility of existing reconstructions by comparing their C* and C against the modern minimum bounds. Of the 86 published palaeo ice masses that we put to this test, 88% are found to be plausible. The search for other morphometric constraints will help us formalise glacier reconstructions and reduce their uncertainty and subjectiveness.
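As a minimal sketch of how Nye's parabola can serve as a profile descriptor, the code below fits the coefficient C in h = C·sqrt(x) to a surface profile by closed-form least squares. The profile data and the value C = 3.0 m^(1/2) are synthetic, chosen to fall inside the 2.5-3.3 range quoted above; the paper's separate C* (aspect-ratio) parameter and relief/span bookkeeping are not reproduced.

```python
import math

def fit_nye_c(xs, hs):
    """Least-squares estimate of C in Nye's parabola h = C * sqrt(x),
    where x is distance from the margin (m) and h is surface elevation (m).
    Closed form: C = sum(h_i * sqrt(x_i)) / sum(x_i)."""
    num = sum(h * math.sqrt(x) for x, h in zip(xs, hs))
    return num / sum(xs)

# Synthetic noise-free profile generated with C = 3.0 m^(1/2),
# so the fit recovers C exactly.
xs = [100.0, 400.0, 900.0, 1600.0, 2500.0]
hs = [3.0 * math.sqrt(x) for x in xs]
print(round(fit_nye_c(xs, hs), 3))  # recovers 3.0
```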
Abstract:
Handling appearance variations is a very challenging problem for visual tracking. Existing methods usually solve this problem by relying on an effective appearance model with two features: (1) being capable of discriminating the tracked target from its background, (2) being robust to the target's appearance variations during tracking. Instead of integrating the two requirements into the appearance model, in this paper, we propose a tracking method that deals with these problems separately based on sparse representation in a particle filter framework. Each target candidate defined by a particle is linearly represented by the target and background templates with an additive representation error. Discriminating the target from its background is achieved by activating the target templates or the background templates in the linear system in a competitive manner. The target's appearance variations are directly modeled as the representation error. An online algorithm is used to learn the basis functions that sparsely span the representation error. The linear system is solved via ℓ1 minimization. The candidate with the smallest reconstruction error using the target templates is selected as the tracking result. We test the proposed approach using four sequences with heavy occlusions, large pose variations, drastic illumination changes and low foreground-background contrast. The proposed approach shows excellent performance in comparison with two latest state-of-the-art trackers.
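The linear-representation step described above can be sketched with a plain ISTA solver for the ℓ1 minimization; the paper's actual solver, template sets and error-basis learning are not specified here, and the templates and candidate below are random stand-ins.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_l1(A, y, lam=0.05, iters=500):
    """Minimise 0.5*||A c - y||^2 + lam*||c||_1 by ISTA
    with step size 1 / ||A||_2^2 (the Lipschitz constant)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    c = np.zeros(A.shape[1])
    for _ in range(iters):
        c = soft_threshold(c - step * A.T @ (A @ c - y), step * lam)
    return c

rng = np.random.default_rng(0)
templates = rng.normal(size=(20, 8))     # columns: target + background templates
true_c = np.zeros(8); true_c[2] = 1.0    # candidate is essentially template 2
y = templates @ true_c
c = ista_l1(templates, y)
recon_err = np.linalg.norm(templates @ c - y)
print(int(np.argmax(np.abs(c))), round(recon_err, 3))
```

The candidate whose sparse code yields the smallest reconstruction error against the target templates would then be picked as the tracking result.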
Abstract:
Chemoenzymatic dynamic kinetic resolution (DKR) of rac-1-phenyl ethanol into R-1-phenylethanol acetate was investigated with emphasis on the minimization of side reactions. The organometallic hydrogen transfer (racemization) catalyst was varied, and this was observed to alter the rate and extent of oxidation of the alcohol to form ketone side products. The performance of highly active catalyst [(pentamethylcyclopentadienyl) IrCl2(1-benzyl,3-methyl-imidazol-2-ylidene)] was found to depend on the batch of lipase B used. The interaction between the bio- and chemo-catalysts was reduced by employing physical entrapment of the enzyme in silica using a sol-gel process. The nature of the gelation method was found to be important, with an alkaline method preferred, as an acidic method was found to initiate a further side reaction, the acid catalyzed dehydration of the secondary alcohol. The acidic gel was found to be a heterogeneous solid acid.
Abstract:
This paper introduces the discrete choice modelling paradigm of Random Regret Minimisation (RRM) to the field of health economics. RRM is a regret-based model that explores a driver of choice different from the traditional utility-based Random Utility Maximisation (RUM). The RRM approach is based on the idea that, when choosing, individuals aim to minimise their regret, where regret is defined as what one experiences when a non-chosen alternative in a choice set performs better than the chosen one in relation to one or more attributes. Analysing data from a discrete choice experiment on diet, physical activity and risk of a fatal heart attack in the next ten years, administered to a sample of the Northern Ireland population, we find that the combined use of RUM and RRM models offers additional information, providing useful behavioural insights for better informed policy appraisal.
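The regret function underlying RRM can be sketched as follows, using the standard attribute-level regret form ln(1 + exp(beta*(x_j - x_i))). The three alternatives, two attributes and coefficients below are made up for illustration, not the diet/activity attributes or estimates of the actual experiment.

```python
import math

def rrm_regret(i, X, beta):
    """Systematic regret of alternative i: sum over the other alternatives j
    and attributes m of ln(1 + exp(beta_m * (x_jm - x_im)))."""
    return sum(
        math.log(1.0 + math.exp(b * (xj[m] - X[i][m])))
        for j, xj in enumerate(X) if j != i
        for m, b in enumerate(beta)
    )

def choice_probs(X, beta):
    """Logit choice probabilities over negative regrets."""
    expn = [math.exp(-rrm_regret(i, X, beta)) for i in range(len(X))]
    s = sum(expn)
    return [e / s for e in expn]

# Hypothetical choice set: two "extreme" alternatives and one compromise,
# with both attributes desirable and equal coefficients.
X = [[3.0, 1.0], [1.0, 3.0], [2.0, 2.0]]
beta = [1.0, 1.0]
p = choice_probs(X, beta)
print([round(v, 3) for v in p])  # the compromise alternative gets the highest probability
```

Note how RRM favours the compromise alternative, a behavioural pattern RUM with linear utilities cannot produce; this is one source of the "additional information" the abstract refers to.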
Abstract:
Novice and expert jugglers employ different visuomotor strategies: whereas novices look at the balls around their zeniths, experts tend to fixate their gaze at a central location within the pattern (a so-called gaze-through). A gaze-through strategy may reflect visuomotor parsimony, i.e., the use of simpler visuomotor (oculomotor and/or attentional) strategies as afforded by superior tossing accuracy and error corrections. In addition, the more stable gaze during a gaze-through strategy may result in more accurate movement planning by providing a stable base for gaze-centered neural coding of ball motion and movement plans or for shifts in attention. To determine whether a stable gaze might indeed have such beneficial effects on juggling, we examined juggling variability during 3-ball cascade juggling with and without constrained gaze fixation (at various depths) in expert performers (n = 5). Novice jugglers were included (n = 5) for comparison, even though our predictions pertained specifically to expert juggling. We indeed observed that experts, but not novices, juggled with significantly less variability when fixating, compared to unconstrained viewing. Thus, while visuomotor parsimony might still contribute to the emergence of a gaze-through strategy, this study highlights an additional role for improved movement planning. This role may be engendered by gaze-centered coding and/or attentional control mechanisms in the brain.
The size and shape of shells used by hermit crabs: A multivariate analysis of Clibanarius erythropus
Abstract:
Shell attributes such as weight and shape affect the reproduction, growth, predator avoidance and behaviour of several hermit crab species. Although the importance of these attributes has been extensively investigated, it is still difficult to assess the relative roles of size and shape. Multivariate techniques allow concise and efficient quantitative analysis of these multidimensional properties, and this paper aims to understand their role in determining patterns of hermit crab shell use. To this end, a multivariate approach based on a combination of size-unconstrained (shape) PCA and RDA ordination was used to model the biometrics of southern Mediterranean Clibanarius erythropus populations and their shells. Patterns of shell utilization and morphological gradients demonstrate that size is more important than shape, probably due to the limited availability of empty shells in the environment. The shape (e.g. the degree of shell elongation) and weight of inhabited shells vary considerably in both female and male crabs. However, these variations are clearly accounted for by crab biometrics in males only. On the basis of statistical evidence and findings from past studies, it is hypothesized that larger males of adequate size and strength have access to the larger, heavier and relatively more available shells of the globose Osilinus turbinatus, which cannot be used by average-sized males or by females investing energy in egg production. This greater availability allows larger males to select more suitable shapes. (C) 2009 Elsevier Masson SAS. All rights reserved.
Abstract:
Drilling is a major process in the manufacturing of the holes required for the assembly of composite laminates in the aerospace industry. Simulation of the drilling process is an effective method for optimizing the drill geometry and process parameters in order to improve hole quality and to reduce drill wear. In this research we have developed a three-dimensional (3D) FE model for drilling CFRP. A 3D progressive intra-laminar failure model based on Hashin's theory is considered. An inter-laminar delamination model, which includes the onset and growth of delamination by using a cohesive contact zone, is also developed. The developed model, with inclusion of the improved delamination model and real drill geometry, is used to compare step drills of different stage ratios with a twist drill. Thrust force, torque and workpiece stress distributions are estimated to decrease with the use of a step drill with a high stage ratio. The model indicates that delamination and other workpiece defects could be controlled by selection of a suitable step drill geometry. Hence the 3D model could be used as a design tool for drill geometry for minimization of delamination in CFRP drilling. © 2013 Elsevier Ltd.
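The kind of intra-laminar failure check based on Hashin's theory can be illustrated with the fibre-tension mode alone; the strength values below are placeholder numbers, not calibrated CFRP properties, and the paper's full 3D progressive model involves several further failure modes and stiffness degradation.

```python
def hashin_fiber_tension(s11, t12, Xt, S12):
    """Hashin fibre-tension failure index for a unidirectional ply
    under tensile longitudinal stress (s11 >= 0):
    f = (s11/Xt)^2 + (t12/S12)^2, with failure onset at f >= 1.
    s11, t12: ply stresses; Xt, S12: tensile and shear strengths (MPa)."""
    return (s11 / Xt) ** 2 + (t12 / S12) ** 2

Xt, S12 = 2000.0, 80.0  # assumed strengths, for illustration only
print(hashin_fiber_tension(1000.0, 40.0, Xt, S12))        # 0.5: ply intact
print(hashin_fiber_tension(1800.0, 60.0, Xt, S12) >= 1.0)  # True: onset of failure
```

In a progressive model such an index, evaluated element by element, would trigger degradation of the ply stiffness once it reaches 1.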
Abstract:
We present an implementation of quantum annealing (QA) via lattice Green's function Monte Carlo (GFMC), focusing on its application to the Ising spin glass in a transverse field. In particular, we study whether or not such a method is more effective than path-integral Monte Carlo (PIMC) based QA, as well as classical simulated annealing (CA), previously tested on the same optimization problem. We identify the issue of importance sampling, i.e., the necessity of possessing reasonably good (variational) trial wave functions, as the key point of the algorithm. We performed GFMC-QA runs using such a Boltzmann-type trial wave function, finding results for the residual energies that are qualitatively similar to those of CA (but at a much larger computational cost), and definitely worse than PIMC-QA. We conclude that, at present, without a serious effort in constructing reliable importance sampling variational wave functions for a quantum glass, GFMC-QA is not a true competitor of PIMC-QA.
Abstract:
We present results for a variety of Monte Carlo annealing approaches, both classical and quantum, benchmarked against one another for the textbook optimization exercise of a simple one-dimensional double well. In classical (thermal) annealing, the dependence upon the move chosen in a Metropolis scheme is studied and correlated with the spectrum of the associated Markov transition matrix. In quantum annealing, the path integral Monte Carlo approach is found to yield nontrivial sampling difficulties associated with the tunneling between the two wells. The choice of fictitious quantum kinetic energy is also addressed. We find that a "relativistic" kinetic energy form, leading to a higher probability of long real-space jumps, can be considerably more effective than the standard nonrelativistic one.
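A minimal sketch of the classical (thermal) side of such a benchmark, assuming a quartic double well V(x) = (x^2 - 1)^2 and a Gaussian Metropolis move; the study's specific potentials, move sets and annealing schedules are not reproduced.

```python
import math, random

def anneal_double_well(t0=2.0, t_final=0.01, steps=20000, sigma=0.5, seed=1):
    """Classical Metropolis annealing on V(x) = (x^2 - 1)^2, whose two
    degenerate minima sit at x = +1 and x = -1. Temperature is lowered
    geometrically; the move is a Gaussian jump of width sigma (the 'choice
    of move' whose spectrum the abstract discusses)."""
    rng = random.Random(seed)
    V = lambda x: (x * x - 1.0) ** 2
    x, T = 2.0, t0
    cool = (t_final / t0) ** (1.0 / steps)  # geometric cooling factor
    for _ in range(steps):
        xp = x + rng.gauss(0.0, sigma)
        # Metropolis acceptance: always accept downhill, uphill with prob e^{-dV/T}
        if V(xp) <= V(x) or rng.random() < math.exp((V(x) - V(xp)) / T):
            x = xp
        T *= cool
    return x

x_final = anneal_double_well()
print(round(x_final, 2))  # settles near one of the two minima
```

Widening sigma raises the chance of long real-space jumps between the wells, which is the classical analogue of the "relativistic" kinetic-energy effect mentioned above.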
Abstract:
In this paper, I critically assess John Rawls' repeated claim that the duty of civility is only a moral duty and should not be enforced by law. In the first part of the paper, I examine and reject the view that Rawls' position may be due to the practical difficulties that the legal enforcement of the duty of civility might entail. I thus claim that Rawls' position must be driven by deeper normative reasons grounded in a conception of free speech. In the second part of the paper, I therefore examine various arguments for free speech and critically assess whether they are consistent with Rawls' political liberalism. I first focus on the arguments from truth and self-fulfilment. Both arguments, I argue, rely on comprehensive doctrines and therefore cannot provide a freestanding political justification for free speech. Freedom of speech, I claim, can be justified instead on the basis of Rawls' political conception of the person and of the two moral powers. However, Rawls' wide view of public reason already allows scope for the kind of free speech necessary for the exercise of the two moral powers and therefore cannot explain Rawls' opposition to the legal enforcement of the duty of civility. Such opposition, I claim, can only be explained on the basis of a defence of unconstrained freedom of speech grounded in the ideas of democracy and political legitimacy. Yet, I conclude, while public reason and the duty of civility are essential to political liberalism, unconstrained freedom of speech is not. Rawls and political liberals could therefore renounce unconstrained freedom of speech, and endorse the legal enforcement of the duty of civility, while remaining faithful to political liberalism.
Abstract:
Many graph datasets are labelled with discrete and numeric attributes. Most frequent substructure discovery algorithms ignore numeric attributes; in this paper we show how they can be used to improve search performance and discrimination. Our thesis is that the most descriptive substructures are those which are normative both in terms of their structure and in terms of their numeric values. We explore the relationship between graph structure and the distribution of attribute values and propose an outlier-detection step, which is used as a constraint during substructure discovery. By pruning anomalous vertices and edges, more weight is given to the most descriptive substructures. Our method is applicable to multi-dimensional numeric attributes; we outline how it can be extended for high-dimensional data. We support our findings with experiments on transaction graphs and single large graphs from the domains of physical building security and digital forensics, measuring the effect on runtime, memory requirements and coverage of discovered patterns, relative to the unconstrained approach.
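The outlier-detection step can be sketched for a one-dimensional numeric attribute with a simple z-score prune applied to vertices before substructure discovery; the vertex schema, attribute name and threshold below are hypothetical, and the paper's multi-dimensional extension is not shown.

```python
def prune_outlier_vertices(vertices, attr, z_max=2.0):
    """Drop vertices whose numeric attribute lies more than z_max standard
    deviations from the mean, so anomalous vertices do not dilute the
    normative substructures. 'vertices' is a list of dicts with a numeric
    label under 'attr' (hypothetical schema)."""
    vals = [v[attr] for v in vertices]
    n = len(vals)
    mean = sum(vals) / n
    sd = (sum((x - mean) ** 2 for x in vals) / n) ** 0.5 or 1.0
    return [v for v in vertices if abs(v[attr] - mean) / sd <= z_max]

# Five typical vertices and one anomalous one (duration 40.0).
vertices = [{"id": i, "duration": d}
            for i, d in enumerate([5.0, 6.0, 5.5, 6.2, 5.8, 40.0])]
kept = prune_outlier_vertices(vertices, "duration")
print([v["id"] for v in kept])  # the 40.0 vertex is pruned
```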
Abstract:
Due to increasing water scarcity, accelerating industrialization and urbanization, efficiency of irrigation water use in Northern China needs urgent improvement. Based on a sample of 347 wheat growers in the Guanzhong Plain, this paper simultaneously estimates a production function, and its corresponding first-order conditions for cost minimization, to analyze efficiency of irrigation water use. The main findings are that average technical, allocative, and overall economic efficiency are 0.35, 0.86 and 0.80, respectively. In a second stage analysis, we find that farmers’ perception of water scarcity, water price and irrigation infrastructure increase irrigation water allocative efficiency, while land fragmentation decreases it. We also show that farmers’ income loss due to higher water prices can be offset by increasing irrigation water use efficiency.
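A sketch of the kind of first-order condition being estimated, assuming a Cobb-Douglas technology for illustration; the elasticities and prices below are invented, not the paper's estimates, and the stochastic-frontier estimation itself is not shown.

```python
def mrts_cobb_douglas(alpha_w, alpha_x, water, other):
    """Marginal rate of technical substitution of water for the other input
    under y = A * water^alpha_w * other^alpha_x:
    MRTS = (alpha_w / alpha_x) * (other / water)."""
    return (alpha_w / alpha_x) * (other / water)

def allocatively_efficient(alpha_w, alpha_x, water, other, p_w, p_x, tol=1e-6):
    """Cost-minimisation first-order condition: MRTS equals the input
    price ratio p_w / p_x. Deviations signal allocative inefficiency."""
    return abs(mrts_cobb_douglas(alpha_w, alpha_x, water, other) - p_w / p_x) < tol

# Illustrative elasticities and prices; the FOC solved for the input ratio
# gives water/other = (alpha_w/alpha_x) * (p_x/p_w) = 1 here.
alpha_w, alpha_x = 0.2, 0.4
p_w, p_x = 1.0, 2.0
print(allocatively_efficient(alpha_w, alpha_x, 10.0, 10.0, p_w, p_x))  # True
print(allocatively_efficient(alpha_w, alpha_x, 20.0, 10.0, p_w, p_x))  # False: over-irrigating
```

The second call illustrates the paper's point: at a higher water price the efficient water/other ratio falls, so a farmer who keeps over-irrigating bears an avoidable cost.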
Abstract:
Sparse representation based visual tracking approaches have attracted increasing interest in the community in recent years. The main idea is to linearly represent each target candidate using a set of target and trivial templates while imposing a sparsity constraint onto the representation coefficients. After we obtain the coefficients using L1-norm minimization methods, the candidate with the lowest error, when it is reconstructed using only the target templates and the associated coefficients, is considered the tracking result. In spite of the promising performance widely reported, it is unclear whether the performance of these trackers can be maximised. In addition, the computational complexity caused by the dimensionality of the feature space limits these algorithms in real-time applications. In this paper, we propose a real-time visual tracking method based on structurally random projection and weighted least squares techniques. In particular, to enhance the discriminative capability of the tracker, we introduce background templates to the linear representation framework. To handle appearance variations over time, we relax the sparsity constraint using a weighted least squares (WLS) method to obtain the representation coefficients. To further reduce the computational complexity, structurally random projection is used to reduce the dimensionality of the feature space while preserving the pairwise distances between the data points in the feature space. Experimental results show that the proposed approach outperforms several state-of-the-art tracking methods.
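The two ingredients of such a tracker can be sketched as follows, with a dense Gaussian projection standing in for the structurally random projection and uniform weights in the WLS step; all dimensions, templates and the candidate below are illustrative random data, not the paper's setup.

```python
import numpy as np

def random_projection(d_high, d_low, rng):
    """Dense Gaussian random projection (a simple stand-in for a structurally
    random projection); preserves pairwise distances approximately by the
    Johnson-Lindenstrauss property."""
    return rng.normal(size=(d_low, d_high)) / np.sqrt(d_low)

def wls_coefficients(T, y, w):
    """Weighted least squares: c = (T^T W T)^{-1} T^T W y,
    the relaxed (non-sparse) alternative to l1 minimization."""
    W = np.diag(w)
    return np.linalg.solve(T.T @ W @ T, T.T @ W @ y)

rng = np.random.default_rng(42)
P = random_projection(1024, 64, rng)        # compress 1024-dim features to 64
templates = rng.normal(size=(1024, 10))     # columns: target + background templates
y = templates[:, 0] + 0.01 * rng.normal(size=1024)  # candidate ~ template 0

T_low, y_low = P @ templates, P @ y         # solve in the projected space
c = wls_coefficients(T_low, y_low, np.ones(64))
print(int(np.argmax(np.abs(c))))            # index of the dominant template
```

Solving the 10-unknown system in the 64-dimensional projected space, instead of the original 1024 dimensions, is where the real-time saving comes from.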