959 results for Average Case Complexity


Relevance: 20.00%

Abstract:

Simulated annealing (SA) is an optimization technique that can handle cost functions with varying degrees of nonlinearity, discontinuity and stochasticity, as well as arbitrary boundary conditions and constraints imposed on these cost functions. The SA technique is applied to the problem of robot path planning. Three situations are considered: the path represented as a polyline, as a Bezier curve, and as a spline-interpolated curve. In the proposed SA algorithm, the sensitivity of each continuous parameter is evaluated at each iteration, increasing the number of accepted solutions. The sensitivity of each parameter is associated with its probability distribution in the definition of the next candidate. (C) 2010 Elsevier Ltd. All rights reserved.
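As a rough illustration of the acceptance mechanism underlying SA (not the paper's specific algorithm: the fixed polyline parameterization, Gaussian step size, linear cooling schedule and vertex-based obstacle penalty below are all illustrative assumptions):

```python
import math
import random

def path_cost(pts, obstacles):
    """Polyline length plus a penalty for vertices inside circular obstacles."""
    length = sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
    penalty = 0.0
    for (x, y) in pts:
        for (cx, cy, r) in obstacles:
            d = math.hypot(x - cx, y - cy)
            if d < r:
                penalty += (r - d) * 100.0  # strongly discourage collisions
    return length + penalty

def anneal_path(start, goal, obstacles, n_pts=6, iters=5000, t0=1.0, seed=0):
    """Classic SA over the interior vertices of a polyline path."""
    rng = random.Random(seed)
    # Initial guess: straight line from start to goal.
    pts = [(start[0] + (goal[0] - start[0]) * i / (n_pts - 1),
            start[1] + (goal[1] - start[1]) * i / (n_pts - 1))
           for i in range(n_pts)]
    cost = path_cost(pts, obstacles)
    for k in range(iters):
        temp = t0 * (1.0 - k / iters) + 1e-6       # linear cooling schedule
        i = rng.randrange(1, n_pts - 1)            # never move the endpoints
        cand = list(pts)
        cand[i] = (pts[i][0] + rng.gauss(0, 0.2),
                   pts[i][1] + rng.gauss(0, 0.2))
        c = path_cost(cand, obstacles)
        # Metropolis acceptance: always take improvements, sometimes worse moves.
        if c < cost or rng.random() < math.exp((cost - c) / temp):
            pts, cost = cand, c
    return pts, cost
```

A straight line from (0, 0) to (4, 0) through an obstacle at (2, 0) starts with a large penalty; annealing bends the path around it.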

Relevance: 20.00%

Abstract:

Several steel chain links exhibited cracking during their manufacturing process, which includes induction case hardening and electrogalvanizing steps. Fractographic examination of the exposed crack surfaces revealed intergranular cracking, with some areas featuring a thin layer of iron oxide, indicating that the cracking took place after the electrogalvanizing step. The location of the cracks coincided with the position of the deepest case-hardened layer, suggesting the occurrence of localized overheating during the induction case hardening step. Finite element analysis of the induction heating (COSMOS DesignStar software) confirmed that, during case hardening, the austenitising temperature in the crack region reached values of approximately 1050 °C. The results indicated that the intergranular cracking was caused by hydrogen embrittlement. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

The Cluster Variation Method (CVM), introduced over 50 years ago by Prof. Dr. Ryoichi Kikuchi, is applied to the thermodynamic modeling of the BCC Cr-Fe system in the irregular tetrahedron approximation, using experimental thermochemical data as initial input for assessing the model parameters. The results are checked against independent data on the low-temperature miscibility gap, using increasingly accurate thermodynamic models: first by including the magnetic degrees of freedom of iron, and then also those of chromium. It is shown that a reasonably accurate description of the phase diagram at the iron-rich side (i.e. the miscibility gap borders and the Curie line) is obtained, but only at the expense of the agreement with the above-mentioned thermochemical data. Reasons for these inconsistencies are discussed, especially with regard to the need to introduce vibrational degrees of freedom into the CVM model. (C) 2008 Elsevier Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

The procedure for online process control by attributes consists of inspecting a single item after every m produced items. On the basis of the inspection result, it is decided whether the process is in control (the conforming fraction is stable) or out of control (the conforming fraction has decreased, for example). Most articles about online process control consider stopping the production process for an adjustment when the inspected item is non-conforming (production then restarts in control; this is denominated here a corrective adjustment). Moreover, articles on this subject present only semi-economical designs (which may yield large quantities of non-conforming items), as they do not include a policy of preventive adjustments (after which no item needs to be inspected); such a policy can be more economical, mainly when the inspected item can be misclassified. In this article, a choice between a preventive adjustment and an inspection is made at every m-th produced item. If a preventive adjustment is decided upon, no item is inspected; otherwise, the m-th item is inspected: if it conforms, production goes on, and if not, a corrective adjustment takes place and the process restarts in control. This approach is economically feasible in some practical situations, and the parameters of the proposed procedure are determined by minimizing an average cost function subject to statistical restrictions (for example, assuring a minimal level, fixed in advance, of conforming items in the production process). Numerical examples illustrate the proposal.
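A minimal Monte Carlo sketch can show how an average cost per item arises under the corrective policy described above; the conforming fractions, shift probability and cost figures below are invented for illustration and are not taken from the article:

```python
import random

def simulate_avg_cost(m, p_in=0.99, p_out=0.90, p_shift=0.01,
                      c_insp=1.0, c_adj=20.0, c_nc=50.0,
                      n_items=200_000, seed=1):
    """Monte Carlo estimate of the average cost per item for the corrective
    policy: inspect every m-th item; adjust (and restart in control) when
    the inspected item is non-conforming.  All parameters are illustrative."""
    rng = random.Random(seed)
    in_control, cost = True, 0.0
    for i in range(1, n_items + 1):
        if in_control and rng.random() < p_shift:
            in_control = False                 # process shifts between items
        p = p_in if in_control else p_out
        conforming = rng.random() < p
        if not conforming:
            cost += c_nc                       # cost of a non-conforming item
        if i % m == 0:                         # inspect the m-th item
            cost += c_insp
            if not conforming:
                cost += c_adj
                in_control = True              # corrective adjustment
    return cost / n_items
```

Sweeping m trades inspection cost against the time an out-of-control process goes undetected, which is what the article's average cost function balances.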

Relevance: 20.00%

Abstract:

In this paper the continuous Verhulst dynamic model is used to synthesize a new distributed power control algorithm (DPCA) for use in direct sequence code division multiple access (DS-CDMA) systems. The Verhulst model was initially designed to describe the population growth of biological species under food and physical space restrictions. The discretization of the corresponding differential equation is accomplished via the Euler numerical integration (ENI) method. Analytical convergence conditions for the proposed DPCA are also established. Several properties of the proposed recursive algorithm, such as the Euclidean distance from the optimum vector after convergence, convergence speed, normalized mean squared error (NSE), average power consumption per user, performance under dynamic channels, and implementation complexity, are analyzed through simulations. The simulation results are compared with those of two other DPCAs: the classic algorithm derived by Foschini and Miljanic and the sigmoidal algorithm of Uykan and Koivo. Under estimation-error conditions, the proposed DPCA exhibits a smaller discrepancy from the optimum power vector solution and better convergence (under both fixed and adaptive convergence factors) than the classic and sigmoidal DPCAs. (C) 2010 Elsevier GmbH. All rights reserved.
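A common Euler-discretized Verhulst-type recursion in the DPCA literature takes the form p[k+1] = p[k](1 + alpha(1 - gamma[k]/gamma*)), where gamma[k] is the measured SINR and gamma* the target. The sketch below assumes that form with a static gain matrix; it is an illustration, not the paper's exact algorithm:

```python
def sinr(G, p, noise):
    """SINR of each user: own received power over interference plus noise."""
    n = len(p)
    return [G[i][i] * p[i] /
            (sum(G[i][j] * p[j] for j in range(n) if j != i) + noise)
            for i in range(n)]

def verhulst_dpca(G, gamma_t, noise, alpha=0.5, iters=200, p0=0.01):
    """Verhulst-based distributed power control: each user scales its power
    by 1 + alpha * (1 - sinr/target), so power grows below target and
    shrinks above it, settling at the equilibrium (logistic) point."""
    n = len(G)
    p = [p0] * n
    for _ in range(iters):
        g = sinr(G, p, noise)
        p = [p[i] * (1.0 + alpha * (1.0 - g[i] / gamma_t[i]))
             for i in range(n)]
    return p
```

For a feasible symmetric 3-user system the recursion drives every user's SINR to the common target.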

Relevance: 20.00%

Abstract:

The main goal of this paper is to apply the so-called policy iteration algorithm (PIA) to the long-run average continuous control problem of piecewise deterministic Markov processes (PDMPs) taking values in a general Borel space, with a compact action space depending on the state variable. To do so, we first derive some important properties of a pseudo-Poisson equation associated with the problem. It is then shown that the convergence of the PIA to a solution satisfying the optimality equation holds under some classical hypotheses, and that this optimal solution yields an optimal control strategy for the average control problem for the continuous-time PDMP in feedback form.

Relevance: 20.00%

Abstract:

This work is concerned with the existence of an optimal control strategy for the long-run average continuous control problem of piecewise-deterministic Markov processes (PDMPs). In Costa and Dufour (2008), sufficient conditions were derived to ensure the existence of an optimal control by using the vanishing discount approach. These conditions were mainly expressed in terms of the relative difference of the alpha-discount value functions. The main goal of this paper is to derive tractable conditions, directly related to the primitive data of the PDMP, that ensure the existence of an optimal control. The present work can be seen as a continuation of the results derived in Costa and Dufour (2008). Our main assumptions are written in terms of some integro-differential inequalities related to the so-called expected growth condition, and geometric convergence of the post-jump location kernel associated with the PDMP. An example based on the capacity expansion problem is presented, illustrating the possible applications of the results developed in the paper.

Relevance: 20.00%

Abstract:

A network of Kuramoto oscillators with different natural frequencies is optimized for enhanced synchronizability. All node inputs are normalized by the node connectivity, and some important properties of the network structure are determined in this case: (i) optimized networks present a strong anti-correlation between the natural frequencies of adjacent nodes; (ii) this anti-correlation should be as high as possible, since the average path length between nodes remains as small as in random networks; and (iii) high anti-correlation is obtained without any relation between the nodes' natural frequencies and their degree of connectivity. We also propose a network construction model with which it is shown that high anti-correlation and small average paths may be achieved by randomly rewiring a fraction of the links of a totally anti-correlated network, and that these networks present optimal synchronization properties. (C) 2008 Elsevier B.V. All rights reserved.
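The degree-normalized coupling mentioned above, dtheta_i/dt = omega_i + (K/k_i) * sum_j A_ij sin(theta_j - theta_i), can be sketched with plain Euler integration; the network, frequencies and integration parameters below are illustrative choices, not the paper's optimization setup:

```python
import cmath
import math
import random

def kuramoto_order(adj, omega, K=1.0, dt=0.05, steps=4000, seed=0):
    """Integrate the Kuramoto model with each node's input normalized by
    its degree, and return the final phase-coherence order parameter
    r = |mean(exp(i*theta))| in [0, 1]."""
    rng = random.Random(seed)
    n = len(omega)
    theta = [rng.uniform(0, 2 * math.pi) for _ in range(n)]
    deg = [max(1, len(adj[i])) for i in range(n)]
    for _ in range(steps):
        dtheta = [omega[i] + (K / deg[i]) *
                  sum(math.sin(theta[j] - theta[i]) for j in adj[i])
                  for i in range(n)]
        theta = [theta[i] + dt * dtheta[i] for i in range(n)]
    return abs(sum(cmath.exp(1j * t) for t in theta)) / n
```

On a fully connected 10-node network with alternating natural frequencies of +/-0.05, the oscillators phase-lock and r approaches 1.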

Relevance: 20.00%

Abstract:

This essay attempts to give some mathematical ideas about the concept of biological complexity, exploring four attributes considered essential to characterize a complex system in a biological context: decomposition, heterogeneous assembly, self-organization, and adequacy. It is a theoretical and speculative approach, opening possibilities for further numerical and experimental work, illustrated by references to several studies that applied the concepts presented here. (C) 2008 Elsevier B.V. All rights reserved.

Relevance: 20.00%

Abstract:

This essay attempts to measure complexity in a three-trophic-level system by using a convex function of the informational entropy. The complexity measure defined here is compatible with the fact that real complexity lies between ordered and disordered states. Applying this measure to data collected for two three-trophic-level systems yields some hints about their organization. (C) 2008 Elsevier B.V. All rights reserved.
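The abstract does not specify the functional form of the measure; one simple entropy-based measure with the stated property (vanishing for both fully ordered and fully disordered states, maximal in between) is sketched below as an illustration only:

```python
import math

def shannon_entropy(p):
    """Shannon entropy (nats) of a discrete probability distribution."""
    return -sum(x * math.log(x) for x in p if x > 0)

def complexity(p):
    """Illustrative complexity measure: C = 4*h*(1-h), where h is the
    entropy normalized by its maximum log(n).  C = 0 for a fully ordered
    state (one certain outcome) and for a fully disordered (uniform)
    state; C peaks for intermediate distributions.  This particular
    functional form is an assumption, not the paper's definition."""
    n = len(p)
    if n < 2:
        return 0.0
    h = shannon_entropy(p) / math.log(n)
    return 4.0 * h * (1.0 - h)
```

Applied to, say, relative abundances at each trophic level, the measure distinguishes near-uniform and near-degenerate distributions from intermediate ones.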

Relevance: 20.00%

Abstract:

This letter addresses the optimization and complexity reduction of switch-reconfigured antennas. A new optimization technique based on graph models is investigated. This technique is used to minimize the redundancy in a reconfigurable antenna structure and reduce its complexity. A graph modeling rule for switch-reconfigured antennas is proposed, and examples are presented.

Relevance: 20.00%

Abstract:

Highly redundant or statically indeterminate structures, such as cable-stayed bridges, are of particular concern to the engineering community because of the complex parameters that must be taken into account for health monitoring. The purpose of this study was to verify the reliability and practicability of using GPS to characterize the dynamic oscillations of small-span bridges. The test was carried out on a cable-stayed wood footbridge at the Escola de Engenharia de Sao Carlos-Universidade de Sao Paulo, Brazil. Initially, a static load trial was carried out to get an idea of the deck amplitude and oscillation frequency. After that, a calibration trial was carried out by applying a well-known oscillation to the rover antenna to check the detectable limits of the method in that environment. Finally, a dynamic load trial was carried out using GPS and a displacement transducer to measure the deck oscillation, the transducer serving only to confirm the GPS results. The results show that the frequencies and amplitude displacements obtained by GPS are in good agreement with the displacement transducer responses. GPS can thus be used as a reliable tool to characterize the dynamic behavior of structures such as cable-stayed footbridges undergoing dynamic loads.
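Extracting the dominant oscillation frequency from a displacement record, as is done when comparing GPS and transducer responses, can be sketched with a plain DFT; the sampling rate and signal below are synthetic examples, not the study's data:

```python
import cmath
import math

def dominant_frequency(samples, fs):
    """Return the dominant frequency (Hz) of a real time series sampled
    at fs Hz, via a direct DFT (O(n^2); fine for short records)."""
    n = len(samples)
    mean = sum(samples) / n
    x = [s - mean for s in samples]          # remove the static offset
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):               # positive frequencies only
        coef = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                   for t in range(n))
        if abs(coef) > best_mag:
            best_k, best_mag = k, abs(coef)
    return best_k * fs / n
```

A synthetic 2 Hz deck oscillation sampled at 20 Hz (a typical GPS output rate) is recovered exactly when it falls on a DFT bin.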

Relevance: 20.00%

Abstract:

This article discusses the impact on firms' profitability of Complementary Law 102/2000 (which abrogated Law 89/96, the Kandir Law), which allows the appropriation of ICMS credits arising from investment in fixed-asset goods at a rate of 1/48 per month. The paper seeks to demonstrate how this new system, which turned the ICMS from a consumption-type value added tax (VAT) into an income-type one, leads to a loss of approximately 30% of the value of the credits to be recovered, and the effect this has on the cost of investment and on the profits of small, medium and large firms. From the methodological point of view, this is descriptive and quantitative research that proceeded in three stages. Initially, we obtained estimates of net sales and investment volumes, based on the report Painel de Competitividade prepared by the Federacao das Industrias do Estado de Sao Paulo (Fiesp/Serasa). Based on this information, we estimated the factors generating ICMS debits and credits, using the fixed-asset credit control model (CIAP). Finally, we calculated three indicators: (i) present value of credit recovery / value of credits; (ii) present value of credit recovery / investment value; (iii) present value of credit recovery / sales profitability. We conclude that the system introduced by Complementary Law 102/2000 imposes a large opportunity cost on firms and that the legislation should be reviewed from this perspective, aiming to ensure lower costs for investment projects.
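A loss of roughly 30% is consistent with a simple present-value calculation over the 48 monthly installments; the 1.5% monthly discount rate below is an illustrative assumption, not a figure taken from the article:

```python
def credit_recovery_factor(n_months=48, monthly_rate=0.015):
    """Present value of recovering a credit in equal monthly fractions of
    1/n over n months, as a fraction of immediate (full) recovery.
    The discount rate is an illustrative assumption."""
    installment = 1.0 / n_months
    return sum(installment / (1.0 + monthly_rate) ** t
               for t in range(1, n_months + 1))
```

At 1.5% per month the factor is about 0.71, i.e. a loss of roughly 29% relative to immediate appropriation, close to the ~30% the article reports.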

Relevance: 20.00%

Abstract:

The application of airborne laser scanning (ALS) technologies in forest inventories has shown great potential to improve the efficiency of forest planning activities. Precise estimates, fast assessment and relatively low complexity explain the good results in terms of efficiency. The evolution of GPS and inertial measurement technologies, as well as the lower assessment costs observed when these technologies are applied to large-scale studies, explain the increasing dissemination of ALS technologies. The good quality of the results can be expressed by estimates of volume and basal area with estimated errors below 8.4%, depending on the size of the sampled area, the number of laser pulses per square meter and the number of control plots. This paper analyzes the potential of an ALS assessment to produce certain forest inventory statistics in plantations of cloned Eucalyptus spp. with precision equal or superior to conventional methods. The statistics of interest were: volume, basal area, mean height and dominant trees' mean height. The ALS flight for data assessment covered two strips of approximately 2 by 20 km, in which clouds of points were sampled in circular plots with a radius of 13 m. Plots were sampled in different parts of the strips to cover different stand ages. From the clouds of points generated by the ALS assessment the following statistics were calculated: overall mean height, standard error, five height percentiles (the heights below which 10%, 30%, 50%, 70% and 90% of the ALS points above ground level lie), and the density of points above ground level in each percentile. The ALS statistics were used in regression models to estimate mean diameter, mean height, mean height of dominant trees, basal area and volume. Conventional forest inventory sample plots provided the real data.
For volume, an exploratory assessment involving different combinations of ALS statistics allowed the definition of the most promising relationships and fitting tests based on well-known forest biometric models. The models based on ALS statistics that produced the best results involved: the 30% percentile to estimate mean diameter (R² = 0.88, MQE% = 0.0004); the 10% and 90% percentiles to estimate mean height (R² = 0.94, MQE% = 0.0003); the 90% percentile to estimate dominant height (R² = 0.96, MQE% = 0.0003); the 10% percentile and the mean height of ALS points to estimate basal area (R² = 0.92, MQE% = 0.0016); and age plus the 30% and 90% percentiles to estimate volume (R² = 0.95, MQE% = 0.002). Among the tested forest biometric models, the best fits were provided by the modified Schumacher model using age and the 90% percentile, the modified Clutter model using age, the mean height of ALS points and the 70% percentile, and the modified Buckman model using age, the mean height of ALS points and the 10% percentile.
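Fitting a model of the modified-Schumacher form named above can be sketched with ordinary least squares on log-transformed data; the choice of predictors (1/age and the log of the 90% height percentile) and the synthetic data are illustrative assumptions, not the paper's fitted model:

```python
import math

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X b = X'y),
    solved by Gaussian elimination with partial pivoting."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for c in range(k):
        piv = max(range(c, k), key=lambda rr: abs(A[rr][c]))
        A[c], A[piv] = A[piv], A[c]
        b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for j in range(c, k):
                A[r][j] -= f * A[c][j]
            b[r] -= f * b[c]
    beta = [0.0] * k
    for c in range(k - 1, -1, -1):
        beta[c] = (b[c] - sum(A[c][j] * beta[j]
                              for j in range(c + 1, k))) / A[c][c]
    return beta

def fit_schumacher(ages, p90s, volumes):
    """Fit ln(V) = b0 + b1/age + b2*ln(p90), a modified-Schumacher form."""
    X = [[1.0, 1.0 / a, math.log(p)] for a, p in zip(ages, p90s)]
    return ols(X, [math.log(v) for v in volumes])
```

On noise-free synthetic data the fit recovers the generating coefficients, which is a useful sanity check before applying the form to plot data.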

Relevance: 20.00%

Abstract:

In Rondonia State, Brazil, settlement processes have cleared 68,000 km² of tropical forest since the 1970s. The intensity of deforestation has differed by region depending on driving factors such as roads and economic activities. Different histories of land-use activities and rates of change have resulted in mosaics of forest patches embedded in an agricultural matrix. Most assessments of deforestation and its effects on vegetation, soil and water focus on landscape patterns of current conditions, yet historical deforestation dynamics can strongly influence those conditions. Here, we develop and describe the use of four land-use dynamics indicators to capture the historical land-use changes of catchments and to measure the rate of deforestation (annual deforestation rate), the forest regeneration level (secondary forest mean proportion), the time since disturbance (mean time since deforestation) and the deforestation profile (deforestation profile curvature). We used the proposed indices to analyze a watershed located in central Rondonia. Landsat TM and ETM+ images were used to produce historical land-use maps of the last 18 years, one for each even year from 1984 to 2002, for 20 catchments. We found that the land-use dynamics indicators are able to distinguish catchments with different land-use change profiles. Four categories of historical land use were identified: old and dominant pasture cover on small properties; recent deforestation and dominance of secondary growth; old extensive pastures and large forest remnants; and recent deforestation with pasture and large forest remnants. Knowing historical deforestation processes is important to develop appropriate conservation strategies and to define priorities and actions for conserving forests currently under deforestation. (C) 2009 Elsevier B.V. All rights reserved.
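An annual deforestation rate indicator can be computed from two forest-area observations; the compound-rate form below is a standard choice and an assumption about the paper's exact definition:

```python
def annual_deforestation_rate(a1, a2, years):
    """Annual (compound) deforestation rate for a catchment whose forest
    area goes from a1 to a2 over 'years' years:
    r = 1 - (a2 / a1) ** (1 / years).
    r = 0 when no forest is lost; r approaches 1 for total clearing."""
    return 1.0 - (a2 / a1) ** (1.0 / years)
```

For example, a catchment losing 30% of its forest over the 18-year study window corresponds to an annual rate just under 2%.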