995 results for optimal scaling


Relevance: 100.00%

Abstract:

This work is concerned with the derivation of optimal scaling laws, in the sense of matching lower and upper bounds on the energy, for a solid undergoing ductile fracture. The specific problem considered concerns a material sample in the form of an infinite slab of finite thickness subjected to prescribed opening displacements on its two surfaces. The solid is assumed to obey deformation theory of plasticity and, in order to further simplify the analysis, we assume isotropic rigid-plastic deformations with zero plastic spin. When hardening exponents are given values consistent with observation, the energy is found to exhibit sublinear growth. We regularize the energy through the addition of nonlocal energy terms of the strain-gradient plasticity type. This nonlocal regularization has the effect of introducing an intrinsic length scale into the energy. We also put forth a physical argument that identifies the intrinsic length and suggests a linear growth of the nonlocal energy. Under these assumptions, ductile fracture emerges as the net result of two competing effects: whereas the sublinear growth of the local energy promotes localization of deformation to failure planes, the nonlocal regularization stabilizes this process, thus resulting in an orderly progression towards failure and a well-defined specific fracture energy. The optimal scaling laws derived here show that ductile fracture results from localization of deformations to void sheets, and that it requires a well-defined energy per unit fracture area. In particular, fractal modes of fracture are ruled out under the assumptions of the analysis. The optimal scaling laws additionally show that ductile fracture is cohesive in nature, i.e., it obeys a well-defined relation between tractions and opening displacements. Finally, the scaling laws supply a link between micromechanical properties and macroscopic fracture properties. In particular, they reveal the relative roles that surface energy and microplasticity play as contributors to the specific fracture energy of the material. Next, we present an experimental assessment of the optimal scaling laws. We show that when the specific fracture energy is renormalized in a manner suggested by the optimal scaling laws, the data fall within the bounds predicted by the analysis and, moreover, ostensibly collapse (with allowances made for experimental scatter) onto a master curve that depends on the hardening exponent but is otherwise material independent.
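As a schematic illustration only (the symbols below, W, u, p, ℓ, δ, C1, C2, are illustrative and not taken from the paper), the type of regularized energy and matching-bound statement described above can be written as:

```latex
% Schematic only; the notation is illustrative, not the paper's.
\[
  E[u] \;=\;
  \underbrace{\int_\Omega W(\nabla u)\,dx}_{\text{local energy, } W(F)\sim |F|^{p},\ 0<p<1}
  \;+\;
  \underbrace{\ell \int_\Omega \lvert \nabla^2 u \rvert\,dx}_{\text{strain-gradient (nonlocal) term}}
\]
\[
  C_1\,\phi(\delta) \;\le\; \inf_u E[u] \;\le\; C_2\,\phi(\delta)
\]
% An "optimal scaling law" means a pair of bounds of this matching form: the
% same dependence \phi on the prescribed opening displacement \delta appears in
% both the lower and the upper bound, so the minimum energy is determined up to
% multiplicative constants.
```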

Relevance: 60.00%

Abstract:

This work considers the identification of the available whitespace, i.e., the regions that do not contain any existing transmitter, within a given geographical area. To this end, n sensors are deployed at random locations within the area. These sensors detect the presence of a transmitter within their radio range r(s) using a binary sensing model, and their individual decisions are combined to estimate the available whitespace. The limiting behavior of the recovered whitespace as a function of n and r(s) is analyzed. It is shown that both the fraction of the available whitespace that the nodes fail to recover and their radio range optimally scale as log(n)/n as n gets large. The problem of minimizing the sum absolute error in transmitter localization is also analyzed, and the corresponding optimal scaling of the radio range and the necessary minimum transmitter separation are determined.
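A toy Monte Carlo sketch of this setting (not the paper's model) is given below; the constant c and the choice of radio range, with a per-sensor coverage area proportional to log(n)/n, are assumptions made purely for illustration.

```python
"""Toy Monte Carlo sketch (not the paper's model): n sensors with a binary
disc-sensing model are dropped uniformly in the unit square, and the fraction
of the region they fail to cover is estimated on a grid of probe points.
The constant c and the choice r_s = sqrt(c * log(n) / n), i.e. a per-sensor
coverage *area* proportional to log(n)/n, are illustrative assumptions."""
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

def missed_fraction(n, c=2.0, grid=200):
    sensors = rng.uniform(0.0, 1.0, size=(n, 2))   # random sensor deployment
    r_s = np.sqrt(c * np.log(n) / n)               # radio range shrinking with n
    # Probe points on a regular grid stand in for the whitespace region.
    xs = (np.arange(grid) + 0.5) / grid
    px, py = np.meshgrid(xs, xs)
    probes = np.column_stack([px.ravel(), py.ravel()])
    # A probe point is "recovered" if its nearest sensor lies within r_s.
    dist, _ = cKDTree(sensors).query(probes, k=1)
    return float((dist > r_s).mean())

for n in (100, 400, 1600, 6400):
    print(f"n={n:5d}  missed fraction ~ {missed_fraction(n):.4f}"
          f"  log(n)/n = {np.log(n)/n:.5f}")
```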

Relevance: 60.00%

Abstract:

The problem of cooperative beamforming for maximizing the achievable data rate of an energy-constrained two-hop amplify-and-forward (AF) network is considered. Assuming perfect channel state information (CSI) at all the nodes, we evaluate the optimal scaling factor for the relay nodes. In addition to an individual power constraint on each of the relay nodes, we consider a weighted sum power constraint. The proposed iterative algorithm initially solves a set of relaxed problems with the weighted sum power constraint and then updates the solution to accommodate the individual constraints. These relaxed problems are in turn solved using a sequence of Quadratic Eigenvalue Problems (QEPs). The key contribution of this letter is the generalization of cooperative beamforming to incorporate both the individual and weighted sum constraints. Furthermore, we propose a novel algorithm based on the QEP and discuss its convergence.
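The relaxed subproblems are said to reduce to Quadratic Eigenvalue Problems. Below is a minimal, generic sketch (not the letter's iterative beamforming algorithm) of how a single QEP, (λ²M + λC + K)x = 0, can be solved by companion linearization; M, C and K are random placeholders standing in for whatever matrices the relaxed subproblem produces.

```python
"""Generic companion-linearization solver for one Quadratic Eigenvalue Problem,
(lam**2 * M + lam * C + K) @ x = 0. Not the letter's algorithm; M, C, K are
placeholders."""
import numpy as np
from scipy.linalg import eig

def solve_qep(M, C, K):
    n = M.shape[0]
    I = np.eye(n)
    Z = np.zeros((n, n))
    # First companion linearization: A @ z = lam * B @ z with z = [x, lam*x].
    A = np.block([[Z, I],
                  [-K, -C]])
    B = np.block([[I, Z],
                  [Z, M]])
    lam, Zvecs = eig(A, B)
    X = Zvecs[:n, :]          # top block recovers the QEP eigenvectors
    return lam, X

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 4
    M = np.eye(n)
    C = rng.standard_normal((n, n))
    K = rng.standard_normal((n, n))
    lam, X = solve_qep(M, C, K)
    # Check the residual of the smallest-magnitude eigenpair.
    i = np.argmin(np.abs(lam))
    r = (lam[i] ** 2 * M + lam[i] * C + K) @ X[:, i]
    print("eigenvalue:", lam[i], " residual norm:", np.linalg.norm(r))
```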

Relevance: 60.00%

Abstract:

This thesis aims at a simple one-parameter macroscopic model of distributed damage and fracture of polymers that is amenable to a straightforward and efficient numerical implementation. The failure model is motivated by post-mortem fractographic observations of void nucleation, growth and coalescence in polyurea stretched to failure, and accounts for the specific fracture energy per unit area attendant to rupture of the material.

Furthermore, it is shown that the macroscopic model can be rigorously derived, in the sense of optimal scaling, from a micromechanical model of chain elasticity and failure regularized by means of fractional strain-gradient elasticity. Optimal scaling laws are derived that supply a link between the single parameter of the macroscopic model, namely the critical energy-release rate of the material, and micromechanical parameters pertaining to the elasticity and strength of the polymer chains and to the strain-gradient elasticity regularization. Based on these optimal scaling laws, it is shown how the critical energy-release rate of specific materials can be determined from test data. In addition, the scope and fidelity of the model are demonstrated by means of an example of application, namely Taylor-impact experiments on polyurea rods. In these simulations, optimal transportation meshfree approximation schemes using maximum-entropy interpolation functions are employed.

Finally, a different crazing model using full derivatives of the deformation gradient and a core cut-off is presented, along with a numerical non-local regularization model. The numerical model takes into account higher-order deformation gradients in a finite element framework. It is shown how the introduction of non-locality into the model stabilizes the effect of strain localization to small volumes in materials undergoing softening. From an investigation of craze formation in the limit of large deformations, convergence studies verifying scaling properties of both the local and non-local energy contributions are presented.
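As a generic schematic of the kind of non-local (gradient) regularization described above (the symbols are illustrative and not the thesis' notation):

```latex
% Generic schematic of a gradient-regularized energy; W, u, \ell, \Omega are
% illustrative symbols, not the thesis' notation.
\[
  E_\ell[u] \;=\; \int_\Omega W(\nabla u)\,dx
  \;+\; \ell^{\,2} \int_\Omega \lvert \nabla^2 u \rvert^{2}\,dx .
\]
% With a softening local density W alone, minimizing sequences localize the
% deformation into vanishingly thin bands; the higher-order term penalizes
% sharp gradients of \nabla u and sets a finite width, of order \ell, for the
% localization zone, which is the stabilizing effect referred to above.
```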

Relevance: 60.00%

Abstract:

This paper analyzes, both statically and dynamically, the changes in the living conditions of Colombian households between 1997 and 2003. An alternative indicator is proposed and contrasted, at the national, regional and population-centre levels, with the conventional Quality of Life Index (Índice de Calidad de Vida, ICV). The alternative indicator acknowledges the importance of health-related aspects and of the time spent commuting. These additional variables reduce the weight of the aspects related to physical capital and widen even further the gap between rural and urban living conditions. The importance of strengthening health policies in rural areas, and of policies that reduce commuting time in both areas, becomes evident.

Relevance: 60.00%

Abstract:

This paper seeks to present the main properties of a standard-of-living indicator within Amartya Sen's theoretical framework. We establish a bridge between concepts such as economic well-being, well-being, agency achievement and the standard of living. The Optimal Scaling methodology was used to test the properties of monotonicity, non-independence of irrelevant alternatives, concavity, informativeness and substitutability.

Relevance: 60.00%

Abstract:

The scoring protocol adopted by the MSCEIT V2 has been criticised since its development. The present study raises questions regarding the value of consensus scoring by analysing responses within the categorical subscales of Changes and Blends using the Optimal Scaling technique within Categorical Principal Components Analysis (CATPCA) in the Statistical Package for the Social Sciences (SPSS) (n = 206). On a number of occasions, there was no clear agreement as to the "correct" response to items within these categorical subscales. Such an issue seems integral to the application of the MSCEIT V2 and one which deserves more attention. On a more positive note, improvements were made to the reliabilities of the Changes and Blends subscales using Optimal Scaling, although less so for Changes. Nevertheless, this raises the possibility of improving the reliabilities of other subscales in the MSCEIT V2 and, in turn, improving the power of subsequent statistical tests.
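For illustration only, a minimal single-dimension sketch of the optimal scaling idea underlying CATPCA (alternating least squares between object scores and category quantifications) is shown below. It is not SPSS's CATPCA implementation, and the synthetic data are placeholders, not the study's MSCEIT responses.

```python
"""Minimal single-dimension optimal scaling sketch: alternate between object
scores and nominal category quantifications. Not SPSS's CATPCA; synthetic
data only."""
import numpy as np

rng = np.random.default_rng(0)

def optimal_scaling_1d(X, n_iter=100):
    """X: (n_objects, n_items) integer category codes.
    Returns standardized object scores z and the quantified data matrix Q."""
    n, m = X.shape
    # Start from the raw integer codes, standardized per item.
    Q = (X - X.mean(axis=0)) / X.std(axis=0)
    for _ in range(n_iter):
        # Object scores: average of the quantified items, standardized.
        z = Q.mean(axis=1)
        z = (z - z.mean()) / z.std()
        # Nominal scaling level: each category's quantification is the mean
        # object score within that category; then restandardize the column.
        for j in range(m):
            for c in np.unique(X[:, j]):
                mask = X[:, j] == c
                Q[mask, j] = z[mask].mean()
            Q[:, j] = (Q[:, j] - Q[:, j].mean()) / Q[:, j].std()
    return z, Q

# Synthetic stand-in data: 206 respondents, 5 categorical items, 4 categories.
X = rng.integers(0, 4, size=(206, 5))
z, Q = optimal_scaling_1d(X)
# Homogeneity-style summary: mean squared correlation of items with the scores.
r2 = [np.corrcoef(Q[:, j], z)[0, 1] ** 2 for j in range(X.shape[1])]
print("mean r^2(quantified item, object scores):", np.mean(r2))
```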

Relevance: 60.00%

Abstract:

The aim of this study was to gather information about ecstasy users in Brazil, particularly on issues related to the risks associated with use of the drug, so as to provide a basis for prevention projects. A total of 1,140 Brazilian ecstasy users answered an online questionnaire from August 2004 to February 2005. Participants were predominantly young, single, heterosexual, well-educated males from upper economic classes. A categorical regression with optimal scaling (CATREG) was performed to identify the risks associated with ecstasy use. "Pills taken in life" had a significant correlation with every investigated risk, particularly ecstasy dependence, unsafe sex, and polydrug use. "Gender," "sexual orientation," and "socioeconomic class" were not predictive of risk behavior. The Internet proved to be a useful tool for data collection. Given the recent increase in ecstasy availability in Brazil, a first prevention campaign directed toward the drug is urgently needed. At least as a preliminary Brazilian intervention, the campaign should be conducted at nightlife venues mainly frequented by young people from upper socioeconomic classes. The results do not call for information material with specific targets, such as gender or sexual orientation. The study's limitations are noted.

Relevance: 60.00%

Abstract:

Drawing on the data available from the Brazilian Institute of Geography and Statistics (IBGE) through its survey of the urban informal economy (ECINF), carried out in 2003 (the most recent available), this study seeks to better understand and discuss aspects related to informality, growth and the use of credit. In this context, it explores aspects relevant to the growth and survival of micro and small enterprises. To this end, it first reviews the concepts involved (informality, growth and use of credit) and their characteristics, and then, using secondary data and statistical software, examines their relationships by means of ANOVA, regressions and optimal scaling, contributing to the understanding of this scenario. The data collected, when compared with the theory, point to the relations between employers and own-account workers, indicating which of the factors adopted in the study contribute most and which contribute little in this context. The conclusions point to a tendency for indebtedness through the use of credit to foster growth up to a certain point, beyond which it tends to hinder the activity as well as to hinder formalization.

Relevance: 60.00%

Abstract:

This study analyses the characteristics of the farms and breeders of Serpentina goats and the reasons for the continuity of this activity, based on a survey carried out in 2014. Optimal Scaling, as implemented in SPSS, and the selection of observed variables (the subselect package of the R statistical software) were used to characterize the farms.

Relevance: 30.00%

Abstract:

The Cross-Entropy (CE) method is an efficient technique for the estimation of rare-event probabilities and for combinatorial optimization. This work presents a novel application of CE to the optimization of a soft-computing controller. A fuzzy controller was designed to command an unmanned aerial system (UAS) in a collision-avoidance task. The only sensor used to accomplish this task was a forward-facing camera. CE is used to reach a near-optimal controller by modifying the scaling factors of the controller inputs. The optimization was carried out using the ROS-Gazebo simulation system. In order to evaluate the optimization, a large number of tests were carried out with a real quadcopter.
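A generic Cross-Entropy optimization loop of the kind described is sketched below; the cost function, the two scaling factors and all numerical values are stand-ins for the ROS-Gazebo collision-avoidance evaluation, not the paper's setup.

```python
"""Generic Cross-Entropy (CE) optimization loop: sample candidate scaling
factors from a Gaussian, keep an elite fraction, and refit the Gaussian.
The cost function is a synthetic stand-in for running the fuzzy controller
in simulation and scoring it."""
import numpy as np

rng = np.random.default_rng(0)

def cost(scaling_factors):
    # Placeholder evaluation; the "best" factors (1.5, 0.3) are made up.
    target = np.array([1.5, 0.3])
    return float(np.sum((scaling_factors - target) ** 2))

def cross_entropy(n_iter=30, pop=50, elite_frac=0.2, dim=2):
    mu = np.zeros(dim)            # mean of the sampling distribution
    sigma = np.ones(dim) * 2.0    # standard deviation per parameter
    n_elite = int(pop * elite_frac)
    for _ in range(n_iter):
        samples = rng.normal(mu, sigma, size=(pop, dim))
        scores = np.array([cost(s) for s in samples])
        elite = samples[np.argsort(scores)[:n_elite]]   # lowest cost = best
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-6
    return mu

print("near-optimal scaling factors:", cross_entropy())
```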

Relevance: 30.00%

Abstract:

The performance of surface aeration systems, among other key design variables, depends upon the geometric parameters of the aeration tank. Efficient performance, and scaling the experimental results of an aeration system up or down, require optimal geometric conditions. Optimal conditions refer to the conditions of maximum oxygen transfer rate, which assist in scaling the system up or down for commercial utilization. The present work investigates the effect of an aeration tank's shape (unbaffled circular, baffled circular and unbaffled square) on oxygen transfer. The present results demonstrate that the tank shape has no effect on the optimal geometric conditions for rotor position and rotor dimensions. The experiments show that circular tanks (baffled or unbaffled) do not have optimal geometric conditions for oxygen transfer, whereas the square cross-section tank exhibits a unique geometric shape that optimizes oxygen transfer.

Relevance: 30.00%

Abstract:

We consider a dense, ad hoc wireless network confined to a small region, such that direct communication is possible between any pair of nodes. The physical communication model is that a receiver decodes the signal from a single transmitter while treating all other signals as interference. Data packets are sent between source-destination pairs by multihop relaying. We assume that nodes self-organise into a multihop network such that all hops are of length d meters, where d is a design parameter. There is a contention-based multiaccess scheme, and it is assumed that every node always has data to send, either originated by it or a transit packet (saturation assumption). In this scenario, we seek to maximize a measure of the transport capacity of the network (measured in bit-meters per second) over power controls (in a fading environment) and over the hop distance d, subject to an average power constraint. We first argue that, for a dense collection of nodes confined to a small region, single-cell operation is efficient for single-user decoding transceivers. Then, operating the dense ad hoc network (described above) as a single cell, we study the optimal hop length and power control that maximize the transport capacity for a given network power constraint. More specifically, for a fading channel and for a fixed transmission time strategy (akin to the IEEE 802.11 TXOP), we find that there exists an intrinsic aggregate bit rate ($\Theta_{\mathrm{opt}}$ bits per second, depending on the contention mechanism and the channel fading characteristics) carried by the network when operating at the optimal hop length and power control. The optimal transport capacity is of the form $d_{\mathrm{opt}}(\bar{P}_t) \times \Theta_{\mathrm{opt}}$, with $d_{\mathrm{opt}}$ scaling as $\bar{P}_t^{1/\eta}$, where $\bar{P}_t$ is the available time-average transmit power and $\eta$ is the path-loss exponent. Under certain conditions on the fading distribution, we then provide a simple characterisation of the optimal operating point.
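Restating the reported result in one display, in the abstract's own quantities (the proportionality constant is not specified there):

```latex
% One-line restatement of the reported scaling result.
\[
  C_{\mathrm{transport}}
  \;=\; d_{\mathrm{opt}}(\bar{P}_t)\,\Theta_{\mathrm{opt}}
  \quad \text{bit-meters per second},
  \qquad
  d_{\mathrm{opt}}(\bar{P}_t) \;\propto\; \bar{P}_t^{\,1/\eta} .
\]
% Since \Theta_{\mathrm{opt}} depends only on the contention mechanism and the
% fading characteristics, doubling the time-average power budget multiplies the
% optimal hop length, and hence the transport capacity, by 2^{1/\eta}.
```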

Relevance: 30.00%

Abstract:

Multiple Clock Domain (MCD) processors provide an attractive solution to the increasingly challenging problems of clock distribution and power dissipation. They allow the chip to be partitioned into different clock domains, with each domain's frequency (and voltage) configured independently. This flexibility adds new dimensions to the Dynamic Voltage and Frequency Scaling (DVFS) problem, while providing better scope for saving energy and meeting performance demands. In this paper, we propose a compiler-directed approach for MCD-DVFS. We build a formal Petri-net-based program performance model, parameterized by the settings of microarchitectural components and resource configurations, and integrate it with our compiler passes for frequency selection. Our model estimates the performance impact of a frequency setting, unlike the existing best techniques, which rely on weaker indicators of domain performance such as queue occupancies (used by online methods) and slack manifestation for a particular frequency setting (software-based methods). We evaluate our method with subsets of the SPECFP2000, MediaBench and MiBench benchmarks. Our mean energy savings are 60.39% (versus 33.91% for the best software technique) in a memory-constrained system for cache-miss-dominated benchmarks, and we meet the performance demands. Our ED^2 (energy-delay-squared product) improves by 22.11% (versus 18.34%) for the other benchmarks. For a CPU with restricted frequency settings, our energy consumption is within 4.69% of the optimal.
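A toy illustration of the decision such a frequency-selection pass has to make is given below; it does not implement the paper's Petri-net performance model, and the frequency/time/power table and the 10% slowdown bound are made-up example numbers.

```python
"""Toy MCD-DVFS-style frequency selection for one program region: given
predicted execution time and power at each available frequency, pick the
setting that minimizes energy (or ED^2) subject to a slowdown bound.
All numbers are illustrative; this is not the paper's model."""

# (frequency_GHz, predicted_time_s, predicted_power_W) for one program region.
SETTINGS = [
    (2.0, 1.00, 20.0),
    (1.5, 1.06, 12.0),
    (1.0, 1.22, 7.0),
    (0.5, 1.80, 3.5),
]

def pick_frequency(settings, max_slowdown=1.10, metric="energy"):
    t_base = min(t for _, t, _ in settings)          # fastest setting's time
    feasible = [(f, t, p) for f, t, p in settings if t <= max_slowdown * t_base]
    def score(entry):
        f, t, p = entry
        energy = p * t
        return energy if metric == "energy" else energy * t * t   # ED^2
    return min(feasible, key=score)

f, t, p = pick_frequency(SETTINGS)
print(f"chosen frequency: {f} GHz  (time {t} s, energy {p * t:.1f} J)")
```

With these example numbers, the 1.5 GHz setting meets the 10% slowdown bound at a much lower energy than the fastest setting, which is the kind of opportunity the compiler-directed selection exploits.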