16 results for convex subgraphs

in Aston University Research Archive


Relevance:

20.00%

Publisher:

Abstract:

This paper draws attention to the fact that traditional Data Envelopment Analysis (DEA) models do not provide the closest possible targets (or peers) for inefficient units, and presents a procedure to obtain such targets. It focuses on non-oriented efficiency measures (which assume that production units are able to control, and thus change, inputs and outputs simultaneously), measured both in relation to a Free Disposal Hull (FDH) technology and in relation to a convex technology. The approaches developed for finding close targets are applied to a sample of Portuguese bank branches.
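The FDH side of the closest-target idea can be illustrated with a small sketch (hypothetical data and a simple L1 distance; this is not the paper's actual procedure): under FDH the attainable set is spanned by the observed units themselves, so a closest target can be found by scanning the units that dominate the assessed one and keeping the nearest.

```python
# Hypothetical sketch: closest FDH target for an inefficient unit.
# A unit dominates the assessed one if it uses no more of any input
# and produces no less of any output; among dominators we pick the
# one at the smallest L1 distance.

def closest_fdh_target(units, assessed):
    """units: list of (inputs, outputs) tuples; assessed: (inputs, outputs)."""
    xi, yi = assessed
    best, best_dist = None, float("inf")
    for xj, yj in units:
        dominates = all(a <= b for a, b in zip(xj, xi)) and \
                    all(a >= b for a, b in zip(yj, yi))
        if not dominates:
            continue
        dist = sum(abs(a - b) for a, b in zip(xj, xi)) + \
               sum(abs(a - b) for a, b in zip(yj, yi))
        if dist < best_dist:
            best, best_dist = (xj, yj), dist
    return best, best_dist

# three observed units: (inputs, outputs)
units = [((2, 3), (10,)), ((4, 2), (9,)), ((5, 5), (8,))]
target, dist = closest_fdh_target(units, ((6, 5), (8,)))
```

All three units dominate the assessed unit here, but the third is closest, so it becomes the target.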

Relevance:

10.00%

Publisher:

Abstract:

The title of Juana Castro's poetry book published in 1978, Cóncava mujer (Concave Woman), expresses the hollow nature of the social female subject. From Juana Castro's point of view, this female social concavity is only allowed to transform itself into its opposite, the convex woman, which clearly represents the reproductive role of the female body. These two extreme roles assigned to women, hollowness or maternity, are the poetic paradigms of the two poetry books by Juana Castro analysed in this article. As if we were presented with the two sides of a coin, Cóncava mujer and Del dolor y las alas (On Anguish and Wings, 1982) reflect the author's conscious realisation of the above-mentioned female duality, a subject defined and perceived by male society. Each poetry book, however, responds to a different personal moment, and each results in a different way of conceiving poetic language. On the one hand, the poetic subject of Cóncava mujer emerges with all its force as a feminist voice whose goal is to attack every aspect of the patriarchal society that causes female concavity. On the other hand, in Del dolor y las alas the poetic voice unfolds her motherhood as both loss and creation: the death of Juana Castro's son makes the poetic subject incomplete, and therefore concave, whereas the poetic discourse appears as the perfect way to occupy the empty space left by the son's death.

Relevance:

10.00%

Publisher:

Abstract:

This study tests the implications of tournament theory using data on 100 U.K. stock market companies, covering over 500 individual executives, in the late 1990s. Our results provide some evidence consistent with the operation of tournament mechanisms within the U.K. business context. First, we find a convex relationship between executive pay and organizational level; second, the gap between CEO pay and that of the other board executives (i.e., the tournament prize) is positively related to the number of participants in the tournament. However, we also show that the variation in executive team pay has little role in determining company performance.

Relevance:

10.00%

Publisher:

Abstract:

Linear Programming (LP) is a powerful decision-making tool extensively used in various economic and engineering activities. In the early stages the success of LP was mainly due to the efficiency of the simplex method. After the appearance of Karmarkar's paper, the focus of most research shifted to the field of interior point methods. The present work is concerned with investigating and efficiently implementing the latest techniques in this field, taking sparsity into account. The performance of these implementations on different classes of LP problems is reported here. The preconditioned conjugate gradient method is one of the most powerful tools for the solution of the least squares problem present in every iteration of all interior point methods. The effect of using different preconditioners on a range of problems with various condition numbers is presented. Decomposition algorithms have been one of the main fields of research in linear programming over the last few years. After reviewing the latest decomposition techniques, three promising methods were chosen and implemented. Sparsity is again a consideration, and suggestions have been included to allow improvements when solving problems with these methods. Finally, experimental results on randomly generated data are reported and compared with an interior point method. The efficient implementation of the decomposition methods considered in this study requires the solution of quadratic subproblems. A review of recent work on algorithms for convex quadratic programming was performed. The most promising algorithms are discussed and implemented, taking sparsity into account. The relative performance of these algorithms on randomly generated separable and non-separable problems is also reported.
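As a minimal sketch of one building block named above (plain Python, not the thesis implementation), here is a preconditioned conjugate gradient solver with a simple Jacobi (diagonal) preconditioner, the kind of routine applied to the symmetric positive definite normal equations arising inside each interior point iteration:

```python
# Preconditioned conjugate gradient for A x = b, A symmetric positive
# definite, with a Jacobi preconditioner M = diag(A).  Dense toy
# version; a sparse implementation would store only nonzeros.

def pcg(A, b, tol=1e-10, max_iter=200):
    n = len(b)
    M_inv = [1.0 / A[i][i] for i in range(n)]        # Jacobi preconditioner
    x = [0.0] * n
    r = b[:]                                          # residual b - A x (x = 0)
    z = [M_inv[i] * r[i] for i in range(n)]           # preconditioned residual
    p = z[:]
    rz = sum(r[i] * z[i] for i in range(n))
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rz / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]
        r = [r[i] - alpha * Ap[i] for i in range(n)]
        if max(abs(v) for v in r) < tol:
            break
        z = [M_inv[i] * r[i] for i in range(n)]
        rz_new = sum(r[i] * z[i] for i in range(n))
        p = [z[i] + (rz_new / rz) * p[i] for i in range(n)]
        rz = rz_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = pcg(A, b)   # exact solution is [1/11, 7/11]
```

For an n-by-n system, CG converges in at most n iterations in exact arithmetic; preconditioning matters because the normal equations in interior point methods become severely ill-conditioned near optimality.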

Relevance:

10.00%

Publisher:

Abstract:

Glass reinforced plastic (GRP) is now an established material for the fabrication of sonar windows. Its good mechanical strength, light weight, resistance to corrosion and acoustic transparency are all properties which fit it for this application. This thesis describes a study, undertaken at the Royal Naval Engineering College, Plymouth, into the mechanical behaviour of a circular cylindrical sonar panel. This particular type of panel would be used to cover a flank array sonar in a ship or submarine. The case considered is that of a panel with all of its edges mechanically clamped and subject to pressure loading on its convex surface. A comprehensive programme of testing to determine the orthotropic elastic properties of the laminated composite panel material is described, together with a series of pressure tests on 1:5 scale sonar panels. These pressure tests were carried out in a purpose-designed test rig, using air pressure to provide simulated hydrostatic and hydrodynamic loading. Details of all instrumentation used in the experimental work are given in the thesis. The experimental results from the panel testing are compared with predictions of panel behaviour obtained from both the Galerkin solution of Flügge's cylindrical shell equations (orthotropic case) and finite element modelling of the panels using PAFEC. A variety of appropriate panel boundary conditions are considered in each case. A parametric study, intended to be of use as a preliminary design tool and based on the above Galerkin solution, is also presented. This parametric study considers cases of boundary conditions, material properties and panel geometry outside those investigated in the experimental work. Final conclusions are drawn and recommendations made regarding possible improvements to the procedures for design, manufacture and fixing of sonar panels in the Royal Navy.

Relevance:

10.00%

Publisher:

Abstract:

The present thesis evaluates various aspects of videokeratoscopes, which are now becoming increasingly popular in the investigation of corneal topography. The accuracy and repeatability of these instruments have been assessed mainly using spherical surfaces; however, few studies have assessed the performance of videokeratoscopes in measuring convex aspheric surfaces. Using two videokeratoscopes, the accuracy and repeatability of measurements on twelve aspheric surfaces is determined. Overall, the accuracy and repeatability of both instruments were acceptable; however, progressively flatter surfaces introduced greater errors in measurement. The possible reasons for these errors are discussed. The corneal surface is a biological structure lubricated by the precorneal tear film. The effects of variations in the tear film on the repeatability of videokeratoscopes have not previously been determined in terms of peripheral corneal measurements. The repeatability of two commercially available videokeratoscopes is assessed and found to be dependent on the point of measurement on the corneal surface. Typically, the superior and nasal meridians exhibit the poorest repeatability. It is suggested that interference from the ocular adnexa is responsible for the reduced repeatability; this localised reduction in repeatability will occur for all videokeratoscopes. Further, comparison of the keratometers and videokeratoscopes used shows that measurements between these instruments are not interchangeable. The final stage of this thesis evaluates the performance of new algorithms. The characteristics of a new videokeratoscope are described, and this videokeratoscope is used to test the accuracy of the new algorithms on twelve aspheric surfaces. The new algorithms are more accurate in determining the shape of aspheric surfaces than those proposed at present.

Relevance:

10.00%

Publisher:

Abstract:

The electroretinogram evoked by reversal pattern stimulation (rPERG) is known to contain both pattern-contrast and luminance-related components. The retinal mechanisms of the transient rPERGs subserving these functional characteristics are the main concern of the present studies. Considerable attention has been paid to the luminance-related characteristics of the response. Using low-frequency attenuation analysis, the transient PERGs were found to consist of two subsequent processes. The processes overlapped, and individual differences in the timing of each process were the major cause of variations in the negative potential waveform of the transient rPERGs; particular attention has been paid to those showing a 'notch' type of variation. Across contrast levels, the amplitudes of the positive and negative potentials increased linearly with contrast, and the negative potential showed a higher sensitivity to contrast changes and a higher contrast gain. At lower contrast levels, the decreased amplitudes made the difference in the time course of the positive and negative processes evident, explaining the appearance of the notch in some cases. Visual adaptation conditions for recording the transient rPERG are discussed. A further aim was to study the large variation of the transient rPERGs (especially the positive potential, P50) in elderly subjects whose distance and near visual acuity were normal. It was found that reduction of retinal illumination contributed mostly to the P50 amplitude loss, and contrast loss mostly to the negative potential (N95) amplitude loss. Senile miosis was thought to have little effect on the reduction of retinal illumination; changes in the optics of the eye were probably its major cause, which explains the larger individual variation of the P50 amplitude in elderly PERGs. Convex defocus affected the transient rPERGs more strongly than concave lenses, especially the N95 amplitude in the elderly. The loss of accommodation and the type and degree of the subjects' ametropia should be taken into consideration when elderly rPERGs are analysed.

Relevance:

10.00%

Publisher:

Abstract:

This paper investigates a cross-layer design approach for minimizing energy consumption and maximizing the network lifetime (NL) of a multiple-source and single-sink (MSSS) WSN with energy constraints. With the adoption of time division multiple access (TDMA) in the medium access control (MAC) layer, the optimization problem for the MSSS WSN can be formulated as a mixed-integer convex optimization problem, which becomes convex on relaxing the integer constraint on time slots. The impacts of data rate, link access and routing are jointly taken into account in the problem formulation. Both linear and planar network topologies are considered for NL maximization (NLM). For the linear MSSS and planar single-source and single-sink (SSSS) topologies, we use the Karush-Kuhn-Tucker (KKT) optimality conditions to derive analytical expressions for the optimal NL when all nodes are exhausted simultaneously. The problem for the planar MSSS topology is more complicated, and a decomposition and combination (D&C) approach is proposed to compute suboptimal solutions. An analytical expression for the suboptimal NL is derived for a small-scale planar network, and an iterative algorithm is proposed for the D&C approach to deal with larger-scale planar networks. Numerical results show that the upper bounds on the network lifetime obtained by our proposed optimization models are tight. Important insights into the NL and the benefits of cross-layer design for WSN NLM are obtained.
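The lifetime-limiting structure can be illustrated with a toy linear-topology example (hypothetical numbers and a deliberately simplified energy model, not the paper's formulation): each node forwards its own traffic plus all upstream traffic toward the sink, so per-node power grows toward the sink and the network lifetime is set by the first node to exhaust its energy, NL = min_i E_i / P_i. The all-nodes-exhausted-simultaneously solutions mentioned above are the cases where rates and routing equalise these ratios.

```python
# Toy linear-topology lifetime: nodes ordered from the far end toward
# the sink; node i relays the aggregate traffic of nodes 0..i, so its
# power drain is proportional to that aggregate rate.

def network_lifetime(energies, rates, cost_per_bit):
    lifetimes = []
    aggregate = 0.0
    for E, r, c in zip(energies, rates, cost_per_bit):
        aggregate += r                 # own traffic plus upstream traffic
        power = c * aggregate          # energy drain per unit time
        lifetimes.append(E / power)
    return min(lifetimes)              # first node to die ends the network

# three nodes, unit source rates and per-bit costs; energies chosen so
# that every node exhausts at the same time (the optimal situation)
nl = network_lifetime(energies=[30.0, 60.0, 90.0],
                      rates=[1.0, 1.0, 1.0],
                      cost_per_bit=[1.0, 1.0, 1.0])
```

With these numbers each node's energy-to-power ratio is 30, so all nodes die together and no energy is stranded, which is why the simultaneous-exhaustion condition characterises the optimum.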

Relevance:

10.00%

Publisher:

Abstract:

Removing noise from piecewise constant (PWC) signals is a challenging signal processing problem arising in many practical contexts. For example, in exploration geosciences, noisy drill hole records need to be separated into stratigraphic zones, and in biophysics, jumps between molecular dwell states have to be extracted from noisy fluorescence microscopy signals. Many PWC denoising methods exist, including total variation regularization, mean shift clustering, stepwise jump placement, running medians, convex clustering shrinkage and bilateral filtering; conventional linear signal processing methods are fundamentally unsuited. This paper (part I, the first of two) shows that most of these methods are associated with a special case of a generalized functional, minimized to achieve PWC denoising. The minimizer can be obtained by diverse solver algorithms, including stepwise jump placement, convex programming, finite differences, iterated running medians, least angle regression, regularization path following and coordinate descent. In the second paper, part II, we introduce novel PWC denoising methods, and comparisons between these methods performed on synthetic and real signals, showing that the new understanding of the problem gained in part I leads to new methods that have a useful role to play.
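One of the solvers named above, iterated running medians, is simple enough to sketch directly (a minimal illustration, not the paper's generalized functional): a sliding-window median is applied repeatedly until the signal stops changing, which removes impulsive noise while preserving the jumps of a PWC signal.

```python
# Iterated running median for PWC denoising: iterate a windowed median
# until a fixed point (a "root" signal) is reached.

from statistics import median

def running_median(x, half_width=1):
    n = len(x)
    return [median(x[max(0, i - half_width):min(n, i + half_width + 1)])
            for i in range(n)]

def iterated_running_median(x, half_width=1, max_iter=50):
    for _ in range(max_iter):
        y = running_median(x, half_width)
        if y == x:          # fixed point reached: signal no longer changes
            return y
        x = y
    return x

noisy = [0, 0, 1, 0, 0, 5, 5, 4, 5, 5]     # step from 0 to 5 with spikes
clean = iterated_running_median(noisy)
```

A width-3 median removes the isolated spikes in one pass while leaving the 0-to-5 jump exactly in place, which is what makes median-type filters attractive for PWC signals where linear smoothing would blur the edge.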

Relevance:

10.00%

Publisher:

Abstract:

To investigate investment behaviour, the present study applies panel data techniques, in particular the Arellano-Bond (1991) GMM estimator, to data on Estonian manufacturing firms from the period 1995-1999. We employ a model of optimal capital accumulation in the presence of convex adjustment costs. The main research findings are that domestic companies seem to be more financially constrained than those in which foreign investors are present, and that smaller firms are more constrained than their larger counterparts.

Relevance:

10.00%

Publisher:

Abstract:

We propose and demonstrate a technique for monitoring the recovery deformation of shape-memory polymers (SMPs) using a surface-attached fiber Bragg grating (FBG) as a vector-bending sensor. The proposed sensing scheme can monitor pure bending deformation of the SMP sample. When the SMP sample undergoes concave or convex bending, the resonance wavelength of the FBG red-shifts or blue-shifts according to the tensile or compressive stress gradient along the FBG. The results show a bending sensitivity of around 4.07 nm/cm⁻¹. The experimental results clearly indicate that the deformation of such an SMP sample can be effectively monitored by the attached FBG, not just the bending curvature but also the bending direction.
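The vector reading reduces to a signed linear relation, which can be sketched numerically (the 4.07 value is the sensitivity quoted above; the sign convention and sample curvature are assumptions for illustration): the wavelength shift is proportional to curvature, with the sign set by whether the bend puts the grating under tension or compression.

```python
# Signed wavelength shift of a surface-attached FBG under bending:
# shift = direction * sensitivity * curvature.

SENSITIVITY_NM_PER_INV_CM = 4.07    # sensitivity quoted in the abstract

def bragg_shift_nm(curvature_inv_cm, direction):
    """direction: +1 when the bend stretches the grating (red-shift),
                  -1 when it compresses the grating (blue-shift)."""
    return direction * SENSITIVITY_NM_PER_INV_CM * curvature_inv_cm

red = bragg_shift_nm(0.10, +1)     # tensile bend of 0.10 cm^-1
blue = bragg_shift_nm(0.10, -1)    # compressive bend, same curvature
```

Because the two bending directions give equal and opposite shifts, a single grating resolves both the magnitude and the sign of the curvature.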

Relevance:

10.00%

Publisher:

Abstract:

A generalized Drucker–Prager (GD–P) viscoplastic yield surface model was developed and validated for asphalt concrete. The GD–P model was formulated on the basis of fabric tensor modified stresses to account for the material's inherent anisotropy. A smooth and convex octahedral yield surface function was developed in the GD–P model to characterize the full range of internal friction angles from 0° to 90°, whereas the existing Extended Drucker–Prager (ED–P) model was shown to be applicable only to materials with internal friction angles below 22°. Laboratory tests were performed to evaluate the anisotropic effect and to validate the GD–P model. Results indicated that (1) the yield stresses of an isotropic yield surface model are greater in compression and less in extension than those of an anisotropic model, which can result in under-prediction of the viscoplastic deformation; and (2) the yield stresses predicted by the GD–P model matched the experimental results of the octahedral shear strength tests at different normal and confining stresses well. By contrast, the ED–P model over-predicted the octahedral yield stresses, which can lead to under-prediction of the permanent deformation. In summary, the rutting depth of an asphalt pavement would be underestimated without considering the anisotropy and convexity of the yield surface for asphalt concrete. The proposed GD–P model was demonstrated to be capable of overcoming these limitations of existing yield surface models for asphalt concrete.
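The abstract does not give the GD–P yield function itself, so the following is a sketch of the classical Drucker–Prager surface that both the ED–P and GD–P models generalize: f(σ) = √J₂ + α·I₁ − k, with I₁ the first stress invariant and J₂ the second deviatoric invariant; f < 0 is elastic, f ≥ 0 means yielding, and α > 0 encodes the pressure sensitivity (internal friction) discussed above.

```python
# Classical Drucker-Prager yield check from a 3x3 stress tensor
# (compression negative).  alpha and k are illustrative material
# parameters, not values from the paper.

def stress_invariants(s):
    i1 = s[0][0] + s[1][1] + s[2][2]
    mean = i1 / 3.0
    dev = [[s[i][j] - (mean if i == j else 0.0) for j in range(3)]
           for i in range(3)]
    j2 = 0.5 * sum(dev[i][j] * dev[i][j] for i in range(3) for j in range(3))
    return i1, j2

def drucker_prager(s, alpha, k):
    i1, j2 = stress_invariants(s)
    return j2 ** 0.5 + alpha * i1 - k

# hydrostatic compression stays elastic for a frictional material ...
hydro = [[-1.0, 0, 0], [0, -1.0, 0], [0, 0, -1.0]]
f_hydro = drucker_prager(hydro, alpha=0.3, k=1.0)
# ... while sufficient shear triggers yield
shear = [[0, 2.0, 0], [2.0, 0, 0], [0, 0, 0]]
f_shear = drucker_prager(shear, alpha=0.3, k=1.0)
```

In the GD–P model the stresses entering such a function are first modified by the fabric tensor, which is how the inherent anisotropy changes the compression and extension yield stresses relative to the isotropic case.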

Relevance:

10.00%

Publisher:

Abstract:

Unwanted spike noise in a digital signal is a common problem in digital filtering. Sometimes, however, the spikes are wanted and other, superimposed, signals are unwanted. Linear, time-invariant (LTI) filtering is then ineffective because the spikes are wideband, overlapping with the independent noise in the frequency domain, so no LTI filter can separate them; nonlinear filtering is necessary. There are also applications in which the noise includes drift or smooth signals, for which LTI filters are ideal. We describe a nonlinear filter, formulated as the solution to an elastic net regularization problem, which attenuates band-limited signals and independent noise while enhancing superimposed spikes. Making use of known analytic solutions, a novel approximate path-following algorithm is given that provides a good filtered output with reduced computational effort compared to standard convex optimization methods. Accurate performance is shown on real, noisy electrophysiological recordings of neural spikes.
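The elastic net idea behind the filter can be sketched in a toy pointwise form (an illustration of the penalty, not the paper's full algorithm): each sample y is replaced by the minimiser of 0.5·(x − y)² + λ₁·|x| + 0.5·λ₂·x², whose closed-form solution is soft-thresholding followed by shrinkage. Small samples are pulled to zero while large spikes survive.

```python
# Pointwise elastic net proximal operator: the L1 term soft-thresholds
# (kills small values), the L2 term shrinks what remains.

def elastic_net_prox(y, lam1, lam2):
    mag = max(abs(y) - lam1, 0.0)                          # soft threshold
    return (mag / (1.0 + lam2)) * (1 if y >= 0 else -1)    # shrink

signal = [0.1, -0.2, 0.05, 8.0, 0.15, -7.5, 0.1]   # noise with two spikes
filtered = [elastic_net_prox(v, lam1=0.5, lam2=0.1) for v in signal]
```

The small samples fall below the threshold and come out as zero, while the two spikes pass through only slightly attenuated; the band-limited (drift) part of the problem is what requires the full formulation of the paper rather than this pointwise shortcut.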

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we propose a new edge-based matching kernel for graphs using discrete-time quantum walks. To this end, we commence by transforming a graph into a directed line graph. The reasons for using the line graph structure are twofold. First, for a graph, its directed line graph is a dual representation in which each vertex represents a corresponding edge of the original graph. Second, we show that the discrete-time quantum walk can be seen as a walk on the line graph whose state space is the vertex set of the line graph, i.e., the edge set of the original graph. As a result, the directed line graph provides an elegant way of developing a new edge-based matching kernel based on discrete-time quantum walks. For a pair of graphs, we compute the h-layer depth-based representation for each vertex of their directed line graphs by computing entropic signatures (obtained from discrete-time quantum walks on the line graphs) on the family of h-layer expansion subgraphs rooted at that vertex, i.e., we compute the depth-based representations for the edges of the original graphs through their directed line graphs. Based on these new representations, we define an edge-based matching method for the pair of graphs by aligning the h-layer depth-based representations computed through the directed line graphs. The new edge-based matching kernel is then computed by counting the number of matched vertices identified by the matching method on the directed line graphs. Experiments on standard graph datasets demonstrate the effectiveness of the new kernel.
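The first step described above, building the directed line graph, can be sketched directly (a minimal version; the backtracking-exclusion convention is the one commonly used for discrete-time quantum walks and is an assumption here): each undirected edge {u, v} yields two arcs (u, v) and (v, u), and arc (u, v) connects to arc (v, w) whenever w ≠ u.

```python
# Directed line graph of an undirected graph: vertices are the arcs
# (directed versions of edges); (u, v) -> (v, w) when w != u, which is
# exactly the non-backtracking state space of a discrete-time walk.

def directed_line_graph(edges):
    arcs = set()
    for u, v in edges:
        arcs.add((u, v))
        arcs.add((v, u))
    adjacency = {a: [] for a in arcs}
    for (u, v) in arcs:
        for (x, w) in arcs:
            if x == v and w != u:
                adjacency[(u, v)].append((v, w))
    return adjacency

# triangle graph: 3 undirected edges -> 6 arcs, each with one successor
lg = directed_line_graph([(0, 1), (1, 2), (2, 0)])
```

On the triangle every arc has exactly one non-backtracking continuation, so the walk state space is a pair of directed 3-cycles, one for each orientation.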

Relevance:

10.00%

Publisher:

Abstract:

We propose and numerically demonstrate a simple novel method of producing optical Nyquist pulses, based on pulse shaping in a passively mode-locked fiber laser with an in-cavity flat-top spectral filter. The proposed scheme takes advantage of the nonlinear in-cavity dynamics of the laser and offers the possibility of generating high-quality sinc-shaped pulses with widely tunable bandwidth directly from the laser oscillator. We also show that the use of a filter with a corrective convex profile relaxes the need for large nonlinear phase accumulation in the cavity by offsetting the concavity of the nonlinearly broadened pulse spectrum.
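The link between the flat-top spectrum and the sinc pulse shape can be checked with a standalone numerical sketch (not the laser model): a rectangular, band-limited spectrum corresponds in time to the sinc-shaped Nyquist pulse, which is zero at every nonzero multiple of the symbol period T, the property that lets such pulses be packed back to back without intersymbol interference.

```python
# Ideal Nyquist (sinc) pulse for symbol period T: unity at t = 0 and
# zero at every other integer multiple of T.

import math

def nyquist_pulse(t, T=1.0):
    if t == 0.0:
        return 1.0
    x = math.pi * t / T
    return math.sin(x) / x

peak = nyquist_pulse(0.0)
zeros = [nyquist_pulse(k * 1.0) for k in (1, 2, 3)]   # symbol-spaced samples
```

Tuning the bandwidth of the flat-top filter rescales T, which is why the scheme above gives sinc pulses with widely tunable bandwidth.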