28 results for Non-negative rational numbers

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

This paper is drawn from the use of data envelopment analysis (DEA) in helping a Portuguese bank to manage the performance of its branches. The bank wanted to set targets for the branches on variables such as growth in number of clients, growth in funds deposited and so on. Such variables can take positive and negative values, but, with some exceptions, traditional DEA models have hitherto been restricted to non-negative data. We report on the development of a model to handle unrestricted data in a DEA framework and illustrate its use on data from the bank concerned.
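A traditional DEA efficiency score is obtained by solving one linear program per decision-making unit. As a point of reference for the model the paper generalizes, here is a minimal sketch of the standard input-oriented CCR envelopment formulation, which presumes non-negative data; the data are invented and `scipy` is assumed to be available:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of DMU j0.
    X: (m, n) inputs, Y: (s, n) outputs; columns are DMUs.
    Solves: min theta  s.t.  X @ lam <= theta * X[:, j0],
                             Y @ lam >= Y[:, j0],  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    # decision vector: [theta, lam_1 .. lam_n]; minimize theta
    c = np.zeros(1 + n)
    c[0] = 1.0
    # input rows:  X @ lam - theta * x0 <= 0
    A_in = np.hstack([-X[:, [j0]], X])
    # output rows: -Y @ lam <= -y0
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(m), -Y[:, j0]])
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

# two units with equal output; the second consumes twice the input
X = np.array([[2.0, 4.0]])
Y = np.array([[2.0, 2.0]])
print(ccr_efficiency(X, Y, 0))  # efficient unit: theta = 1.0
print(ccr_efficiency(X, Y, 1))  # theta = 0.5
```

Note that the ratio interpretation behind this formulation breaks down as soon as an input or output can be negative, which is precisely the difficulty the paper addresses.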

Relevance:

100.00%

Publisher:

Abstract:

With the reformation of spectrum policy and the development of cognitive radio, secondary users will be allowed to access spectrums licensed to primary users. Spectrum auctions can facilitate this secondary spectrum access in a market-driven way. To design an efficient auction framework, we first study the supply and demand pressures and the competitive equilibrium of the secondary spectrum market, considering the spectrum reusability. In well-designed auctions, competition among participants should lead to the competitive equilibrium according to the traditional economic point of view. Then, a discriminatory price spectrum double auction framework is proposed for this market. In this framework, rational participants compete with each other by using bidding prices, and their profits are guaranteed to be non-negative. A near-optimal heuristic algorithm is also proposed to solve the auction clearing problem of the proposed framework efficiently. Experimental results verify the efficiency of the proposed auction clearing algorithm and demonstrate that competition among secondary users and primary users can lead to the competitive equilibrium during auction iterations using the proposed auction framework. Copyright © 2011 John Wiley & Sons, Ltd.
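The core idea of discriminatory pricing can be sketched with a greedy clearing rule: pair the highest bids with the lowest asks while the bid covers the ask, and settle each matched pair at its own prices, so no participant trades at a loss. This toy version is only illustrative; it ignores spectrum reusability and the near-optimal heuristic of the paper, and all names are invented:

```python
def clear_double_auction(bids, asks):
    """Greedy discriminatory-price clearing.
    bids: buyers' bid prices; asks: sellers' ask prices.
    Returns one (buyer_idx, seller_idx, pay, receive) tuple per trade.
    A matched buyer pays their own bid and a matched seller receives
    their own ask, so both realized profits are non-negative."""
    order_b = sorted(range(len(bids)), key=lambda i: -bids[i])  # high to low
    order_s = sorted(range(len(asks)), key=lambda i: asks[i])   # low to high
    trades = []
    for bi, si in zip(order_b, order_s):
        if bids[bi] >= asks[si]:
            trades.append((bi, si, bids[bi], asks[si]))
        else:
            break  # remaining pairs cannot trade without a loss
    return trades

print(clear_double_auction([10, 5, 8], [6, 9, 4]))
```

The spread between what a buyer pays and what the matched seller receives is retained by the auctioneer, which is what distinguishes this from a uniform-price clearing rule.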

Relevance:

100.00%

Publisher:

Abstract:

Conventional DEA models assume deterministic, precise and non-negative data for input and output observations. However, real applications may be characterized by observations that are given in the form of intervals and include negative numbers. For instance, the consumption of electricity in decentralized energy resources may be either negative or positive, depending on the heat consumption. Likewise, the heat losses in distribution networks may lie within a certain range, depending on, e.g., external temperature and real-time outtake. Complementing earlier work that addressed the two problems (interval data and negative data) separately, we propose a comprehensive evaluation process for measuring the relative efficiencies of a set of DMUs in DEA. In our general formulation, the intervals may have upper or lower bounds with different signs. The proposed method determines upper and lower bounds for the technical efficiency through the limits of the intervals after decomposition. Based on the interval scores, DMUs are then classified into three classes: the strictly efficient, the weakly efficient and the inefficient. An intuitive ranking approach is presented for the respective classes. The approach is demonstrated through an application to the evaluation of bank branches. © 2013.

Relevance:

100.00%

Publisher:

Abstract:

Over the last few years Data Envelopment Analysis (DEA) has been gaining increasing popularity as a tool for measuring the efficiency and productivity of Decision Making Units (DMUs). Conventional DEA models assume non-negative inputs and outputs. However, in many real applications, some inputs and/or outputs can take negative values. Recently, Emrouznejad et al. [6] introduced a Semi-Oriented Radial Measure (SORM) for modelling DEA with negative data. This paper points out some issues in target setting with SORM models and introduces a modified SORM approach. An empirical study in the banking sector demonstrates the applicability of the proposed model. © 2014 Elsevier Ltd. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

Financial institutes are an integral part of any modern economy. In the 1970s and 1980s, Gulf Cooperation Council (GCC) countries made significant progress in financial deepening and in building a modern financial infrastructure. This study aims to evaluate the performance (efficiency) of the banking sector in GCC countries. Since the selected variables include negative data for some banks and positive data for others, and the available evaluation methods cannot handle this case, we developed a Semi-Oriented Radial Model (SORM) to perform the evaluation. Furthermore, since the SORM result alone provides limited information for decision makers (bankers, investors, etc.), we propose a second-stage analysis using the classification and regression (C&R) method, combining the SORM results with environmental data (financial, economic and political) to derive rules characterizing the efficient banks; the results are thus useful to bankers seeking to improve bank performance and to investors seeking to maximize returns. There are two main approaches to evaluating the performance of Decision Making Units (DMUs), each covering several methods with different assumptions. The parametric approach is based on econometric regression theory, while the nonparametric approach is based on mathematical linear programming. Under the nonparametric approach there are two methods: Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH). Under the parametric approach there are three: Stochastic Frontier Analysis (SFA), Thick Frontier Analysis (TFA) and Distribution-Free Analysis (DFA). The literature shows that DEA and SFA are the most widely applied methods in the banking sector, with DEA apparently the more popular among researchers.
However, DEA, like SFA, still faces many challenges. One of these is how to deal with negative data, since DEA requires all input and output values to be non-negative, while in many applications negative outputs can appear, e.g. losses as opposed to profits. Although a few DEA models have been developed to deal with negative data, each has its own limitations; we therefore developed the SORM to handle the negativity issue in DEA. The application of SORM shows that the overall efficiency of GCC banking is relatively high (85.6%). Although the efficiency score fluctuated over the study period (1998-2007), owing to the second Gulf War and the international financial crisis, it remained higher than the efficiency scores of counterpart banks in other countries. Banks operating in Saudi Arabia appear to be the most efficient, followed by UAE, Omani and Bahraini banks, while banks operating in Qatar and Kuwait appear to be the least efficient; these two countries were the most affected by the second Gulf War. The results also show no statistically significant relationship between operating style (Islamic or conventional) and bank efficiency. Even so, Islamic banks appear somewhat more efficient than conventional banks, with an average efficiency score of 86.33% compared with 85.38%. Furthermore, Islamic banks appear to have been more affected by the political crisis (the second Gulf War), whereas conventional banks appear to have been more affected by the financial crisis.

Relevance:

100.00%

Publisher:

Abstract:

Supply chain formation is the process by which a set of producers within a network determine the subset of these producers able to form a chain to supply goods to one or more consumers at the lowest cost. This problem has been tackled in a number of ways, including auctions, negotiations, and argumentation-based approaches. In this paper we show how this problem can be cast as an optimization of a pairwise cost function. Optimizing this class of energy functions is NP-hard, but efficient approximations to the global minimum can be obtained using loopy belief propagation (LBP). Here we detail a max-sum LBP-based approach to the supply chain formation problem, involving decentralized message-passing between supply chain participants. Our approach is evaluated against a well-known decentralized double-auction method and an optimal centralized technique, showing several improvements on the auction method: it obtains better solutions for most network instances which allow for competitive equilibrium (competitive equilibrium, in Walsh and Wellman's sense, is a set of producer costs which permits a Pareto optimal state in which agents in the allocation receive non-negative surplus and agents not in the allocation would acquire non-positive surplus by participating in the supply chain), while also optimally solving problems where no competitive equilibrium exists, for which the double-auction method frequently produces inefficient solutions. © 2012 Wiley Periodicals, Inc.

Relevance:

40.00%

Publisher:

Abstract:

The objective of the study was to define common reasons for non-adherence (NA) to highly active antiretroviral therapy (HAART) and the number of reasons reported by non-adherent individuals. A confidential questionnaire was administered to HIV-seropositive patients taking proteinase inhibitor based HAART. Median self-reported adherence was 95% (n = 178, range = 60-100%). The most frequent reasons for at least 'sometimes' missing a dose were eating a meal at the wrong time (38.2%), oversleeping (36.3%), forgetting (35.0%) and being in a social situation (30.5%). The mean number of reasons occurring at least 'sometimes' was 3.2; 20% of patients gave six or more reasons; those reporting the lowest adherence reported a significantly greater number of reasons (ρ = -0.59; p < 0.001). Three factors were derived from the data by principal component analysis reflecting 'negative experiences of HAART', 'having a low priority for taking medication' and 'unintentionally missing doses', accounting for 53.8% of the variance. On multivariate analysis only the latter two factors were significantly related to NA (odds ratios 0.845 and 0.849, respectively). There was a wide spectrum of reasons for NA in our population. The number of reasons in an individual increased as adherence became less. A variety of modalities individualized for each patient are required to support patients with the lowest adherence.

Relevance:

30.00%

Publisher:

Abstract:

The accounting profession has come under increased scrutiny over recent years about the growing volume of non-audit fees received from audit clients and the possible negative impact of such fees on auditor independence. The argument advanced is that providing substantial amounts of non-audit services to clients may make it more likely that auditors concede to the wishes of the client management when difficult judgments are made. Such concerns are particularly salient in the case of reporting decisions related to going-concern uncertainties for financially stressed clients. This study empirically examines audit reports provided to financially stressed companies in the United Kingdom and the magnitude of audit and non-audit service fees paid to the company's auditors. We find that the magnitude of both audit fees and non-audit fees is significantly associated with the issuance of a going-concern modified audit opinion. In particular, financially stressed companies with high audit fees are more likely to receive a going-concern modified audit opinion, whereas companies with high non-audit fees are less likely to receive a going-concern modified audit opinion. Additional analyses indicate that the results are generally robust across alternative model and variable specifications. Overall, evidence supports the contention that high non-audit fees have a detrimental effect on going-concern reporting judgments for financially stressed U.K. companies.

Relevance:

30.00%

Publisher:

Abstract:

Patients with non-erosive reflux disease (NERD) report symptoms which commonly fail to improve on conventional antireflux therapies. Oesophageal visceral hyperalgaesia may contribute to symptom generation in NERD and we explore this hypothesis using oesophageal evoked potentials. Fifteen endoscopically confirmed NERD patients (four female, 29–56 years) plus 15 matched healthy volunteers (four female, 23–56 years) were studied. All patients had oesophageal manometry/24-h pH monitoring and all subjects underwent evoked potential and sensory testing, using electrical stimulation of the distal oesophagus. Cumulatively, NERD patients had higher sensory thresholds and increased evoked potential latencies when compared to controls (P = 0.01). In NERD patients, there was a correlation between pain threshold and acid exposure as determined by DeMeester score (r = 0.63, P = 0.02), with increased oesophageal sensitivity being associated with lower DeMeester score. Reflux negative patients had lower pain thresholds when compared to both reflux positive patients and controls. Evoked potentials were normal in reflux negative patients but significantly delayed in the reflux positive group (P = 0.01). We demonstrate that NERD patients form a continuum of oesophageal afferent sensitivity with a correlation between the degree of acid exposure and oesophageal pain thresholds. We provide objective evidence that increased oesophageal pain sensitivity in reflux negative NERD is associated with heightened afferent sensitivity as normal latency evoked potential responses could be elicited with reduced afferent input. Increased oesophageal afferent pain sensitivity may play an important role in a subset of NERD and could offer an alternate therapeutic target.

Relevance:

30.00%

Publisher:

Abstract:

Cells undergoing apoptosis in vivo are rapidly detected and cleared by phagocytes. Swift recognition and removal of apoptotic cells is important for normal tissue homeostasis and failure in the underlying clearance mechanisms has pathological consequences associated with inflammatory and auto-immune diseases. Cell cultures in vitro usually lack the capacity for removal of non-viable cells because of the absence of phagocytes and, as such, fail to emulate the healthy in vivo micro-environment from which dead cells are absent. While a key objective in cell culture is to maintain viability at maximal levels, cell death is unavoidable and non-viable cells frequently contaminate cultures in significant numbers. Here we show that the presence of apoptotic cells in monoclonal antibody-producing hybridoma cultures has markedly detrimental effects on antibody productivity. Removal of apoptotic hybridoma cells by macrophages at the time of seeding resulted in 100% improved antibody productivity that was, surprisingly to us, most pronounced late on in the cultures. Furthermore, we were able to recapitulate this effect using novel super-paramagnetic Dead-Cert Nanoparticles to remove non-viable cells simply and effectively at culture seeding. These results (1) provide direct evidence that apoptotic cells have a profound influence on their non-phagocytic neighbors in culture and (2) demonstrate the effectiveness of a simple dead-cell removal strategy for improving antibody manufacture in vitro.

Relevance:

30.00%

Publisher:

Abstract:

Sixty coagulase-negative staphylococcus (CNS) isolates were recovered from the blood cultures or peritoneal dialysate effluent of 43 patients on renal dialysis. The patients had either renal dialysis catheter-related sepsis (CRS) or continuous ambulatory peritoneal dialysis (CAPD)-associated peritonitis. Isolates were characterized by biotyping, and genotyped by pulsed-field gel electrophoresis (PFGE). Phenotypic properties of the strains were also investigated. Several genotypes were identified with no one specific strain of CNS being associated with CRS. However, closely related strains were isolated from several patients within the units studied, suggesting horizontal transfer of micro-organisms. Genotypic macro-restriction profiles did not concur with phenotypic profiles or biotypes, confirming that genotyping is required for epidemiological studies. All staphylococcal strains were investigated for the production of phenotypic characteristics. Significant differences were predominantly seen in the production of lipase, esterase and elastase in strains isolated from the renal patients with CRS and CAPD-associated peritonitis, compared with a non-septic control group. These phenotypic characteristics may therefore have a role in the maintenance of CRS in renal patients. © 2003 The Hospital Infection Society. Published by Elsevier Science Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

If, in a correlation test, one or both variables are small whole numbers, scores based on a limited scale, or percentages, a non-parametric correlation coefficient should be considered as an alternative to Pearson's 'r'. Kendall's τ and Spearman's rₛ are similar tests, but the former should be considered if the analysis is to be extended to include partial correlations. If the data contain many tied values, then gamma should be considered as a suitable test.
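As a minimal illustration of these alternatives, the three coefficients can be computed side by side on tie-heavy, small-whole-number data of the kind described above; the data are invented and `scipy` is assumed to be available:

```python
from scipy import stats

# small whole numbers with many tied values
x = [1, 2, 2, 3, 3, 3, 4, 5]
y = [2, 1, 3, 3, 4, 3, 5, 5]

r, p_r = stats.pearsonr(x, y)        # parametric coefficient
rho, p_rho = stats.spearmanr(x, y)   # rank-based
tau, p_tau = stats.kendalltau(x, y)  # rank-based; extends to partial correlations

print(f"Pearson r = {r:.3f}, Spearman rho = {rho:.3f}, Kendall tau = {tau:.3f}")
```

With data like these, the rank-based coefficients apply corrections for ties, which is why they are the safer choice here.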

Relevance:

30.00%

Publisher:

Abstract:

1. The techniques associated with regression, whether linear or non-linear, are some of the most useful statistical procedures that can be applied in clinical studies in optometry.
2. In some cases, there may be no scientific model of the relationship between X and Y that can be specified in advance, and the objective may be to provide a 'curve of best fit' for predictive purposes. In such cases, the fitting of a general polynomial type curve may be the best approach.
3. An investigator may have a specific model in mind that relates Y to X, and the data may provide a test of this hypothesis. Some of these curves can be reduced to a linear regression by transformation, e.g., the exponential and negative exponential decay curves.
4. In some circumstances, e.g., the asymptotic curve or logistic growth law, a more complex process of curve fitting involving non-linear estimation will be required.
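The transformation idea can be shown concretely with an exponential decay, y = a·e^(-bx): taking logarithms reduces it to a straight line, which ordinary linear regression can then fit. A small sketch on synthetic noiseless data, with `numpy` assumed available:

```python
import numpy as np

# synthetic data generated from y = a * exp(-b * x)
x = np.linspace(0.0, 5.0, 20)
a_true, b_true = 3.0, 0.7
y = a_true * np.exp(-b_true * x)

# log-transform: ln(y) = ln(a) - b * x, a linear model in x
slope, intercept = np.polyfit(x, np.log(y), 1)
a_est, b_est = np.exp(intercept), -slope

print(a_est, b_est)  # recovers 3.0 and 0.7 on noiseless data
```

Note that with noisy data the log-transform also reweights the errors, which is one reason direct non-linear estimation (point 4) is sometimes preferred.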

Relevance:

30.00%

Publisher:

Abstract:

In this paper we examine the equilibrium states of finite amplitude flow in a horizontal fluid layer with differential heating between the two rigid boundaries. The solutions to the Navier-Stokes equations are obtained by means of a perturbation method for evaluating the Landau constants and through a Newton-Raphson iterative method that results from the Fourier expansion of the solutions that bifurcate above the linear stability threshold of infinitesimal disturbances. The results obtained from these two different methods of evaluating the convective flow are compared in the neighborhood of the critical Rayleigh number. We find that for small Prandtl numbers the discrepancy of the two methods is noticeable. © 2009 The Physical Society of Japan.
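The Newton-Raphson iteration used to solve for the bifurcating Fourier solutions is, at its core, the standard root-finding update x ← x − f(x)/f′(x). A generic one-dimensional sketch (not the paper's multi-dimensional system, where f and its Jacobian act on the vector of Fourier coefficients):

```python
def newton_raphson(f, df, x0, tol=1e-12, max_iter=50):
    """Iterate x <- x - f(x)/df(x) until the step size falls below tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# example: the positive root of f(x) = x**2 - 2, i.e. sqrt(2)
root = newton_raphson(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.5)
print(root)  # ~1.41421356...
```

Convergence is quadratic near a simple root, but a good starting guess is needed; in the paper's setting, that guess is supplied near the bifurcation by the perturbation expansion.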

Relevance:

30.00%

Publisher:

Abstract:

What is the role of pragmatics in the evolution of grammatical paradigms? It is to maintain marked candidates that may come to be the default expression. This perspective is validated by the Jespersen cycle, in which the standard expression of sentential negation is renewed as pragmatically marked negatives achieve default status. How these status changes are effected, however, remains to be documented, and that is what this paper achieves by examining the evolution of the preverbal negative non in Old and Middle French. The negative, which categorically marks pragmatic activation (Dryer 1996) with finite verbs in Old French, loses this value when used with non-finite verbs in Middle French. This process is accompanied by competing semantic reanalyses of the distribution of infinitives negated in this way, and by co-occurrence with a greater lexical variety of verbs. The absence of a pragmatic contribution should lead the marker to take on the role of default, a role already fulfilled by the well-established ne ... pas, pushing non into decline. Hard empirical evidence is thus provided that validates the assumed role of pragmatics in the Jespersen cycle, supporting the general view of pragmatics as supporting alternative candidates that may or may not achieve default status in the evolution of a grammatical paradigm.