28 results for "Maximizing"
Abstract:
Quality of service (QoS) support is critical for collaborative road safety applications based on dedicated short range communications (DSRC) vehicle networks. In this paper we propose an adaptive power and message rate control method for DSRC vehicle networks at road intersections. The design objective is to provide high-availability, low-latency channels for high-priority emergency safety applications while maximizing channel utilization for low-priority routine safety applications. In this method, an offline simulation-based approach is used to identify the best configurations of transmit power and message rate for given numbers of vehicles in the network. The identified configurations are then applied online by roadside access points (APs) according to the estimated number of vehicles. Simulation results show that this adaptive method significantly outperforms a fixed control method. © 2011 Springer-Verlag.
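A minimal sketch of how such an offline-tuned lookup table might be consulted online by a roadside AP; the bin boundaries, power and rate values, and function name are illustrative assumptions, not the configurations reported in the paper.

```python
# Illustrative sketch: offline-optimized (transmit power, message rate) pairs,
# keyed by vehicle-count bins, selected online by a roadside AP.
# The specific numbers are placeholders, not the paper's results.

CONFIG_TABLE = {
    # max vehicles in range: (tx power in dBm, message rate in Hz)
    25:  (20.0, 10.0),
    50:  (17.0, 8.0),
    100: (14.0, 5.0),
    200: (11.0, 3.0),
}

def select_config(estimated_vehicles: int) -> tuple:
    """Return the offline-optimized (power, rate) for the smallest bin
    that covers the estimated number of vehicles."""
    for bound in sorted(CONFIG_TABLE):
        if estimated_vehicles <= bound:
            return CONFIG_TABLE[bound]
    return CONFIG_TABLE[max(CONFIG_TABLE)]  # fall back to the densest setting

print(select_config(80))  # -> (14.0, 5.0)
```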
Abstract:
This paper analyzes a communication network facing users with a continuous distribution of delay cost per unit time. Priority queueing is often used to provide differentiated services for users with different delay sensitivities. Delay is a key dimension of network service quality, so priority is a valuable, limited resource that should be optimally allocated. We investigate the allocation of priority in queues via a simple bidding mechanism. In our mechanism, an arriving user can decide not to enter the network at all or can submit a bid announcing a delay-sensitivity value. A user entering the network obtains priority over all users who bid lower and is charged according to a payment function designed following an exclusion compensation principle. The payment function is proved to be incentive compatible, so the equilibrium bidding behavior implements the "cµ-rule". Maximizing social welfare or revenue by appropriately setting the reserve payment is also analyzed.
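For reference, the classical cµ-rule that the equilibrium implements assigns each user a priority index equal to the product of its delay cost rate and its service rate; among waiting users, the one with the largest index is served first. A standard statement of the rule, not taken from the paper:

```latex
\text{index}(i) \;=\; c_i\,\mu_i ,
\qquad
\text{serve } i^{*} \;=\; \arg\max_{i \in \text{queue}} \; c_i\,\mu_i
```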
Abstract:
Existing formulations of the problem of assigning n jobs to n individuals are limited to costs or profits measured as crisp values. However, in many real applications, costs are not deterministic numbers. This paper develops a procedure based on the Data Envelopment Analysis (DEA) method to solve assignment problems with fuzzy costs or fuzzy profits for each possible assignment. It aims to obtain the points with maximum membership values for the fuzzy parameters while maximizing the profit or minimizing the assignment cost. In this method, a discrete approach is first presented to rank the fuzzy numbers. Then, corresponding to each fuzzy number, a crisp number is introduced using the efficiency concept. A numerical example illustrates the usefulness of this new method. © 2012 Operational Research Society Ltd. All rights reserved.
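The abstract does not spell out the DEA-based ranking, so the sketch below substitutes a simple centroid defuzzification of triangular fuzzy costs followed by a crisp assignment; it illustrates the overall pipeline (rank/defuzzify, then assign), not the paper's exact method, and the cost values are made up for illustration.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Triangular fuzzy costs (low, mode, high) for a hypothetical 3x3 assignment.
fuzzy_costs = np.array([
    [(2, 3, 5), (4, 6, 8), (1, 2, 4)],
    [(3, 5, 7), (2, 2, 3), (5, 7, 9)],
    [(6, 8, 9), (1, 3, 4), (2, 4, 6)],
], dtype=float)

# Simple centroid defuzzification as a stand-in for the DEA-based ranking step.
crisp_costs = fuzzy_costs.mean(axis=2)

# Crisp assignment minimizing total defuzzified cost.
rows, cols = linear_sum_assignment(crisp_costs)
print(list(zip(rows, cols)), crisp_costs[rows, cols].sum())
```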
Abstract:
Background: The prevalence of hearing loss is considerably higher among individuals in residential care than in the community-dwelling population, yet hearing aids and hearing services are relatively underused. Care staff have a key role in supporting access to services. Objectives: This study identifies staff perspectives on hearing loss and their views about potential hearing service improvements. Study design: A four-stage mixed methods study was used, comprising qualitative interviews, observation, a survey and a stakeholder involvement meeting. Results: The qualitative stages indicated that staff were concerned about their levels of interaction with residents. Staff considered maximizing communication to be part of their professional role. The quantitative survey indicated that these views were widely held by staff, and the stakeholder stage identified the need for social support and dedicated staff training opportunities. Conclusion: Care home staff regard communication as a shared issue. Future interventions could enhance access to hearing services and provide care home staff with training in hearing loss and hearing aid management. © 2013 Informa Healthcare.
Resumo:
We have investigated how optimal coding for neural systems changes with the time available for decoding. Optimization was in terms of maximizing information transmission. We have estimated the parameters for Poisson neurons that optimize Shannon transinformation with the assumption of rate coding. We observed a hierarchy of phase transitions from binary coding, for small decoding times, toward discrete (M-ary) coding with two, three and more quantization levels for larger decoding times. We postulate that the presence of subpopulations with specific neural characteristics could be a signiture of an optimal population coding scheme and we use the mammalian auditory system as an example.
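A small numerical sketch of the quantity being optimized: the Shannon transinformation between a set of discrete firing-rate levels and the Poisson spike count observed in a decoding window T. The rate values, priors and window lengths are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np
from scipy.stats import poisson

def transinformation(rates_hz, priors, T, max_count=200):
    """Mutual information (bits) between a discrete rate level and the
    Poisson spike count observed in a decoding window of length T seconds."""
    counts = np.arange(max_count)
    p_n_given_r = np.array([poisson.pmf(counts, r * T) for r in rates_hz])  # P(n | level)
    p_n = priors @ p_n_given_r                                              # marginal P(n)
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.log2(np.where(p_n > 0, p_n_given_r / p_n, 1.0))
        terms = np.where(p_n_given_r > 0, p_n_given_r * logs, 0.0)
    return float(priors @ terms.sum(axis=1))

# Compare a binary code with a three-level code as the decoding window grows.
for T in (0.02, 0.2):
    two = transinformation(np.array([5.0, 100.0]), np.array([0.5, 0.5]), T)
    three = transinformation(np.array([5.0, 50.0, 100.0]), np.ones(3) / 3, T)
    print(f"T={T}s  binary={two:.3f} bits  ternary={three:.3f} bits")
```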
Abstract:
In the contemporary customer-driven supply chain, maximizing customer service plays an equally important role as minimizing costs for a company to retain and increase its competitiveness. This article develops a multiple-criteria optimization approach, combining the analytic hierarchy process (AHP) and an integer linear programming (ILP) model, to aid the design of an optimal logistics distribution network. The proposed approach outperforms traditional cost-based optimization techniques because it considers both quantitative and qualitative factors and aims at maximizing the benefits to both the deliverer and the customers. In the approach, the AHP is used to determine the relative importance weightings, or priorities, of alternative warehouses with respect to some critical customer-oriented criteria. The results of the AHP prioritization are used as the input to the ILP model, whose objective is to select the best warehouses at the lowest possible cost. Two commercial packages, Expert Choice and LINDO, are used in this article.
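The abstract does not give the model itself; a generic warehouse-selection ILP in the spirit described, with the AHP priorities feeding the objective, might be sketched as follows. All symbols here are illustrative assumptions, not the paper's formulation.

```latex
\max_{x_j \in \{0,1\}} \;\; \sum_{j} w_j\, x_j
\quad \text{subject to} \quad
\sum_{j} c_j\, x_j \le B ,
\qquad
\sum_{j} x_j = p
```

where $w_j$ is the AHP priority of candidate warehouse $j$, $c_j$ its cost, $B$ a budget, and $p$ the number of warehouses to open.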
Abstract:
Assessment criteria are increasingly incorporated into teaching, making it important to clarify the pedagogic status of the qualities to which they refer. We reviewed theory and evidence about the extent to which four core criteria for student writing (critical thinking, use of language, structuring, and argument) refer to the outcomes of three types of learning: generic skills learning, a deep approach to learning, and complex learning. The analysis showed that all four core criteria describe, to some extent, properties of text resulting from using skills, but none qualify fully as descriptions of the outcomes of applying generic skills. Most also describe certain aspects of the outcomes of taking a deep approach to learning. Critical thinking and argument correspond most closely to the outcomes of complex learning. At lower levels of performance, use of language and structuring describe the outcomes of applying transferable skills. At higher levels of performance, they describe the outcomes of taking a deep approach to learning. We propose that the type of learning required to meet the core criteria is most usefully and accurately conceptualized as the learning of complex skills, and that this provides a conceptual framework for maximizing the benefits of using assessment criteria as part of teaching. © 2006 Taylor & Francis.
Abstract:
Cell population heterogeneity has attracted great interest for understanding individual cellular performance in response to external stimuli and in the production of targeted products. Physical characterization of single cells and analysis of dynamic gene expression, synthesized proteins, and cellular metabolites from a single cell are reviewed. Advanced techniques have been developed to achieve high throughput and ultrahigh resolution or sensitivity. Single-cell capture methods are discussed as well. How to exploit cellular heterogeneity to maximize cellular productivity is still in its infancy, and control strategies will be formulated once the causes of heterogeneity have been elucidated.
Abstract:
Metrology processes contribute to entire manufacturing systems and can have a considerable impact on the financial investment in coordinate measuring systems. However, there is a lack of generic methodologies to quantify their economic value in today's industry. To address this problem, a mathematical model is proposed in this paper through statistical deductive reasoning, defining the relationships between the process capability index, measurement uncertainty and the tolerance band. The correctness of the mathematical model is verified by a case study. Finally, several comments and suggestions on evaluating and maximizing the benefits of metrology investment are given.
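The abstract does not reproduce the model, but the textbook relationship it builds on can be stated: the spread observed through a measuring system combines true process variation with measurement uncertainty, which deflates the observed capability index relative to the tolerance band. A standard illustration, not the paper's exact model:

```latex
C_p^{\text{obs}} \;=\; \frac{\mathrm{USL} - \mathrm{LSL}}{6\,\sigma_{\text{obs}}},
\qquad
\sigma_{\text{obs}} \;=\; \sqrt{\sigma_{\text{process}}^{2} + u_{\text{meas}}^{2}}
```

Reducing the measurement uncertainty $u_{\text{meas}}$ relative to the tolerance band $\mathrm{USL}-\mathrm{LSL}$ therefore moves the observed $C_p$ closer to the true process capability, which is one route to quantifying the value of metrology investment.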
Abstract:
The quest for sustainable resources to meet the demands of a rapidly rising global population while mitigating the risks of rising CO2 emissions and associated climate change represents a grand challenge for humanity. Biomass offers the most readily implemented and low-cost solution for sustainable transportation fuels, and the only non-petroleum route to organic molecules for the manufacture of bulk, fine and speciality chemicals and polymers. To be considered truly sustainable, biomass must be derived from resources which do not compete with agricultural land use for food production or compromise the environment (e.g. via deforestation). Potential feedstocks include waste lignocellulosic or oil-based materials derived from plant or aquatic sources, with the so-called biorefinery concept offering the co-production of biofuels, platform chemicals and energy, analogous to today's petroleum refineries which deliver both high-volume/low-value (e.g. fuels and commodity chemicals) and low-volume/high-value (e.g. fine/speciality chemicals) products, thereby maximizing biomass valorization. This article addresses the challenges to catalytic biomass processing and highlights recent successes in the rational design of heterogeneous catalysts facilitated by advances in nanotechnology and the synthesis of templated porous materials, as well as the use of tailored catalyst surfaces to generate bifunctional solid acid/base materials or tune hydrophobicity.
Abstract:
Photovoltaic (PV) solar power generation is proven to be effective and sustainable but is currently hampered by relatively high costs and low conversion efficiency. This paper addresses both issues by presenting a low-cost and efficient temperature distribution analysis for identifying PV module mismatch faults by thermography. Mismatch faults reduce the power output and cause potential damage to PV cells. The paper first defines three fault categories in terms of fault levels, which lead to different terminal characteristics of the PV modules. The three fault types are investigated analytically and experimentally, and maintenance suggestions are provided for each fault type. The proposed methodology combines the electrical and thermal characteristics of PV cells subjected to different fault mechanisms through simulation and experimental tests. Furthermore, the fault diagnosis method can be incorporated into maximum power point tracking schemes to shift the operating point of the PV string. The developed technology improves on existing techniques by locating the faulty cell with a thermal camera, providing a remedial measure, and maximizing the power output under faulty conditions.
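A heavily simplified sketch of the thermography idea: flag cells whose temperature deviates from the module median by more than a threshold. The thresholds and category names are hypothetical placeholders, not the fault levels defined in the paper.

```python
import numpy as np

def classify_cells(cell_temps_c, warn_dt=5.0, fault_dt=15.0):
    """Label each cell by its temperature deviation from the module median.
    Thresholds are illustrative, not the paper's fault-level definitions."""
    temps = np.asarray(cell_temps_c, dtype=float)
    dt = temps - np.median(temps)
    labels = np.full(temps.shape, "normal", dtype=object)
    labels[dt >= warn_dt] = "mismatch-suspect"
    labels[dt >= fault_dt] = "faulty"
    return labels

print(classify_cells([31.0, 32.5, 30.8, 39.0, 48.2]))
```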
Abstract:
Principal component analysis (PCA) is well recognized for dimensionality reduction, and kernel PCA (KPCA) has also been proposed for statistical data analysis. However, KPCA fails to detect the nonlinear structure of data well when outliers exist. To mitigate this problem, this paper presents a novel algorithm, named iterative robust KPCA (IRKPCA). IRKPCA works well in dealing with outliers and can be carried out in an iterative manner, which makes it suitable for processing incremental input data. As in traditional robust PCA (RPCA), a binary field is employed to characterize the outlier process, and the optimization problem is formulated as maximizing the marginal distribution of a Gibbs distribution. In this paper, this optimization problem is solved by stochastic gradient descent techniques. In IRKPCA, the outlier process lives in a high-dimensional feature space, and therefore the kernel trick is used. IRKPCA can be regarded as a kernelized version of RPCA and a robust form of the kernel Hebbian algorithm. Experimental results on synthetic data demonstrate the effectiveness of IRKPCA. © 2010 Taylor & Francis.
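For context, a compact numpy sketch of the plain (non-robust) KPCA step that IRKPCA builds on: an RBF kernel is centred in feature space and eigendecomposed to obtain nonlinear components. The robust, iterative outlier handling of IRKPCA is not shown; the kernel width and component count are arbitrary choices.

```python
import numpy as np

def kpca_rbf(X, n_components=2, gamma=1.0):
    """Plain kernel PCA with an RBF kernel (the baseline that IRKPCA makes robust)."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-gamma * sq_dists)
    n = K.shape[0]
    one_n = np.ones((n, n)) / n
    K_centered = K - one_n @ K - K @ one_n + one_n @ K @ one_n  # centre in feature space
    eigvals, eigvecs = np.linalg.eigh(K_centered)               # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:n_components]              # keep the largest ones
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    return K_centered @ alphas                                   # projections of the samples

X = np.random.default_rng(0).normal(size=(100, 3))
print(kpca_rbf(X).shape)  # (100, 2)
```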
Abstract:
The frequency, times and places of charging have a large impact on the Quality of Experience (QoE) of EV drivers, so it is critical to design an effective EV charging scheduling system that improves their QoE. To improve EV charging QoE and the utilization of Charging Stations (CSs), we develop a travel-plan-aware charging scheduling scheme that assigns moving EVs to CSs for charging. In the design of the proposed scheme, both the travel routes of EVs and the utility of CSs are taken into consideration. The assignment of EVs to CSs is modeled as a two-sided many-to-one matching game with the objective of maximizing the system utility, which reflects the satisfaction degrees of EVs and the profits of CSs. A Stable Matching Algorithm (SMA) is proposed to find a stable matching between charging EVs and CSs. Furthermore, an improved Learning-based On-LiNe scheduling Algorithm (LONA) is proposed to be executed by each CS in a distributed manner. The gain in average system utility achieved by the SMA is up to 38.2% compared to the Random Charging Scheduling (RCS) algorithm and 4.67% compared to the Only utility of Electric Vehicle Concerned (OEVC) scheme. The effectiveness of the proposed SMA and LONA is also demonstrated by simulations in terms of the satisfaction ratio of charging EVs and the convergence speed of iteration.
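The abstract gives only the structure of the SMA (a two-sided many-to-one matching between EVs and capacity-limited CSs), so the sketch below uses a generic deferred-acceptance procedure over illustrative utility scores; the preference lists, score values and capacities are assumptions for illustration, not the paper's definitions.

```python
def deferred_acceptance(ev_prefs, cs_scores, capacity):
    """Many-to-one stable matching: EVs propose to CSs in preference order;
    each CS keeps its highest-scoring proposers up to its capacity."""
    matched = {cs: [] for cs in cs_scores}      # cs -> list of accepted EVs
    next_choice = {ev: 0 for ev in ev_prefs}    # index into each EV's preference list
    free = list(ev_prefs)
    while free:
        ev = free.pop()
        if next_choice[ev] >= len(ev_prefs[ev]):
            continue                             # EV exhausted its list; stays unmatched
        cs = ev_prefs[ev][next_choice[ev]]
        next_choice[ev] += 1
        matched[cs].append(ev)
        if len(matched[cs]) > capacity[cs]:
            # Reject the lowest-scoring EV currently held by this CS.
            matched[cs].sort(key=lambda e: cs_scores[cs][e], reverse=True)
            free.append(matched[cs].pop())
    return matched

ev_prefs = {"ev1": ["csA", "csB"], "ev2": ["csA", "csB"], "ev3": ["csA"]}
cs_scores = {"csA": {"ev1": 0.9, "ev2": 0.5, "ev3": 0.7},
             "csB": {"ev1": 0.4, "ev2": 0.8, "ev3": 0.1}}
print(deferred_acceptance(ev_prefs, cs_scores, {"csA": 1, "csB": 2}))
```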