591 results for ISI


Abstract:

This paper presents a simple hybrid computer technique for studying the transient behaviour of queueing systems. The method is superior to a stand-alone analog or digital solution: the analog technique demands excessive hardware, while the digital technique requires appreciable computation time. With a hybrid computer, the analog hardware can be shared, so fewer integrators are needed. The digital processor can store values, play them back at the required time instants, and change the coefficients of the differential equations. By speeding up the integration on the analog computer, a large number of these equations can be solved very quickly. Hybrid simulation is also superior to the analytic technique, because time-varying differential equations are difficult to solve analytically.
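The transient equations in question are the Kolmogorov forward equations for the queue-state probabilities. As a point of reference, here is a minimal all-digital sketch of what the hybrid setup integrates, assuming an M/M/1 queue with a finite buffer of K states (the rates and time horizon below are illustrative, not values from the paper):

    import numpy as np
    from scipy.integrate import solve_ivp

    # Transient M/M/1/K queue: dp0/dt = -lam*p0 + mu*p1,
    # dpn/dt = lam*p_{n-1} - (lam+mu)*pn + mu*p_{n+1} for 0 < n < K,
    # dpK/dt = lam*p_{K-1} - mu*pK.
    lam, mu, K = 0.8, 1.0, 20          # arrival rate, service rate, buffer size

    def kolmogorov(t, p):
        dp = np.zeros_like(p)
        dp[0] = -lam * p[0] + mu * p[1]
        for n in range(1, K):
            dp[n] = lam * p[n-1] - (lam + mu) * p[n] + mu * p[n+1]
        dp[K] = lam * p[K-1] - mu * p[K]
        return dp

    p0 = np.zeros(K + 1); p0[0] = 1.0   # start with an empty queue
    sol = solve_ivp(kolmogorov, (0.0, 30.0), p0, dense_output=True)

    # Transient mean queue length at a few time instants
    for t in (1.0, 5.0, 30.0):
        p = sol.sol(t)
        print(f"t={t:5.1f}  E[N] = {np.dot(np.arange(K + 1), p):.3f}")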

Abstract:

The concept of feature selection in a nonparametric unsupervised learning environment is practically undeveloped, because no true measure of the effectiveness of a feature exists in such an environment. The lack of a feature selection phase preceding the clustering process seriously affects the reliability of such learning. New concepts such as significant features, the level of significance of features, and the immediate neighborhood are introduced; these implicitly meet the need for feature selection in the context of clustering techniques.
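The abstract gives no formulas, so the following is only an illustrative sketch of the general idea, not the paper's measure: score each feature by how much structure its marginal distribution shows (here a hypothetical entropy-based score) and keep features whose score clears an assumed significance level before clustering:

    import numpy as np

    def feature_significance(X, bins=10):
        """Hypothetical per-feature significance score: departure of each
        feature's histogram from uniform. Features whose values clump
        (multi-modal, i.e. cluster-bearing) score high; flat noise scores
        low. An illustration only, not the paper's actual measure."""
        n, d = X.shape
        scores = np.empty(d)
        for j in range(d):
            hist, _ = np.histogram(X[:, j], bins=bins)
            p = hist / n
            p = p[p > 0]
            # Significance = 1 - H/H_max: lower entropy means more structure.
            scores[j] = 1.0 - (-(p * np.log(p)).sum()) / np.log(bins)
        return scores

    rng = np.random.default_rng(0)
    X = np.column_stack([
        np.concatenate([rng.normal(-3, .3, 100), rng.normal(3, .3, 100)]),  # bimodal
        rng.uniform(-3, 3, 200),                                            # noise
    ])
    scores = feature_significance(X)
    selected = np.where(scores > 0.2)[0]   # 0.2 = assumed significance level
    print(scores, "-> keep features", selected)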

Abstract:

An important question which has to be answered in evaluating the suitability of a microcomputer for a control application is the time it would take to execute the specified control algorithm. In this paper, we present a method of obtaining closed-form formulas to estimate this time. These formulas are applicable to control algorithms in which arithmetic operations and matrix manipulations dominate. The method does not require writing detailed programs for implementing the control algorithm. Using this method, the execution times of a variety of control algorithms on a range of 16-bit minicomputers and recently announced microcomputers are calculated. The formulas have been verified independently by an analysis program, which computes the execution-time bounds of control algorithms coded in Pascal when they are run on a specified micro- or minicomputer.
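As a rough illustration of such a closed-form estimate (the per-operation timings below are invented placeholders, not figures from the paper), consider a state-feedback law u = -Kx with an m x n gain matrix, whose evaluation time follows directly from counts of multiplications and additions:

    # Closed-form execution-time estimate for u = -K x, with K an m x n matrix.
    # A matrix-vector product needs m*n multiplications and m*(n-1) additions.
    # The per-operation times below are placeholders, not measured values.

    t_mul, t_add, t_ovh = 25e-6, 8e-6, 150e-6   # seconds: multiply, add, loop/IO overhead

    def control_law_time(m, n):
        """Estimated time to evaluate u = -K x on a hypothetical 16-bit machine."""
        return m * n * t_mul + m * (n - 1) * t_add + t_ovh

    for (m, n) in [(1, 4), (2, 8), (4, 12)]:
        print(f"{m}x{n} gain matrix: ~{control_law_time(m, n) * 1e3:.2f} ms per sample")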

Abstract:

A novel CMOS static RAM cell for ternary logic systems, based on the lambda diode, is described. The operation of the cell has been simulated using the SPICE 2G program, and the results of the simulation are given.

Abstract:

The impurity profile for the second oxidation, used in MOST fabrication, has been obtained by Margalit et al. [1]. The disadvantage of that technique is that the accuracy of the solution depends directly on the computer time expended. In this article, an analytical solution is presented, using the approximation of linearizing the second oxidation procedure.

Abstract:

The availability of a small fleet of aircraft in a flying-base, repair-depot combination is modeled and studied. First, a deterministic flow model relates the parameters of interest and represents the state of the art in the planning of such systems. Second, a cyclic queue model shows the effect of the principal uncertainties in operation and repair, and the consequent decrease in the availability of aircraft at the flying base. Several options are open for increasing the availability, such as increasing the fleet size, investing in additional repair facilities, or building reliability and maintainability into the individual aircraft during its life cycle. A life-cycle cost criterion brings out some of these features. Numerical results confirm Rose's prediction that there exists a minimal-cost combination of end products and repair-depot capability that achieves a prescribed operational availability.
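A minimal sketch of the cyclic-queue (machine-repairman) availability calculation, assuming exponential failure and repair times; the fleet size, channel count, and rates below are illustrative, not values from the paper:

    # Machine-repairman (cyclic queue) model: n = aircraft in the repair depot.
    # Failure rate with n down: (N - n) * lam; repair rate: min(n, c) * mu.
    N, c = 10, 2                 # fleet size, repair channels (illustrative)
    lam, mu = 0.1, 0.5           # per-aircraft failure rate, per-channel repair rate

    # Unnormalised birth-death probabilities pi_n via detailed balance.
    pi = [1.0]
    for n in range(1, N + 1):
        up = (N - (n - 1)) * lam          # rate into state n from n-1
        down = min(n, c) * mu             # rate back from n to n-1
        pi.append(pi[-1] * up / down)
    Z = sum(pi)
    pi = [p / Z for p in pi]

    avail = sum((N - n) * p for n, p in enumerate(pi)) / N
    print(f"Expected fraction of fleet available at the flying base: {avail:.3f}")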

Abstract:

This paper is concerned with the reliability optimization of a spatially redundant system, subject to various constraints, using nonlinear programming. The constrained optimization problem is converted into a sequence of unconstrained problems via a penalty function; each unconstrained problem is then solved by the conjugate gradient method. The advantages of this method are highlighted.
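A minimal sketch of this approach on a hypothetical three-stage series system with continuous redundancy levels: the cost constraint is folded into a quadratic penalty, and each unconstrained subproblem is handed to a conjugate gradient solver (all numbers are illustrative):

    import numpy as np
    from scipy.optimize import minimize

    # Series system of 3 stages; stage i has component reliability r[i] and
    # x[i] parallel (spatially redundant) units, relaxed to be continuous.
    # Maximise R(x) = prod(1 - (1 - r_i)^x_i) subject to cost c.x <= C.
    r = np.array([0.80, 0.85, 0.90])
    c = np.array([2.0, 3.0, 4.0])
    C = 25.0                                   # cost budget (illustrative)

    def penalized(x, rho):
        x = np.maximum(x, 1.0)                 # at least one unit per stage
        logR = np.sum(np.log1p(-(1 - r) ** x))
        violation = max(0.0, c @ x - C)
        return -logR + rho * violation ** 2    # penalty makes it unconstrained

    # Sequential unconstrained minimisation with increasing penalty weight,
    # each subproblem solved by the conjugate gradient method.
    x = np.ones(3)
    for rho in (1.0, 10.0, 100.0):
        x = minimize(penalized, x, args=(rho,), method="CG").x
    x = np.maximum(x, 1.0)
    print("units per stage:", np.round(x, 2), " cost:", round(float(c @ x), 2))
    print("system reliability:", float(np.prod(1 - (1 - r) ** x)))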

Abstract:

In this paper, we first describe a framework to model the sponsored search auction on the web as a mechanism design problem. Using this framework, we describe two well-known mechanisms for sponsored search auctions: Generalized Second Price (GSP) and Vickrey-Clarke-Groves (VCG). We then derive a new mechanism for the sponsored search auction, which we call the optimal (OPT) mechanism. The OPT mechanism maximizes the search engine's expected revenue while achieving Bayesian incentive compatibility and individual rationality of the advertisers. We then undertake a detailed comparative study of the mechanisms GSP, VCG, and OPT. We compute and compare the expected revenue earned by the search engine under the three mechanisms when the advertisers are symmetric and some special conditions are satisfied. We also compare the three mechanisms in terms of incentive compatibility, individual rationality, and computational complexity.

Note to Practitioners: The advertiser-supported web site is one of the successful business models in the emerging web landscape. When an Internet user enters a keyword (i.e., a search phrase) into a search engine, the user gets back a page of results containing the links most relevant to the query and also sponsored links (also called paid advertisement links). When a sponsored link is clicked, the user is directed to the corresponding advertiser's web page. The advertiser pays the search engine in some appropriate manner for sending the user to its web page. Against every search performed by any user on any keyword, the search engine faces the problem of matching a set of advertisers to the sponsored slots. In addition, the search engine also needs to decide on a price to be charged to each advertiser. Due to increasing demand for Internet advertising space, most search engines currently use auction mechanisms for this purpose. These are called sponsored search auctions. A significant percentage of the revenue of Internet giants such as Google, Yahoo!, and MSN comes from sponsored search auctions. In this paper, we study two auction mechanisms, GSP and VCG, which are quite popular in the sponsored auction context, and pursue the objective of designing a mechanism that is superior to both. In particular, we propose a new mechanism which we call the OPT mechanism. This mechanism maximizes the search engine's expected revenue subject to achieving Bayesian incentive compatibility and individual rationality. Bayesian incentive compatibility guarantees that it is optimal for each advertiser to bid his/her true value provided that all other agents also bid their respective true values. Individual rationality ensures that the agents participate voluntarily in the auction, since they are assured of a non-negative payoff by doing so.
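The OPT mechanism requires the full Bayesian setup, but the GSP and VCG payment rules are easy to illustrate under the standard separable click-through-rate model; the bids and CTRs below are made up:

    # Slot CTRs (decreasing) and advertiser bids per click -- illustrative numbers.
    ctr = [0.30, 0.15, 0.05]                            # one entry per sponsored slot
    bids = sorted([4.0, 3.0, 2.0, 1.0], reverse=True)   # advertisers ranked by bid

    m = len(ctr)

    # GSP: the advertiser in slot k pays the next-highest bid, per click.
    gsp = [bids[k + 1] for k in range(m)]

    # VCG: each winner pays the externality it imposes on the others.
    # Recursion: p_k = p_{k+1} + b_{k+1} * (ctr_k - ctr_{k+1}), per impression.
    vcg_total = [0.0] * m
    vcg_total[m - 1] = bids[m] * ctr[m - 1] if len(bids) > m else 0.0
    for k in range(m - 2, -1, -1):
        vcg_total[k] = vcg_total[k + 1] + bids[k + 1] * (ctr[k] - ctr[k + 1])
    vcg_per_click = [vcg_total[k] / ctr[k] for k in range(m)]

    for k in range(m):
        print(f"slot {k+1}: bid {bids[k]:.2f}, GSP pays {gsp[k]:.2f}/click, "
              f"VCG pays {vcg_per_click[k]:.2f}/click")

On these numbers the two rules coincide at the last slot, and VCG charges no more per click than GSP in every slot above it, which is part of why revenue comparisons between such mechanisms are of interest.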

Abstract:

An isolated wind power generation scheme using a slip-ring induction machine (SRIM) is proposed. The proposed scheme maintains constant load voltage and frequency irrespective of wind speed or load variation. The power circuit consists of two back-to-back connected inverters with a common dc link; one inverter is directly connected to the rotor side of the SRIM, and the other is connected to the stator side of the SRIM through an LC filter. The focus here is the development of a negative-sequence compensation method which ensures that, even in the presence of an unbalanced load, the generator experiences almost balanced three-phase currents, with most of the unbalanced current directed through the stator-side converter. The SRIM controller varies the speed of the generator with variation in the wind speed to extract maximum power. The difference between the generated power and the load power is either stored in or extracted from a battery bank, which is interfaced to the common dc link through a multiphase bidirectional fly-back dc-dc converter. The SRIM control scheme, the maximum power point extraction algorithm, and the fly-back converter topology are adopted from the available literature. The proposed scheme is both simulated and experimentally verified.
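This is not the paper's controller, but the symmetrical-component (Fortescue) decomposition underlying any negative-sequence compensation scheme is standard and easy to state; the phasor values below are made up:

    import numpy as np

    # Fortescue (symmetrical-component) decomposition of three-phase current
    # phasors: the negative-sequence part is what the compensator must steer
    # into the stator-side converter. Phasor values below are illustrative.
    a = np.exp(1j * 2 * np.pi / 3)
    Ia = 10.0                                   # an unbalanced three-phase set
    Ib = 8.0 * a**2 * np.exp(1j * 0.2)
    Ic = 9.5 * a

    I_pos = (Ia + a * Ib + a**2 * Ic) / 3       # positive sequence
    I_neg = (Ia + a**2 * Ib + a * Ic) / 3       # negative sequence
    I_zero = (Ia + Ib + Ic) / 3                 # zero sequence

    for name, I in [("positive", I_pos), ("negative", I_neg), ("zero", I_zero)]:
        print(f"{name:8s}: |I| = {abs(I):6.3f} A, "
              f"angle = {np.degrees(np.angle(I)):7.2f} deg")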

Abstract:

Background: The estimated likelihood of lower limb amputation is 10 to 30 times higher amongst people with diabetes compared to those without diabetes. Of all non-traumatic amputations in people with diabetes, 85% are preceded by a foot ulcer. Foot ulceration associated with diabetes (diabetic foot ulcers) is caused by the interplay of several factors, most notably diabetic peripheral neuropathy (DPN), peripheral arterial disease (PAD) and changes in foot structure. These factors have been linked to chronic hyperglycaemia (high levels of glucose in the blood) and the altered metabolic state of diabetes. Control of hyperglycaemia may be important in the healing of ulcers.

Objectives: To assess the effects of intensive glycaemic control compared to conventional control on the outcome of foot ulcers in people with type 1 and type 2 diabetes.

Search methods: In December 2015 we searched: the Cochrane Wounds Specialised Register; the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library); Ovid MEDLINE; Ovid MEDLINE (In-Process & Other Non-Indexed Citations); Ovid EMBASE; EBSCO CINAHL; Elsevier SCOPUS; ISI Web of Knowledge Web of Science; BioMed Central and LILACS. We also searched clinical trial databases, pharmaceutical trial databases and current international and national clinical guidelines on diabetes foot management for relevant published, non-published, ongoing and terminated clinical trials. There were no restrictions based on language, date of publication or study setting.

Selection criteria: Published, unpublished and ongoing randomised controlled trials (RCTs) were considered for inclusion where they investigated the effects of intensive glycaemic control on the outcome of active foot ulcers in people with diabetes. Non-randomised and quasi-randomised trials were excluded. In order to be included, a trial had to have: 1) attempted to maintain or control blood glucose levels and measured changes in markers of glycaemic control (HbA1c or fasting, random, mean, home capillary or urine glucose), and 2) documented the effect of these interventions on active foot ulcer outcomes. Glycaemic interventions included subcutaneous insulin administration, continuous insulin infusion, oral anti-diabetes agents, lifestyle interventions or a combination of these interventions. The defining feature of the interventional (intensive) group was a lower glycaemic target than that of the comparison (conventional) group.

Data collection and analysis: All review authors independently evaluated the papers identified by the search strategy against the inclusion criteria. Two review authors then independently reviewed all potential full-text articles and trials registry results for inclusion.

Main results: We identified only one trial that met the inclusion criteria, but this trial did not have any results, so we could not perform the planned subgroup and sensitivity analyses. Two ongoing trials were identified which may provide data for analyses in a later version of this review; their completion dates are currently unknown.

Authors' conclusions: The current review failed to find any completed randomised clinical trials with results. We are therefore unable to conclude whether intensive glycaemic control, compared to conventional glycaemic control, has a positive or detrimental effect on the treatment of foot ulcers in people with diabetes. Previous evidence has, however, highlighted a reduction in the risk of limb amputation (from various causes) in people with type 2 diabetes under intensive glycaemic control. Whether this applies to people with foot ulcers in particular is unknown. The exact role of intensive glycaemic control in treating foot ulcers within multidisciplinary care (alongside other interventions targeted at treating foot ulcers) requires further investigation.

Abstract:

This paper deals with low maximum-likelihood (ML)-decoding complexity, full-rate and full-diversity space-time block codes (STBCs), which also offer large coding gain, for the 2 transmit antenna, 2 receive antenna (2 x 2) and the 4 transmit antenna, 2 receive antenna (4 x 2) MIMO systems. Presently, the best known STBC for the 2 x 2 system is the Golden code, and that for the 4 x 2 system is the DjABBA code. Following the approach of Biglieri, Hong, and Viterbo, a new STBC is presented in this paper for the 2 x 2 system. This code matches the Golden code in performance and ML-decoding complexity for square QAM constellations, while it has lower ML-decoding complexity with the same performance for non-rectangular QAM constellations. This code is also shown to be information-lossless and diversity-multiplexing gain (DMG) tradeoff optimal. The design procedure is then extended to the 4 x 2 system, and a code which outperforms the DjABBA code for QAM constellations with lower ML-decoding complexity is presented. So far, the Golden code has been reported to have an ML-decoding complexity of the order of M^4 for square QAM of size M. In this paper, a scheme that reduces its ML-decoding complexity to M^2√M is presented.
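For reference, the standard Golden code encoder (well documented in the literature) maps four QAM symbols to one 2 x 2 codeword sent over two channel uses; theta is the golden ratio, which gives the code its name:

    import numpy as np

    # Standard Golden code construction: theta is the golden ratio.
    theta = (1 + np.sqrt(5)) / 2
    theta_bar = (1 - np.sqrt(5)) / 2
    alpha = 1 + 1j * (1 - theta)
    alpha_bar = 1 + 1j * (1 - theta_bar)

    def golden_codeword(s1, s2, s3, s4):
        """Map four QAM symbols to a 2 x 2 Golden code codeword
        (rows = antennas, columns = channel uses)."""
        return (1 / np.sqrt(5)) * np.array([
            [alpha * (s1 + s2 * theta),
             alpha * (s3 + s4 * theta)],
            [1j * alpha_bar * (s3 + s4 * theta_bar),
             alpha_bar * (s1 + s2 * theta_bar)],
        ])

    # Example with 4-QAM symbols
    X = golden_codeword(1 + 1j, -1 + 1j, 1 - 1j, -1 - 1j)
    print(np.round(X, 3))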

Abstract:

In this paper, we present a low-complexity algorithm for detection in high-rate, non-orthogonal space-time block coded (STBC) large-multiple-input multiple-output (MIMO) systems that achieve high spectral efficiencies of the order of tens of bps/Hz. We also present a training-based iterative detection/channel estimation scheme for such large STBC MIMO systems. Our simulation results show that excellent bit error rate and nearness-to-capacity performance are achieved by the proposed multistage likelihood ascent search (M-LAS) detector in conjunction with the proposed iterative detection/channel estimation scheme at low complexities. The fact that we could show such good results for large STBCs like 16 x 16 and 32 x 32 STBCs from Cyclic Division Algebras (CDA) operating at spectral efficiencies in excess of 20 bps/Hz (even after accounting for the overheads meant for pilot-based training for channel estimation and turbo coding) establishes the effectiveness of the proposed detector and channel estimator. We decode perfect codes of large dimensions using the proposed detector. With the feasibility of such a low-complexity detection/channel estimation scheme, large-MIMO systems with tens of antennas operating at several tens of bps/Hz spectral efficiencies can become practical, enabling interesting high data rate wireless applications.
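A bare-bones likelihood ascent search for BPSK symbols conveys the flavour of the detector, though it is not the paper's full multistage M-LAS algorithm; the dimensions and noise level below are illustrative:

    import numpy as np

    def las_detect(H, y, x0):
        """One-symbol-update likelihood ascent search for BPSK (+/-1) symbols:
        from a starting vector x0 (e.g. a matched-filter decision), repeatedly
        flip the single symbol that most reduces ||y - Hx||^2, until no flip
        helps. A sketch only, not the paper's multistage M-LAS."""
        x = x0.astype(float).copy()
        col_norms = np.sum(np.abs(H) ** 2, axis=0)       # ||h_k||^2 per column
        while True:
            z = (H.conj().T @ (y - H @ x)).real          # residual correlations
            delta = 4 * x * z + 4 * col_norms            # cost change per flip
            k = np.argmin(delta)
            if delta[k] >= 0:                            # local ML point reached
                return np.sign(x)
            x[k] = -x[k]

    # Tiny 8 x 8 real-MIMO demo with a made-up noise level
    rng = np.random.default_rng(1)
    H = rng.standard_normal((8, 8))
    x_true = rng.choice([-1.0, 1.0], size=8)
    y = H @ x_true + 0.3 * rng.standard_normal(8)
    x0 = np.sign(H.T @ y)                                # matched-filter start
    print("symbol errors:", int(np.sum(las_detect(H, y, x0) != x_true)))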

Abstract:

Biological systems present remarkable adaptation, reliability, and robustness in various environments, even under hostility. Most of them are controlled by individuals in a distributed and self-organized way. These biological mechanisms provide useful resources for designing dynamical and adaptive routing schemes for wireless mobile sensor networks, in which the individual nodes should ideally operate without central control. This paper investigates crucial biologically inspired mechanisms, and the associated techniques for resolving routing in wireless sensor networks, including ant-based and genetic approaches. The principal contributions of this paper are as follows: we present a mathematical theory of biological computation in the context of sensor networks; we then present a generalized routing framework for sensor networks by diffusing different modes of biological computation using ant-based and genetic approaches; finally, an overview of several emerging research directions is given within this new computational framework.
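A minimal sketch of the ant-based ingredient: each node keeps a pheromone value per neighbour, forwarding samples a neighbour with probability proportional to pheromone^alpha * heuristic^beta, and deliveries reinforce the path while evaporation forgets stale routes. All constants and the tiny topology below are illustrative:

    import random

    # Weights for pheromone and heuristic terms, evaporation rate, deposit size.
    ALPHA, BETA, RHO, Q = 1.0, 2.0, 0.1, 1.0

    def choose_next_hop(pheromone, hops):
        """pheromone, hops: dicts neighbour -> value; sample a neighbour with
        probability proportional to pheromone^ALPHA * (1/hops)^BETA."""
        weights = {n: (pheromone[n] ** ALPHA) * ((1.0 / hops[n]) ** BETA)
                   for n in pheromone}
        total = sum(weights.values())
        r, acc = random.uniform(0, total), 0.0
        for n, w in weights.items():
            acc += w
            if acc >= r:
                return n

    def update_pheromone(pheromone, delivered_via, path_cost):
        """Evaporate everywhere, then reinforce the neighbour that delivered."""
        for n in pheromone:
            pheromone[n] *= (1.0 - RHO)
        pheromone[delivered_via] += Q / path_cost

    pher = {"B": 1.0, "C": 1.0}; hops = {"B": 3, "C": 5}   # node A's local view
    for _ in range(20):                                     # B keeps delivering
        update_pheromone(pher, "B", hops["B"])
    print(pher, "->", choose_next_hop(pher, hops))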

Abstract:

In this paper, we exploit the idea of decomposition to match buyers and sellers in an electronic exchange for trading large volumes of homogeneous goods, where the buyers and sellers specify marginal-decreasing piecewise constant price curves to capture volume discounts. Such exchanges are relevant for automated trading in many e-business applications. The problem of determining winners and Vickrey prices in such exchanges is known to have a worst-case complexity equal to that of as many as (1 + m + n) NP-hard problems, where m is the number of buyers and n is the number of sellers. Our method solves the overall exchange problem as two separate and simpler problems: 1) a forward auction and 2) a reverse auction, each of which turns out to be a generalized knapsack problem. In the proposed approach, we first determine the quantity of units to be traded between the sellers and the buyers using fast heuristics developed by us. Next, we solve the forward auction and the reverse auction using fully polynomial-time approximation schemes available in the literature. The proposed approach has worst-case polynomial time complexity, and our experimentation shows that it produces good-quality solutions to the problem.

Note to Practitioners: In recent times, electronic marketplaces have provided an efficient way for businesses and consumers to trade goods and services. The use of innovative mechanisms and algorithms has made it possible to improve the efficiency of electronic marketplaces by enabling optimization of revenues for the marketplace and of utilities for the buyers and sellers. In this paper, we look at single-item, multiunit electronic exchanges. These are electronic marketplaces where buyers submit bids and sellers ask for multiple units of a single item. We allow buyers and sellers to specify volume discounts using suitable functions. Such exchanges are relevant for high-volume business-to-business trading of standard products, such as silicon wafers, very large-scale integrated chips, desktops, telecommunications equipment, commoditized goods, etc. The problem of determining winners and prices in such exchanges is known to involve solving many NP-hard problems. Our paper exploits the familiar idea of decomposition, uses certain algorithms from the literature, and develops two fast heuristics to solve the problem in a near-optimal way in worst-case polynomial time.
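The paper's own heuristics are not spelled out in the abstract, but the first stage (fixing the traded quantity) can be illustrated by a simple step-curve intersection: sort buyer steps by price descending and seller steps ascending, then match units while the buyer's marginal price meets the seller's. The curves below are made up:

    # Each buyer/seller bid is a marginal-decreasing/increasing piecewise-constant
    # price curve, flattened into (price, quantity) steps. A simple illustrative
    # heuristic for stage one (the traded quantity): sort buyer steps by price
    # descending, seller steps ascending, and match while demand price >= supply.
    buyer_steps = [(10, 50), (9, 40), (7, 60), (6, 30)]     # (price, units)
    seller_steps = [(5, 40), (6, 50), (8, 70), (9, 20)]

    buyer_steps.sort(key=lambda s: -s[0])
    seller_steps.sort(key=lambda s: s[0])

    traded, i, j = 0, 0, 0
    db, ds = buyer_steps[0][1], seller_steps[0][1]          # units left in steps
    while i < len(buyer_steps) and j < len(seller_steps) \
            and buyer_steps[i][0] >= seller_steps[j][0]:
        q = min(db, ds)
        traded += q
        db -= q; ds -= q
        if db == 0:
            i += 1
            db = buyer_steps[i][1] if i < len(buyer_steps) else 0
        if ds == 0:
            j += 1
            ds = seller_steps[j][1] if j < len(seller_steps) else 0
    print("total units traded:", traded)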