933 results for Principle of reason


Relevance: 80.00%

Abstract:

When the orthogonal space-time block code (STBC), or Alamouti code, is applied in a multiple-input multiple-output (MIMO) communications system, optimum reception can be achieved by a simple signal decoupling at the receiver. The performance, however, deteriorates significantly in the presence of co-channel interference (CCI) from other users. In this paper, this CCI problem is overcome by applying independent component analysis (ICA), a blind source separation algorithm. The approach rests on the fact that, if the transmission data from the transmit antennas are mutually independent, they can be effectively separated at the receiver by the principle of blind source separation; equivalently, the CCI is suppressed. Although the ICA algorithm itself does not require them, a small number of training data are necessary to eliminate the phase and order ambiguities at the ICA outputs, leading to a semi-blind approach. Numerical simulations verify the proposed ICA approach in the multiuser MIMO system.
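The "simple signal decoupling" the abstract refers to can be sketched for a single-user 2x1 Alamouti system with perfect channel knowledge and no CCI (the ICA separation stage is omitted; the symbol and channel values below are illustrative assumptions):

```python
import numpy as np

# One Alamouti block of QPSK symbols (illustrative values)
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)

# Flat-fading 2x1 channel: two transmit antennas, one receive antenna
h1, h2 = 0.8 - 0.6j, 0.3 + 1.1j

# Transmission over two symbol periods:
#   t1: antennas send (s1, s2);  t2: antennas send (-conj(s2), conj(s1))
r1 = h1 * s1 + h2 * s2
r2 = -h1 * np.conj(s2) + h2 * np.conj(s1)

# Receiver decoupling: the orthogonality of the code separates s1 and s2
gain = abs(h1) ** 2 + abs(h2) ** 2
s1_hat = (np.conj(h1) * r1 + h2 * np.conj(r2)) / gain
s2_hat = (np.conj(h2) * r1 - h1 * np.conj(r2)) / gain
```

In the noiseless case the decoupled estimates recover the transmitted symbols exactly, which is why the optimum receiver reduces to per-symbol detection.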

Relevance: 80.00%

Abstract:

Most research on distributed space-time block coding (STBC) has so far focused on the case of 2 relay nodes and assumed that the relay nodes are perfectly synchronised at the symbol level. By applying STBC to 3- or 4-relay-node systems, this paper shows that imperfect synchronisation causes significant performance degradation in the conventional detector. To this end, we propose a new STBC detection solution based on the principle of parallel interference cancellation (PIC). The PIC detector is moderate in computational complexity but very effective in suppressing the impact of imperfect synchronisation.
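The PIC principle itself can be sketched in a generic two-stream linear model (the matrices and noise values below are illustrative assumptions, not the paper's asynchronous relay channel): detect, rebuild each stream's interference from the current decisions, cancel it, and re-detect.

```python
import numpy as np

# Toy two-stream linear model r = H s + noise (illustrative values only)
H = np.array([[1.0, 0.4],
              [0.3, 1.0]])
s = np.array([1.0, -1.0])                 # true BPSK symbols
noise = np.array([0.02, -0.01])           # small fixed perturbation
r = H @ s + noise

# Stage 0: conventional matched-filter detection, ignoring interference
s_hat = np.sign(H.T @ r)

# PIC stages: cancel the other stream's reconstructed contribution,
# then re-detect each stream on the cleaned signal
for _ in range(3):
    updated = np.empty_like(s_hat)
    for k in range(2):
        residual = r - H[:, 1 - k] * s_hat[1 - k]   # cancel interference
        updated[k] = np.sign(H[:, k] @ residual)
    s_hat = updated
```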

Relevance: 80.00%

Abstract:

A fundamental principle in practical nonlinear data modeling is the parsimonious principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross validation is often used to estimate generalization errors when choosing amongst different network architectures (M. Stone, "Cross validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon the minimization of LOO criteria, namely the mean square of the LOO errors for regression and the LOO misclassification rate for classification, we present two backward elimination algorithms as model post-processing procedures. The proposed backward elimination procedures exploit an orthogonalization procedure to ensure orthogonality between the subspace spanned by the pruned model and the deleted regressor. It is then shown that the LOO criteria used in both algorithms can be calculated via an analytic recursive formula, derived in this contribution, without actually splitting the estimation data set, so as to reduce computational expense. Compared to most other model construction methods, the proposed algorithms are advantageous in several respects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic, without additional stopping criteria; and (iii) the model structure selection is directly based on model generalization performance. Illustrative examples on regression and classification demonstrate that the proposed algorithms are viable post-processing methods for pruning a model to gain extra sparsity and improved generalization.
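The idea of computing LOO errors analytically, without actually refitting on each deleted point, can be sketched for ordinary least squares (a simpler setting than the paper's orthogonal-decomposition recursion, but the same underlying identity): the LOO residual is the in-sample residual divided by one minus the leverage.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 30, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=n)

# Closed-form LOO residuals for linear least squares:
#   e_loo[i] = e[i] / (1 - h[i, i]),  with h = X (X'X)^{-1} X'
beta = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ beta
leverage = np.einsum('ij,jk,ik->i', X, np.linalg.inv(X.T @ X), X)
e_loo = e / (1 - leverage)

# Brute-force check: actually refit with each point deleted
e_loo_direct = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    b_i = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]
    e_loo_direct[i] = y[i] - X[i] @ b_i
```

The two computations agree to machine precision, while the closed form requires only one fit, which is the computational saving the abstract describes.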

Relevance: 80.00%

Abstract:

Video, duration: 7 min, HD DVC Pro, 2009. Based on the principle of assembling a series of improvised acts, the performance is driven by a concern for image, sound and gesture and by the staging of both contemplative and active human presences. Featuring a woman-and-child duo, the video performs the élan vital within a fairytale scenery.

Relevance: 80.00%

Abstract:

A bit-level processing (BLP) based linear CDMA detector is derived following the principle of minimum variance distortionless response (MVDR). The combining taps for the MVDR detector are determined from (1) the covariance matrix of the matched filter output, and (2) the corresponding row (or column) of the user correlation matrix. Due to the interference suppression capability of MVDR and the fact that no inversion of the user correlation matrix is involved, the influence of synchronisation errors is greatly reduced. The detector performance is demonstrated via computer simulations in which both synchronisation errors and intercell interference are considered.
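The MVDR tap computation can be sketched generically (the signatures, powers, and noise floor below are illustrative assumptions, not the paper's CDMA setup): the taps minimise output power subject to unit response to the desired user's signature.

```python
import numpy as np

# Illustrative unit-norm signatures for the desired user (a) and one
# strong, partially correlated interferer (b)
a = np.array([1.0, 1.0, 1.0, 1.0]) / 2.0
b = np.array([1.0, -1.0, 1.0, 1.0]) / 2.0

# Covariance of the matched-filter output: desired user (unit power),
# interferer (10x power), and a small noise floor
R = np.outer(a, a) + 10.0 * np.outer(b, b) + 0.01 * np.eye(4)

# MVDR combining taps: minimise w' R w subject to w' a = 1, giving
#   w = R^{-1} a / (a' R^{-1} a)
Ri_a = np.linalg.solve(R, a)
w = Ri_a / (a @ Ri_a)
```

The distortionless constraint holds by construction, and the strong interferer is almost nulled even though its signature correlates with the desired one, which is the interference-suppression property the abstract invokes.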

Relevance: 80.00%

Abstract:

We consider the general response theory recently proposed by Ruelle for describing the impact of small perturbations to the non-equilibrium steady states resulting from Axiom A dynamical systems. We show that the causality of the response functions entails the possibility of writing a set of Kramers-Kronig (K-K) relations for the corresponding susceptibilities at all orders of nonlinearity. Nonetheless, only a special class of directly observable susceptibilities obey K-K relations. Specific results are provided for the case of arbitrary-order harmonic response, which allows for a very comprehensive K-K analysis and the establishment of sum rules connecting the asymptotic behavior of the harmonic generation susceptibility to the short-time response of the perturbed system. These results place previous findings obtained for optical systems and simple mechanical models in a more general theoretical framework, and shed light on the very general impact of considering the principle of causality for testing self-consistency: the described dispersion relations constitute unavoidable benchmarks that any experimental and model-generated dataset must obey. The theory exposed in the present paper is dual to the time-dependent theory of perturbations to equilibrium states and to non-equilibrium steady states, and has in principle a similar range of applicability and limitations. In order to connect the equilibrium and the non-equilibrium steady state case, we show how to rewrite the classical response theory by Kubo so that response functions formally identical to those proposed by Ruelle, apart from the measure involved in the phase space integration, are obtained. These results, taking into account the chaotic hypothesis by Gallavotti and Cohen, might be relevant in several fields, including climate research. In particular, whereas the fluctuation-dissipation theorem does not work for non-equilibrium systems, because of the non-equivalence between internal and external fluctuations, K-K relations might be robust tools for the definition of a self-consistent theory of climate change.
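The way causality forces a K-K relation can be illustrated numerically for a toy causal response (an assumption for illustration, not one of the paper's systems): the real part of the susceptibility is recovered from its imaginary part via the principal-value integral.

```python
import numpy as np

# Causal toy response g(t) = exp(-t) for t >= 0 has susceptibility
#   chi(w) = 1 / (1 - i w):  Re chi = 1/(1+w^2),  Im chi = w/(1+w^2).
# Causality implies the Kramers-Kronig relation
#   Re chi(w) = (1/pi) P.V. Integral of Im chi(w') / (w' - w) dw'.
h = 0.01
wp = np.arange(-200.0, 200.0, h) + h / 2.0   # offset grid avoids w' == w
im_chi = wp / (1.0 + wp ** 2)

def kk_real(w):
    """Approximate the principal-value K-K integral on the offset grid."""
    return np.sum(im_chi / (wp - w)) * h / np.pi
```

With the grid offset, the principal value is handled by near-cancellation of the symmetric points around the singularity; the reconstruction matches 1/(1+w^2) to within the truncation error of the finite frequency window.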

Relevance: 80.00%

Abstract:

The task of this paper is to develop a Time-Domain Probe Method for the reconstruction of impenetrable scatterers. The basic idea of the method is to use pulses in the time domain and the time-dependent response of the scatterer to reconstruct its location and shape. The method is based on the basic causality principle of time-dependent scattering. It is independent of the boundary condition and is applicable to limited aperture scattering data. In particular, we discuss the reconstruction of the shape of a rough surface in three dimensions from time-domain measurements of the scattered field. In practice, measurement data is collected where the incident field is given by a pulse. We formulate the time-domain field reconstruction problem equivalently via frequency-domain integral equations or via a retarded boundary integral equation based on results of Bamberger, Ha-Duong, and Lubich. In contrast to pure frequency-domain methods, here we use a time-domain characterization of the unknown shape for its reconstruction. Our paper describes the Time-Domain Probe Method and relates it to previous frequency-domain approaches on sampling and probe methods by Colton, Kirsch, Ikehata, Potthast, Luke, Sylvester et al. The approach significantly extends recent work of Chandler-Wilde and Lines (2005) and Luke and Potthast (2006) on the time-domain point source method. We provide a complete convergence analysis for the method for the rough surface scattering case and provide numerical simulations and examples.
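The causality idea behind time-domain probing, that arrival times of a scattered pulse constrain the scatterer's location, can be illustrated with a toy travel-time analogue (not the Time-Domain Probe Method itself; geometry, receiver positions, and the grid-search reconstruction are all illustrative assumptions):

```python
import numpy as np

# A pulse from source s scatters at unknown point p; each receiver r_i
# records the arrival time |s - p| + |p - r_i| (wave speed = 1).
s = np.array([0.0, 0.0])
receivers = np.array([[4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])
p_true = np.array([1.5, 2.0])

times = np.linalg.norm(s - p_true) + np.linalg.norm(receivers - p_true, axis=1)

# Reconstruct p by minimising the travel-time misfit over a grid
xs = np.linspace(0.0, 4.0, 81)
best, best_err = None, np.inf
for x in xs:
    for y in xs:
        q = np.array([x, y])
        t = np.linalg.norm(s - q) + np.linalg.norm(receivers - q, axis=1)
        err = np.sum((t - times) ** 2)
        if err < best_err:
            best, best_err = q, err
p_hat = best
```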

Relevance: 80.00%

Abstract:

A military operation is about to take place during an ongoing international armed conflict; it can be carried out either by aerial attack, which is expected to cause the deaths of enemy civilians, or by using ground troops, which is expected to cause the deaths of fewer enemy civilians but is expected to result in more deaths of compatriot soldiers. Does the principle of proportionality in international humanitarian law impose a duty on an attacker to expose its soldiers to life-threatening risks in order to minimise or avert risks of incidental damage to enemy civilians? If such a duty exists, is it absolute or qualified? And if it is a qualified duty, what considerations may be taken into account in determining its character and scope? This article presents an analytic framework under the current international humanitarian law (IHL) legal structure, following a proportionality analysis. The proposed framework identifies five main positions for addressing the above queries. The five positions are arranged along two ‘axes’: a value ‘axis’, which identifies the value assigned to the lives of compatriot soldiers in relation to lives of enemy civilians; and a justification ‘axis’, which outlines the justificatory bases for assigning certain values to lives of compatriot soldiers and enemy civilians: intrinsic, instrumental or a combination thereof. The article critically assesses these positions, and favours a position which attributes a value to compatriot soldiers’ lives, premised on a justificatory basis which marries intrinsic considerations with circumscribed instrumental considerations, avoiding the indeterminacy and normative questionability entailed by more expansive instrumental considerations.

Relevance: 80.00%

Abstract:

This paper develops an account of the normative basis of priority setting in health care as combining the values which a given society holds for the common good of its members with the universal provided by a principle of common humanity. We discuss national differences in health baskets in Europe and argue that health care decision-making in complex social and moral frameworks is best thought of as anchored in such a principle by drawing on the philosophy of need. We show that health care needs are ethically 'thick' needs whose psychological and social construction can best be understood in terms of David Wiggins's notion of vital need: a person's need is vital when failure to meet it leads to their harm and suffering. The moral dimension of priority setting which operates across different societies' health care systems is located in the demands both of and on any society to avoid harm to its members.

Relevance: 80.00%

Abstract:

Climate change is leading to the development of land-based mitigation and adaptation strategies that are likely to have substantial impacts on global biodiversity. Of these, approaches to maintain carbon within existing natural ecosystems could have particularly large benefits for biodiversity. However, the geographical distributions of terrestrial carbon stocks and biodiversity differ. Using conservation planning analyses for the New World and Britain, we conclude, as have previous studies, that a carbon-only strategy would not be effective at conserving biodiversity. Nonetheless, we find that a combined carbon-biodiversity strategy could simultaneously protect 90% of carbon stocks (relative to a carbon-only conservation strategy) and > 90% of the biodiversity (relative to a biodiversity-only strategy) in both regions. This combined approach encapsulates the principle of complementarity, whereby locations that contain different sets of species are prioritised, and hence disproportionately safeguards localised species that are not protected effectively by carbon-only strategies. It is efficient because localised species are concentrated into small parts of the terrestrial land surface, whereas carbon is somewhat more evenly distributed, and carbon stocks protected in one location are equivalent to those protected elsewhere. Efficient compromises can only be achieved when biodiversity and carbon are incorporated together within a spatial planning process.
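The complementarity principle can be sketched as a greedy site-selection loop over toy data (the sites, species sets, and carbon stocks below are invented for illustration, not the paper's datasets): sites are scored by how many as-yet-unprotected species they add, with carbon as a tie-breaker.

```python
# Each candidate site holds a set of species and a carbon stock (tonnes,
# illustrative). Greedy complementarity-based selection:
sites = {
    'A': ({'sp1', 'sp2'}, 120.0),
    'B': ({'sp2', 'sp3'}, 300.0),
    'C': ({'sp4'}, 80.0),
    'D': ({'sp1', 'sp3'}, 250.0),
}

covered, chosen = set(), []
while len(covered) < 4:                      # target: all four species
    # complementarity score: new species first, carbon as tie-breaker
    name = max(sites, key=lambda k: (len(sites[k][0] - covered), sites[k][1]))
    chosen.append(name)
    covered |= sites.pop(name)[0]
```

Note how site C, low in carbon but holding a species found nowhere else, is selected anyway: that is the sense in which complementarity disproportionately safeguards localised species.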

Relevance: 80.00%

Abstract:

We consider the problem of constructing balance dynamics for rapidly rotating fluid systems. It is argued that the conventional Rossby number expansion—namely expanding all variables in a series in Rossby number—is secular for all but the simplest flows. In particular, the higher-order terms in the expansion grow exponentially on average, and for moderate values of the Rossby number the expansion is, at best, useful only for times of the order of the doubling times of the instabilities of the underlying quasi-geostrophic dynamics. Similar arguments apply in a wide class of problems involving a small parameter and sufficiently complex zeroth-order dynamics. A modified procedure is proposed which involves expanding only the fast modes of the system; this is equivalent to an asymptotic approximation of the slaving relation that relates the fast modes to the slow modes. The procedure is systematic and thus capable, at least in principle, of being carried to any order—unlike procedures based on truncations. We apply the procedure to construct higher-order balance approximations of the shallow-water equations. At the lowest order quasi-geostrophy emerges. At the next order the system incorporates gradient-wind balance, although the balance relations themselves involve only linear inversions and hence are easily applied. There is a large class of reduced systems associated with various choices for the slow variables, but the simplest ones appear to be those based on potential vorticity.
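The idea of expanding only the fast modes via a slaving relation can be illustrated on a toy linear fast-slow system (an assumption for illustration, far simpler than the shallow-water equations): the fast variable collapses onto a manifold determined by the slow one.

```python
# Toy fast-slow system:  dx/dt = -x  (slow),  eps * dy/dt = -(y - x)  (fast).
# Expanding only the fast variable gives the slaving relation
#   y = x - eps * dy/dt  =>  y ~ x + eps * x + O(eps^2)  (using dx/dt = -x),
# and the exact slaved ratio for this system is y/x = 1/(1 - eps).
eps, dt = 0.05, 1e-4
x, y = 1.0, 0.0                              # start off the slow manifold
for _ in range(20000):                       # integrate to t = 2 (Euler)
    x, y = x + dt * (-x), y + dt * (-(y - x) / eps)

slaved_ratio = 1.0 / (1.0 - eps)
```

After the fast transient decays, the trajectory sits on the slaved manifold, and the first-order expansion y ~ (1 + eps) x approximates the exact ratio to O(eps^2), mirroring how the modified procedure approximates the slaving relation order by order.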

Relevance: 80.00%

Abstract:

Hotelling's (1929) principle of minimum differentiation and the alternative prediction that firms will maximally differentiate from their rivals in order to relax price competition have not been explicitly tested so far. We report results from experimental spatial duopolies designed to address this issue. The levels of product differentiation observed are systematically lower than predicted in equilibrium under risk neutrality and compatible with risk aversion. The observed prices are consistent with collusion attempts. Our main findings are robust to variations in three experimental conditions: automated vs. human market sharing rule for ties, individual vs. collective decision making, and even vs. odd number of locations.
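Hotelling's minimum-differentiation prediction can be sketched with best-response location dynamics at fixed, equal prices (a toy discretisation with 11 locations; the experiment in the paper additionally lets prices vary, so this only illustrates the location logic):

```python
def payoff(a, b, n=11):
    """Firm A's share of n uniformly spaced consumers (ties split evenly)."""
    share = 0.0
    for c in range(n):
        da, db = abs(c - a), abs(c - b)
        share += 1.0 if da < db else 0.5 if da == db else 0.0
    return share

# Best-response dynamics: each firm relocates to its most profitable
# position given the rival's current location.
a, b = 0, 10
for _ in range(10):
    a = max(range(11), key=lambda x: payoff(x, b))
    b = max(range(11), key=lambda x: payoff(x, a))
```

Starting from the endpoints, the firms leapfrog toward each other and settle at the centre, which is the minimum-differentiation outcome; relaxing the fixed-price assumption is what generates the competing maximal-differentiation prediction the abstract tests.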

Relevance: 80.00%

Abstract:

Paraconsistent logics are non-classical logics which allow non-trivial and consistent reasoning about inconsistent axioms. They have been proposed as a formal basis for handling inconsistent data, as commonly arise in human enterprises, and as methods for fuzzy reasoning, with applications in Artificial Intelligence and the control of complex systems. Formalisations of paraconsistent logics usually require heroic mathematical efforts to provide a consistent axiomatisation of an inconsistent system. Here we use transreal arithmetic, which is known to be consistent, to arithmetise a paraconsistent logic. This is theoretically simple and should lead to efficient computer implementations. We introduce the metalogical principle of monotonicity, which is a very simple way of making logics paraconsistent. Our logic has dialetheic truth values which are both False and True. It allows contradictory propositions and variable contradictions, but blocks literal contradictions. Thus literal reasoning, in this logic, forms an on-the-fly, syntactic partition of the propositions into internally consistent sets. We show how the set of all paraconsistent possible worlds can be represented in a transreal space. During the development of our logic we discuss how other paraconsistent logics could be arithmetised in transreal arithmetic.
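The characteristic feature of transreal arithmetic, totalised division with signed infinities and a nullity value, can be sketched using IEEE inf/nan as stand-ins (an assumption: transreal arithmetic is its own axiomatisation, and IEEE nan differs from nullity in some respects, e.g. nan != nan whereas nullity equals itself):

```python
import math

def trans_div(x, y):
    """Division total on all real inputs, in the transreal style:
    x/0 is +inf for x > 0, -inf for x < 0, and nullity for 0/0."""
    if y != 0:
        return x / y
    if x > 0:
        return math.inf
    if x < 0:
        return -math.inf
    return math.nan          # stand-in for nullity (0/0)
```

Because every input has a defined result, no division can raise an error or derail a derivation, which is the kind of consistency-by-totality that makes the arithmetisation of a paraconsistent logic "theoretically simple".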

Relevance: 80.00%

Abstract:

According to the principle of copyright exhaustion, once a copy of a work is placed on the market, the right holder's control over further distribution of that copy is exhausted. Unlike the distribution of hard copies of copyright works, however, the electronic dissemination of content is not subject to the exhaustion principle. This means that second-hand markets of digital goods cannot exist. Traditionally, exhaustion is premised on four assumptions that cannot be safely assumed in the online context: it applies to tangible copies only; it covers goods and not services; the goods should be sold but not licensed; and the property entitlement should be alienated upon transfer. After long jurisprudential silence, courts worldwide have revisited these normative impediments to affirm that exhaustion can apply online in specific instances. The article discusses the doctrinal norms that underpin exhaustion and determines the conditions under which online copyright exhaustion can apply.

Relevance: 80.00%

Abstract:

The old scholastic principle of the "convertibility" of being and goodness strikes nearly all moderns as either barely comprehensible or plain false. "Convertible" is a term of art meaning "interchangeable" in respect of predication, where the predicates can be exchanged salva veritate albeit not salva sensu: their referents are, as the maxim goes, really the same albeit conceptually different. The principle seems, at first blush, absurd. Did the scholastics literally mean that every being is good? Is that supposed to include a cancer, a malaria parasite, an earthquake that kills millions? If every being is good, then no being is bad—but how can that be? To the contemporary philosophical mind, such bafflement is understandable. It derives from the systematic dismantling of the great scholastic edifice that took place over half a millennium. With the loss of the basic concepts out of which that edifice was built, the space created by those concepts faded out of existence as well. The convertibility principle, like virtually all the other scholastic principles (not all, since some do survive and thrive in analytic philosophy), could not persist in a post-scholastic space wholly alien to it.