944 results for Non-relativistic scattering theory


Relevance:

30.00%

Abstract:

The objective of this project is to develop an optimization algorithm that, through a least-squares fit of the data, extracts the equivalent-circuit parameters making up the theoretical model of an FBAR resonator from measured S-parameters. To carry out this work, the necessary theory of FBAR resonators is developed first, starting with their operation and structure and paying special attention to the modelling of these resonators with the Mason, Butterworth-Van Dyke and Modified BVD models. Next, the theory of optimization and non-linear programming is reviewed. Once the theory has been presented, the implemented algorithm is described. This algorithm uses a multi-step strategy that speeds up the extraction of the resonator parameters.

Relevance:

30.00%

Abstract:

A growing literature integrates theories of debt management into models of optimal fiscal policy. One promising theory argues that the composition of government debt should be chosen so that fluctuations in the market value of debt offset changes in expected future deficits. This complete market approach to debt management is valid even when the government only issues non-contingent bonds. A number of authors conclude from this approach that governments should issue long-term debt and invest in short-term assets. We argue that the conclusions of this approach are too fragile to serve as a basis for policy recommendations. This is because bonds at different maturities have highly correlated returns, causing the determination of the optimal portfolio to be ill-conditioned. To make this point concrete, we examine the implications of this approach to debt management in various models, both analytically and using numerical methods calibrated to the US economy. We find that the complete market approach recommends asset positions which are huge multiples of GDP. Introducing persistent shocks or capital accumulation only worsens this problem. Increasing the volatility of interest rates through habits partly reduces the size of these positions. Across these simulations we find no presumption that governments should issue long-term debt: policy recommendations can easily be reversed through small perturbations in the specification of shocks or small variations in the maturity of the bonds issued. We further extend the literature by removing the assumption that governments costlessly repurchase all outstanding debt every period. This exacerbates the size of the required positions, worsens their volatility and in some cases produces instability in debt holdings. We conclude that it is very difficult to insulate fiscal policy from shocks by using the complete markets approach to debt management. 
Given the limited variability of the yield curve, using maturities is a poor way to substitute for state-contingent debt. As a result, the positions recommended by this approach conflict with a number of features that we believe are important in making bond markets incomplete, e.g. transaction costs and liquidity effects. Until these features are fully incorporated, we remain in search of a theory of debt management capable of providing robust policy insights.
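The ill-conditioning argument can be illustrated with a few lines of linear algebra. The covariance numbers below are invented for the illustration (they are not the paper's calibration): two bond returns with correlation 0.999 already produce enormous offsetting long-short positions, and a tiny change in the correlation roughly doubles them.

```python
import numpy as np

# hypothetical return covariance of a short and a long bond,
# almost perfectly correlated as in the text's argument
sigma = np.array([0.010, 0.012])
rho = 0.999
cov = np.array([[sigma[0]**2,           rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1]**2]])

# positions that deliver a unit exposure to each of two shocks
target = np.array([1.0, 1.0])
pos = np.linalg.solve(cov, target)

# the condition number of the covariance signals the ill-conditioning
cond = np.linalg.cond(cov)

# a perturbation of the correlation by 0.0005 changes the answer drastically
rho2 = 0.9995
cov2 = np.array([[sigma[0]**2,            rho2 * sigma[0] * sigma[1]],
                 [rho2 * sigma[0] * sigma[1], sigma[1]**2]])
pos2 = np.linalg.solve(cov2, target)
```

The solution is a huge long position in one bond financed by a huge short position in the other, and it is not robust to the specification, which is the fragility the abstract describes.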

Relevance:

30.00%

Abstract:

The first main result of the paper is a criterion for a partially commutative group G to be a domain. It allows us to reduce the study of algebraic sets over G to the study of irreducible algebraic sets, and reduce the elementary theory of G (of a coordinate group over G) to the elementary theories of the direct factors of G (to the elementary theory of coordinate groups of irreducible algebraic sets). Then we establish normal forms for quantifier-free formulas over a non-abelian directly indecomposable partially commutative group H. Analogously to the case of free groups, we introduce the notion of a generalised equation and prove that the positive theory of H has quantifier elimination and that arbitrary first-order formulas lift from H to H * F, where F is a free group of finite rank. As a consequence, the positive theory of an arbitrary partially commutative group is decidable.

Relevance:

30.00%

Abstract:

We propose an elementary theory of wars fought by fully rational contenders. Two parties play a Markov game that combines stages of bargaining with stages where one side has the ability to impose surrender on the other. Under uncertainty and incomplete information, long confrontations occur in the unique equilibrium of the game: war arises when reality disappoints initial (rational) optimism, and it persists longer when both agents are optimists but reality proves both wrong. Bargaining proposals that are rejected initially might eventually be accepted after several periods of confrontation. We provide an explicit computation of the equilibrium, evaluating the probability of war and its expected losses as a function of i) the costs of confrontation, ii) the asymmetry of the split imposed under surrender, and iii) the strengths of the contenders at attack and defense. Changes in these parameters display non-monotonic effects.

Relevance:

30.00%

Abstract:

We present a real data set of claim amounts in which costs related to damage are recorded separately from those related to medical expenses. Only claims with positive costs are considered here. Two approaches to density estimation are presented: a classical parametric method and a semi-parametric method based on transformation kernel density estimation. We explore the data set with standard univariate methods. We also propose ways to select the bandwidth and transformation parameters in the univariate case based on Bayesian methods. We indicate how to compare the results of alternative methods, both by looking at the shape of the density over its whole domain and by examining the density estimates in the right tail.
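The transformation idea can be sketched in a few lines. This is a simplified stand-in for the paper's method, using synthetic lognormal "claims" and a log transformation with a default Gaussian KDE, rather than the authors' transformation family or Bayesian bandwidth selection:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# hypothetical positive, right-skewed claim costs (lognormal stand-in)
claims = rng.lognormal(mean=7.0, sigma=1.2, size=500)

# transformation KDE sketch: estimate the density of log-costs,
# then map back to the original scale with the Jacobian 1/x
log_claims = np.log(claims)
kde = gaussian_kde(log_claims)          # bandwidth: Scott's rule by default

def density(x):
    x = np.asarray(x, dtype=float)
    return kde(np.log(x)) / x           # f_X(x) = f_Y(log x) / x

# sanity check: the back-transformed estimate carries unit probability mass
grid = np.linspace(claims.min(), claims.max(), 2000)
d = density(grid)
mass = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(grid))   # trapezoid rule
```

Transforming first tames the heavy right tail, which is exactly where a plain KDE on the original scale performs worst.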

Relevance:

30.00%

Abstract:

In this article, we present a new approach to Nekhoroshev theory for a generic unperturbed Hamiltonian which completely avoids small-divisor problems. The proof is an extension of a method introduced by P. Lochak, combining averaging along periodic orbits with simultaneous Diophantine approximation, and uses geometric arguments designed by the second author to handle generic integrable Hamiltonians. This method allows us to deal with generic non-analytic Hamiltonians and to obtain new results on generic stability around linearly stable tori.

Relevance:

30.00%

Abstract:

Concerns about the clustering of retail industries and professional services in main streets have traditionally been the public-interest rationale for supporting distance regulations. Although many geographic restrictions have been lifted, deregulation has hinged mostly on theoretical results about the natural tendency of outlets to differentiate spatially. Empirical evidence has so far offered mixed results. Using the case of the deregulation of pharmacy establishment in a region of Spain, we show empirically that pharmacy locations scatter, and that there is no rationale for distance regulation apart from the underlying private interest of a very few incumbents.
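Whether locations "scatter" can be quantified with a standard spatial statistic. The sketch below uses the Clark-Evans nearest-neighbour ratio (my choice of illustration; the abstract does not say which test the authors use) on a toy point pattern:

```python
import numpy as np

def clark_evans(points, area):
    """Clark-Evans nearest-neighbour ratio R.
    R ~ 1: spatially random; R > 1: dispersed (scattered); R < 1: clustered."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    # pairwise distances; ignore the zero self-distances on the diagonal
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    mean_nn = d.min(axis=1).mean()          # observed mean NN distance
    expected = 0.5 / np.sqrt(n / area)      # expectation under complete randomness
    return mean_nn / expected

# hypothetical check: a perfectly regular grid of "pharmacies" is dispersed
grid = [(x, y) for x in range(10) for y in range(10)]
r_grid = clark_evans(grid, area=100.0)      # unit spacing, assumed 10x10 study area
```

For a regular unit grid in a 10x10 area the ratio is exactly 2, near the theoretical maximum for dispersion; values significantly above 1 for real pharmacy coordinates would support the scattering finding.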

Relevance:

30.00%

Abstract:

In a retrospective study conducted in a rehabilitation unit, we examined the degree of adherence to clinical practice guidelines (CPGs) for the long-term pharmacological treatment of schizophrenia by physicians who know them only indirectly. The Expert Consensus Guideline for the Treatment of Schizophrenia (ECGTS) was chosen as the reference, based on a comparison with five other major CPGs. In a sample of 20 patients, the ECGTS recommendations were fully followed in 65% of cases, partially followed in 10% and not followed in 25%, showing that clinical practice clearly leaves room for improvement (mainly in the treatment of psychotic and depressive symptoms). However, following CPGs does not necessarily resolve all the clinical problems encountered: 12 of the 20 patients presented side effects at clinical evaluation, and for eight of them the relevant recommendations were being followed. Our study also shows that choosing and applying a CPG is not straightforward. Current CPGs provide few or no measurement instruments, and no precise criteria for assessing the clinical problems they address. The future therefore belongs to CPGs that offer, in addition to the clinical recommendations themselves, the means to verify and apply them in the field.

Relevance:

30.00%

Abstract:

Introduction. In autism and schizophrenia, attenuated/atypical functional hemispheric asymmetry and theory of mind impairments have been reported, suggesting common underlying neuroscientific correlates. We here investigated whether impaired theory of mind performance is associated with attenuated/atypical hemispheric asymmetry. An association may explain the co-occurrence of both dysfunctions in psychiatric populations. Methods. Healthy participants (n = 129) performed a left-hemisphere-dominant task (lateralised lexical decision), a right-hemisphere-dominant task (lateralised face decision), and a visual cartoon task to assess theory of mind performance. Results. Linear regression analyses revealed inconsistent associations between theory of mind performance and functional hemispheric asymmetry: enhanced theory of mind performance was only associated with (1) faster right-hemisphere language processing, and (2) reduced right-hemisphere dominance for face processing (men only). Conclusions. The majority of non-significant findings suggests that theory of mind and functional hemispheric asymmetry are unrelated. Instead of "overinterpreting" the two significant results, we discuss discrepancies in the previous literature relating to the problem of the theory of mind concept, the variety of tasks, and the lack of normative data. We also suggest how future studies could explore a possible link between hemispheric asymmetry and theory of mind.

Relevance:

30.00%

Abstract:

This review paper reports the consensus of a technical workshop hosted by the European network, NanoImpactNet (NIN). The workshop aimed to review the collective experience of working at the bench with manufactured nanomaterials (MNMs), and to recommend modifications to existing experimental methods and OECD protocols. Current procedures for cleaning glassware are appropriate for most MNMs, although interference with electrodes may occur. Maintaining exposure is more difficult with MNMs than with conventional chemicals. A metal salt control is recommended for experiments with metallic MNMs that may release free metal ions. Dispersing agents should be avoided, but if they must be used, natural or synthetic dispersing agents are possible, and dispersion controls are essential. Time constraints and technology gaps indicate that full characterisation of test media during ecotoxicity tests is currently not practical. Details of electron microscopy, dark-field microscopy, a range of spectroscopic methods (EDX, XRD, XANES, EXAFS), light scattering techniques (DLS, SLS) and chromatography are discussed. The development of user-friendly software to predict particle behaviour in test media according to DLVO theory is in progress, and simple optical methods are available to estimate the settling behaviour of suspensions during experiments. However, for soil matrices such simple approaches may not be applicable. Alternatively, a Critical Body Residue approach may be taken, in which body concentrations in organisms are related to effects and toxicity thresholds derived. For microbial assays, the cell wall is a formidable barrier to MNMs, and end points that rely on the test substance penetrating the cell may be insensitive. Instead, assays based on the cell envelope should be developed for MNMs. In algal growth tests, the abiotic factors that promote particle aggregation in the media (e.g. ionic strength) are also important in providing nutrients, and manipulation of the media to control the dispersion may also inhibit growth. Controls to quantify shading effects, and precise details of lighting regimes, shaking or mixing, should be reported in algal tests. Photosynthesis may be more sensitive than traditional growth end points for algae and plants. Tests with invertebrates should consider non-chemical toxicity from particle adherence to the organisms. The use of semi-static exposure methods with fish can reduce the logistical issues of waste-water disposal and facilitate aspects of animal husbandry relevant to MNMs. There are concerns that the existing bioaccumulation tests are conceptually flawed for MNMs and that new tests are required. In vitro testing strategies, as exemplified by genotoxicity assays, can be modified for MNMs, but the risk of false negatives in some assays is highlighted. In conclusion, most protocols will require some modifications, and recommendations are made to aid the researcher at the bench.
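The DLVO prediction mentioned above boils down to summing an attractive van der Waals term and a repulsive electric-double-layer term. The sketch below uses the textbook weak-overlap expressions for two equal spheres; every parameter value (radius, Hamaker constant, surface potential, Debye length) is an illustrative assumption, not data from the workshop:

```python
import numpy as np

kB, T = 1.380649e-23, 298.0           # Boltzmann constant, room temperature
eps = 78.5 * 8.8541878128e-12         # permittivity of water [F/m]
a = 50e-9                             # particle radius [m] (assumed)
A_H = 1e-20                           # Hamaker constant [J] (assumed)
psi = 25e-3                           # surface potential [V] (assumed)
kappa = 1.0 / 10e-9                   # inverse Debye length [1/m] (assumed)

def dlvo_energy(h):
    """Interaction energy [J] of two equal spheres at surface separation h,
    valid for h << a and weak double-layer overlap."""
    v_vdw = -A_H * a / (12.0 * h)                               # attraction
    v_edl = 2 * np.pi * eps * a * psi**2 * np.exp(-kappa * h)   # repulsion
    return v_vdw + v_edl

h = np.linspace(0.2e-9, 30e-9, 500)
v_kt = dlvo_energy(h) / (kB * T)      # energy profile in units of kT
barrier = v_kt.max()                  # height of the repulsive barrier
```

A barrier of tens of kT, as these illustrative numbers give, means the suspension is kinetically stable; raising the ionic strength increases kappa, lowers the barrier, and promotes the aggregation the protocols must control for.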

Relevance:

30.00%

Abstract:

Particle physics studies highly complex processes which cannot be directly observed. Scientific realism claims that we are nevertheless warranted in believing that these processes really occur and that the objects involved in them really exist. This dissertation defends a version of scientific realism, called causal realism, in the context of particle physics. I start by introducing the central theses and arguments in the recent philosophical debate on scientific realism (chapter 1), with a special focus on an important presupposition of the debate, namely common sense realism. Chapter 2 then discusses entity realism, which introduces a crucial element into the debate by emphasizing the importance of experiments in defending scientific realism. Most of the chapter is concerned with Ian Hacking's position, but I also argue that Nancy Cartwright's version of entity realism is ultimately preferable as a basis for further development. In chapter 3, I take a step back and consider the question whether the realism debate is worth pursuing at all. Arthur Fine has given a negative answer to that question, proposing his natural ontological attitude as an alternative to both realism and antirealism. I argue that the debate (in particular the realist side of it) is in fact less vicious than Fine presents it. The second part of my work (chapters 4-6) develops, illustrates and defends causal realism. The key idea is that inference to the best explanation is reliable in some cases, but not in others. Chapter 4 characterizes the difference between these two kinds of cases in terms of three criteria which distinguish causal from theoretical warrant. In order to flesh out this distinction, chapter 5 then applies it to a concrete case from the history of particle physics, the discovery of the neutrino. This case study shows that the distinction between causal and theoretical warrant is crucial for understanding what it means to "directly detect" a new particle. 
But the distinction is also an effective tool against what I take to be the presently most powerful objection to scientific realism: Kyle Stanford's argument from unconceived alternatives. I respond to this argument in chapter 6, and I illustrate my response with a discussion of Jean Perrin's experimental work concerning the atomic hypothesis. In the final part of the dissertation, I turn to the specific challenges posed to realism by quantum theories. One of these challenges comes from the experimental violations of Bell's inequalities, which indicate a failure of locality in the quantum domain. I show in chapter 7 how causal realism can further our understanding of quantum non-locality by taking account of some recent experimental results. Another challenge to realism in quantum mechanics comes from delayed-choice experiments, which seem to imply that certain aspects of what happens in an experiment can be influenced by later choices of the experimenter. Chapter 8 analyzes these experiments and argues that they do not warrant the antirealist conclusions which some commentators draw from them. It pays particular attention to the case of delayed-choice entanglement swapping and the corresponding question whether entanglement is a real physical relation. In chapter 9, I finally address relativistic quantum theories. It is often claimed that these theories are incompatible with a particle ontology, and this calls into question causal realism's commitment to localizable and countable entities. I defend the commitments of causal realism against these objections, and I conclude with some remarks connecting the interpretation of quantum field theory to more general metaphysical issues confronting causal realism.
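The Bell-inequality violations mentioned above have a compact quantitative core. For the spin singlet, quantum mechanics predicts the correlation E(a, b) = -cos(a - b) between measurements along directions a and b, and the standard CHSH combination of four such correlations reaches 2*sqrt(2), beyond the local-realist bound of 2:

```python
import numpy as np

def E(a, b):
    """Singlet-state correlation for measurement angles a and b."""
    return -np.cos(a - b)

def chsh(a, a2, b, b2):
    """CHSH combination; local realism requires |S| <= 2."""
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# standard angle choice that maximises the quantum violation
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4
S = chsh(a, a2, b, b2)   # |S| = 2*sqrt(2), the Tsirelson bound
```

This is the textbook calculation, included only to make the "failure of locality" concrete; the dissertation's discussion concerns its interpretation, not its derivation.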

Relevance:

30.00%

Abstract:

In this paper we propose a simple and general model for computing the Ramsey optimal inflation tax, which includes several models from the previous literature as special cases. We show that it cannot be claimed that the Friedman rule is always optimal (or always non-optimal) on theoretical grounds. The Friedman rule is optimal or not depending on conditions related to the shape of various relevant functions. One contribution of this paper is to relate these conditions to measurable variables such as the interest rate or the consumption elasticity of money demand. We find that it tends to be optimal to tax money when there are economies of scale in the demand for money (the scale elasticity is smaller than one) and/or when money is required for the payment of consumption or wage taxes. We find that it tends to be optimal to tax money more heavily when the interest elasticity of money demand is small. We present empirical evidence on the parameters that determine the optimal inflation tax. Calibrating the model to a variety of empirical studies yields an optimal nominal interest rate of less than 1%/year, although that finding is sensitive to the calibration.
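The finding that a low interest elasticity favours a heavier inflation tax echoes the textbook inverse-elasticity rule. The sketch below is my own illustration of that rule under strong simplifying assumptions (independent constant-elasticity demands), not the paper's model: Ramsey taxes are then proportional to the inverse of each demand elasticity.

```python
import numpy as np

def ramsey_taxes(etas, scale=0.01):
    """Inverse-elasticity rule sketch: tax rate t_i proportional to 1/eta_i.
    `scale` pins down the overall revenue requirement (illustrative)."""
    etas = np.asarray(etas, dtype=float)
    return scale / etas

# hypothetical elasticities: money demand much less interest-elastic
# than consumption demand is price-elastic
eta_money, eta_cons = 0.2, 1.0
t_money, t_cons = ramsey_taxes([eta_money, eta_cons])
```

With these invented numbers the money tax is five times the consumption tax, reproducing the qualitative comparative static of the abstract; the paper's actual conditions are richer and depend on the shape of the relevant functions.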

Relevance:

30.00%

Abstract:

We consider an economy where the production technology has constant returns to scale but where the decentralized equilibrium exhibits aggregate increasing returns to scale. The result follows from a positive contracting externality among firms: if a firm is surrounded by more firms, employees have more opportunities outside their own firm. This improves employees' incentives to invest in the presence of ex post renegotiation at the firm level, at no cost. Our leading result is that if a region is sparsely populated, or if the degree of development in the region is low enough, there are multiple equilibria in the level of sectoral employment. From the theoretical model we derive a non-linear first-order censored difference equation for sectoral employment. Our results are strongly consistent with the multiple-equilibria hypothesis and the existence of a sectoral critical scale (below which the sector follows a delocation process). The scale of the region's population and the degree of development reduce the critical scale of the sector.
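The censored difference equation and the critical scale can be illustrated with a toy map. The S-shaped law of motion below is my own invented example (the paper derives its map from the contracting model): employment follows n_{t+1} = max(0, f(n_t)), and initial conditions below a critical scale delocate to zero while larger ones converge to a positive equilibrium.

```python
def f(n):
    """Illustrative S-shaped law of motion: agglomeration makes growth
    depend positively on current sectoral scale, minus a fixed drain."""
    return 3.0 * n**2 / (1.0 + n**2) - 0.05

def iterate(n0, steps=200):
    """Censored first-order difference equation n_{t+1} = max(0, f(n_t))."""
    n = n0
    for _ in range(steps):
        n = max(0.0, f(n))
    return n

low = iterate(0.1)    # starts below the critical scale: delocation to zero
high = iterate(1.0)   # starts above it: converges to a positive equilibrium
```

With these assumed coefficients the unstable fixed point (the critical scale) sits near 0.43, so the two trajectories end at the two stable equilibria, which is the multiple-equilibria pattern the abstract describes.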

Relevance:

30.00%

Abstract:

Neural signatures of humans' movement intention can be exploited by future neuroprostheses. We propose a method for detecting self-paced upper-limb movement intention from brain signals acquired with both invasive and noninvasive methods. In the first study, with scalp electroencephalography (EEG) signals from healthy controls, we report single-trial detection of movement intention using movement-related potentials (MRPs) in a frequency range between 0.1 and 1 Hz. Movement intention can be detected above chance level (p < 0.05) on average 460 ms before movement onset, with a low detection rate during periods without movement intention. Using intracranial EEG (iEEG) from one epileptic subject, we detect movement intention as early as 1500 ms before movement onset with accuracy above 90%, using electrodes implanted in the bilateral supplementary motor area (SMA). The consistent results obtained with noninvasive and invasive methods, and their generalization across different days of recording, strengthen the view that self-paced movement intention can be detected before movement initiation, supporting advances in robot-assisted neurorehabilitation.
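The 0.1-1 Hz MRP band can be isolated with an ordinary band-pass filter. The sketch below shows only this preprocessing step on a synthetic signal; the sampling rate and the signal itself are assumptions, and the paper's actual detector (the classification stage) is not reproduced here:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 256.0                                        # assumed sampling rate [Hz]
t = np.arange(0, 10, 1 / fs)

rng = np.random.default_rng(1)
slow_mrp = np.sin(2 * np.pi * 0.4 * t)            # slow in-band stand-in for an MRP
noise = rng.normal(scale=1.0, size=t.size)        # broadband noise

# 4th-order Butterworth band-pass, 0.1-1 Hz; second-order sections are
# numerically safer than (b, a) at such low normalized cutoffs
sos = butter(4, [0.1, 1.0], btype='bandpass', fs=fs, output='sos')
filtered = sosfiltfilt(sos, slow_mrp + noise)     # zero-phase filtering

corr = np.corrcoef(filtered, slow_mrp)[0, 1]      # slow component survives
```

Zero-phase filtering matters in this application: a causal filter would delay the MRP and eat into the pre-movement lead time the detection relies on (an online system would need a causal approximation).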

Relevance:

30.00%

Abstract:

In recent years there has been explosive growth in the development of adaptive and data-driven methods. One efficient data-driven approach is based on statistical learning theory (SLT) (Vapnik 1998). The theory is based on the Structural Risk Minimisation (SRM) principle and has a solid statistical background. When applying SRM we try not only to reduce the training error (to fit the available data with a model) but also to reduce the complexity of the model, and thereby the generalisation error. Many nonlinear learning procedures recently developed in neural networks and statistics can be understood and interpreted in terms of the structural risk minimisation inductive principle. A recent methodology based on SRM is called Support Vector Machines (SVM). At present SLT is still under intensive development and SVM are finding new areas of application (www.kernel-machines.org). SVM develop robust, nonlinear data models with excellent generalisation abilities, which is very important for both monitoring and forecasting. SVM are extremely good when the input space is high-dimensional and the training data set is not big enough to develop a corresponding nonlinear model. Moreover, SVM use only support vectors to derive decision boundaries. This opens a way to sampling optimization, estimation of noise in data, quantification of data redundancy, etc. A presentation of SVM for spatially distributed data is given in (Kanevski and Maignan 2004).
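A minimal SVM example in the spirit of the text, with scikit-learn standing in for the tools of the period and a synthetic two-moons data set as an assumed stand-in for spatial data: an RBF-kernel SVC learns a nonlinear decision boundary while retaining only a subset of the training points as support vectors.

```python
from sklearn.svm import SVC
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split

# nonlinearly separable toy data (illustrative stand-in)
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# RBF-kernel SVM: C trades training error against model complexity,
# which is the SRM trade-off described in the text
clf = SVC(kernel='rbf', C=1.0, gamma='scale').fit(X_tr, y_tr)

accuracy = clf.score(X_te, y_te)
n_sv = clf.support_vectors_.shape[0]   # only these points define the boundary
```

That the boundary depends only on the support vectors is what opens the door to the sampling-optimization and redundancy-quantification uses the text mentions: the remaining training points could be dropped without changing the model.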