949 results for Symbolic Computation


Relevance:

10.00%

Publisher:

Abstract:

Throughout history, nuclear weapons have been considered the ultimate weapons. This understanding largely detached them from the portfolio of conventional military means and assigned them a symbolic meaning that influenced nations' identity and norm creation. In most countries today, the development of nuclear weapons is considered morally proscribed, incompatible with the country's identity and international outlook. In some states, however, these negative norms are overridden by a positive set of norms, causing nuclear weapons to become either symbols of invulnerability to perceived threats or the regalia of major-power status. The main purpose of this paper is to explore the conditions that cause most states to develop a moral aversion to nuclear weapons, yet effectively lead to their glorification in others. Many studies on the normative understanding of nuclear weapons consider the existence of a negative normative predisposition, often referred to as the 'nuclear taboo', to be a major factor in preventing their acquisition and use. Other studies acknowledge the existence of a nuclear taboo inhibiting the use of nuclear weapons, but point to the opposing effect of norms, frequently referred to as the 'nuclear myth', when it comes to their acquisition. This myth emerges when certain symbolic meanings are attached to nuclear weapons, such as a state's identity, self-image, and its desired position in the international system. With some 180 countries in the world abstaining from the acquisition of nuclear weapons and eight countries in possession of them (with two further countries assumed to have pursued their acquisition), one might consider the dominance of the nuclear taboo over the nuclear myth to be the rule. The core question is thus why and how this relationship is reversed in the case of defectors.

Abstract:

Two claims pervade the literature on the political economy of market reforms: that economic crises cause reforms, and that crises matter because they call into question the validity of the economic model held responsible for them. Economic crises are said to spur a process of learning that is conducive to the abandonment of failing models and the adoption of successful ones. Although these claims have become the conventional wisdom, they have hardly been tested empirically, owing to the lack of agreement on what constitutes a crisis and to difficulties in measuring learning. I propose a model of rational learning from experience and apply it to the decision to open the economy. Using data from 1964 through 1990, I show that learning from the 1982 debt crisis was relevant to the first wave of adoption of an export promotion strategy, but that learning was conditional on the high variability of economic outcomes in countries that opened up to trade. Learning was also symbolic, in that the sheer number of other countries that liberalized was a more important driver of the decisions of others to follow suit.
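The abstract's model of rational learning from experience is not spelled out here, but the basic mechanism it invokes can be sketched as Bayesian updating: a government holds a belief about the probability that opening the economy succeeds and revises it after observing outcomes in other countries. Everything below (the beta-binomial form, the prior, the observed counts, the adoption threshold) is an illustrative assumption, not the paper's specification.

```python
# Hedged sketch: beta-binomial Bayesian learning about a reform's success rate.
# Prior, observations and decision threshold are illustrative assumptions.

def update_belief(alpha, beta, successes, failures):
    """Conjugate Beta(alpha, beta) update after observing other countries' outcomes."""
    return alpha + successes, beta + failures

def expected_success_rate(alpha, beta):
    """Posterior mean of the success probability under a Beta posterior."""
    return alpha / (alpha + beta)

# Start from a sceptical prior about liberalization.
alpha, beta = 1.0, 3.0
# Observe a wave of liberalizers: 8 apparent successes, 2 failures.
alpha, beta = update_belief(alpha, beta, successes=8, failures=2)

belief = expected_success_rate(alpha, beta)
adopt = belief > 0.5  # adopt the reform once the belief crosses a threshold
```

The "symbolic" learning the abstract describes would correspond to updating on the sheer count of adopters rather than on their measured outcomes.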

Abstract:

The project "Anàlisi del sistema operatiu RTLinux i implementació d'un entorn de desenvolupament de tasques en temps real" analyses the possibility of creating a development environment for real-time tasks, so that complex control systems can be built entirely with free software. The work begins with a study of the concept of real time; the real-time operating system RTLinux is then chosen as the basis for the development environment, which is built using the Tcl/Tk programming language. A set of applications (for computational control) is created to study the feasibility of building the desired environment and to ease the work of the end user. The project opens up many possible lines of future work: remote communication, implementation of schedulers, study of device drivers, and so on.

Abstract:

The goal of this project is to take part in the RSA Laboratories challenge to break the proposed RC5-32/12/9 cryptosystem. The chosen approach is a brute-force attack using distributed computing, and more specifically Public Resource Computing. The selected platform is the Berkeley Open Infrastructure for Network Computing (BOINC), well known for its use in large projects such as SETI@home. The project sets up the infrastructure and develops the applications needed to start the computations that should allow the cryptosystem to be broken.
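The project's own BOINC applications are not shown in the abstract, but the overall shape of a public-resource brute-force attack (partition the keyspace into work units, have each client exhaust its range against a known plaintext/ciphertext pair) can be sketched as follows. A toy XOR cipher stands in for RC5-32/12/9, and all names and sizes are illustrative assumptions.

```python
# Hedged sketch of a brute-force key search split into BOINC-style work units.
# A toy XOR "cipher" stands in for RC5-32/12/9; keyspace size and helper names
# are illustrative assumptions, not the project's actual code.

def toy_encrypt(plaintext: bytes, key: int) -> bytes:
    """Stand-in cipher: XOR every byte with the low byte of the key."""
    return bytes(b ^ (key & 0xFF) for b in plaintext)

def work_units(keyspace_size: int, unit_size: int):
    """Partition [0, keyspace_size) into contiguous ranges (one per client)."""
    for start in range(0, keyspace_size, unit_size):
        yield start, min(start + unit_size, keyspace_size)

def search_unit(start: int, end: int, plaintext: bytes, ciphertext: bytes):
    """Try every key in one work unit; return the matching key, if any."""
    for key in range(start, end):
        if toy_encrypt(plaintext, key) == ciphertext:
            return key
    return None

secret_key = 0x2A
plaintext = b"The unknown message is:"
ciphertext = toy_encrypt(plaintext, secret_key)

found = None
for start, end in work_units(keyspace_size=256, unit_size=64):
    found = search_unit(start, end, plaintext, ciphertext)
    if found is not None:
        break
```

In a real BOINC deployment the server issues each `(start, end)` range as a work unit and validates returned results; the client-side loop is the only computational kernel.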

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach, but it requires computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each program. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by these tools varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to adapt dosage from a measured blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential, while being less sophisticated or less user-friendly.
Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be assessed against the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including by non-experienced users. Computer-assisted TDM is attracting growing interest and should improve further, especially in terms of information-system interfacing, user-friendliness, data storage capability and report generation.
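As a rough illustration of the Bayesian (a posteriori) adjustment these tools perform, the sketch below computes a maximum a posteriori estimate of a patient's clearance in a one-compartment IV bolus model from a single measured concentration. The model, the population parameters (`CL_pop`, `omega`, `sigma`) and the grid search are illustrative assumptions, not any surveyed program's algorithm.

```python
import math

# Hedged sketch of the Bayesian (MAP) step behind TDM dose individualization.
# One-compartment IV bolus model C(t) = (dose/V) * exp(-(CL/V) * t); the
# population values, error model and grid search are illustrative assumptions.

def predicted_conc(dose, V, CL, t):
    """Concentration at time t for dose (mg), volume V (L), clearance CL (L/h)."""
    return (dose / V) * math.exp(-(CL / V) * t)

def neg_log_posterior(CL, dose, V, t, obs, CL_pop=5.0, omega=0.3, sigma=0.5):
    # Lognormal prior around the population clearance CL_pop,
    # additive Gaussian residual error on the observed concentration.
    prior = (math.log(CL / CL_pop) ** 2) / (2 * omega ** 2)
    lik = (obs - predicted_conc(dose, V, CL, t)) ** 2 / (2 * sigma ** 2)
    return prior + lik

def map_clearance(dose, V, t, obs):
    """Grid-search the MAP clearance (a real tool would use an optimizer)."""
    grid = [0.5 + 0.01 * i for i in range(2000)]  # CL candidates in L/h
    return min(grid, key=lambda CL: neg_log_posterior(CL, dose, V, t, obs))

# A posteriori adjustment: estimate CL from one trough level, then re-dose.
CL_hat = map_clearance(dose=1000.0, V=30.0, t=12.0, obs=5.0)
```

The MAP clearance sits between the value implied by the measurement alone and the population value, weighted by the prior and residual variances; the individualized dose is then derived from `CL_hat`.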

Abstract:

Sound localization relies on the analysis of interaural time and intensity differences, as well as on the attenuation patterns produced by the outer ear. We investigated the relative contributions of interaural time and intensity difference cues to sound localization by testing 60 subjects, among them 25 with focal left and 25 with focal right hemispheric brain damage. Group and single-case behavioural analyses, as well as anatomo-clinical correlations, confirmed that deficits were more frequent and much more severe after right than after left hemispheric lesions, and for the processing of interaural time rather than intensity difference cues. For spatial processing based on interaural time difference cues, different error types were evident in the individual data. Deficits in discriminating between neighbouring positions occurred in both hemispaces after focal right hemispheric brain damage, but were restricted to the contralesional hemispace after focal left hemispheric brain damage. Alloacusis (perceptual shifts across the midline) occurred only after focal right hemispheric brain damage and was associated with minor or severe deficits in position discrimination. During spatial processing based on interaural intensity cues, deficits were less severe in the right hemispheric brain damage group than in the left hemispheric brain damage group, and no alloacusis occurred. These results, matched to anatomical data, suggest the existence of a binaural sound localization system predominantly based on interaural time difference cues and primarily supported by the right hemisphere. More generally, our data suggest that two distinct mechanisms contribute to: (i) the precise computation of spatial coordinates, allowing spatial comparison within the contralateral hemispace for the left hemisphere and the whole space for the right hemisphere; and (ii) the building up of global auditory spatial representations in right temporo-parietal cortices.
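The interaural time difference cue at the centre of this study can be illustrated computationally: it is the lag that maximizes the cross-correlation between the two ear signals. The toy click signals and the sample-based lag below are illustrative assumptions.

```python
# Hedged sketch of the interaural time difference (ITD) cue: the lag that
# maximizes the cross-correlation between left- and right-ear signals.
# The toy click stimuli and the 3-sample delay are illustrative assumptions.

def cross_correlate_lag(left, right, max_lag):
    """Return the lag (in samples) of `right` relative to `left` that
    maximizes their cross-correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(l * right[i + lag]
                    for i, l in enumerate(left)
                    if 0 <= i + lag < len(right))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# A click that reaches the right ear 3 samples later (sound from the left side).
left = [0.0] * 20
left[5] = 1.0
right = [0.0] * 20
right[8] = 1.0  # same click, delayed by 3 samples

itd_samples = cross_correlate_lag(left, right, max_lag=10)
```

At a typical audio sampling rate, a lag of a few samples corresponds to the sub-millisecond ITDs the auditory system actually resolves.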

Abstract:

We propose an elementary theory of wars fought by fully rational contenders. Two parties play a Markov game that combines stages of bargaining with stages in which one side has the ability to impose surrender on the other. Under uncertainty and incomplete information, long confrontations occur in the unique equilibrium of the game: war arises when reality disappoints initial (rational) optimism, and it persists longer when both agents are optimists but reality proves both wrong. Bargaining proposals that are rejected initially may eventually be accepted after several periods of confrontation. We provide an explicit computation of the equilibrium, evaluating the probability of war and its expected losses as a function of (i) the costs of confrontation, (ii) the asymmetry of the split imposed under surrender, and (iii) the strengths of the contenders in attack and defense. Changes in these parameters display non-monotonic effects.

Abstract:

Purpose: Organ transplantation is a biological and psychological challenge, and graft acceptance is an important achievement for patients. Patients' concerns about the deceased donor and the organ may contribute to this process. Method: Forty-seven patients involved in heart (N=9), liver (N=8), lung (N=14) and kidney (N=16) transplantation participated in IRB-approved longitudinal semi-structured interviews: (T1) when registered on the waiting list, (T2) six months after transplantation, and (T3) twelve months after transplantation. Qualitative pattern analysis (QUAPA) was carried out on the verbatim transcripts, and concerns about the donor and the organ were then analysed. Results: - Donor representation: at T1, patients were reluctant to talk about the donor: 27% expressed guilt and 19% accepted the clause of anonymity. At T2, intense emotions were associated with reminiscing about the donor, and 45% highlighted the generosity of his/her act. In addition, heart, lung and kidney recipients were concerned about the donor's identity: 42% challenged the clause of anonymity. Liver recipients complained about anonymity but could nevertheless cope with it. At T3, 47% of heart, lung and kidney recipients thought daily of the donor and 33% were still looking for information about him/her. Liver recipients rarely had thoughts about the donor. - Organ representation: at T1, organ descriptions were biomedical (49% of the interviewees); more rarely, and mainly among heart candidates, patients referred to the symbolic meaning of the organ. After transplantation (T2-T3), function was emphasized. Acceptance and organ integration were associated with post-operative outcomes (23%) and psychological well-being (45%). Some patients (32%) inferred the donor's personality from the organ's quality and felt privileged to have received an organ in such a good state.
Conclusion: Donor representations should be explored during the transplantation process, as they play an important role in the psychological acceptance of the graft.

Abstract:

This paper proposes full-Bayes priors for time-varying parameter vector autoregressions (TVP-VARs) that are more robust and objective than existing choices proposed in the literature. We formulate the priors so that they allow for straightforward posterior computation, require minimal input from the user, and result in shrinkage posterior representations, thus making them appropriate for models of large dimensions. A comprehensive forecasting exercise involving TVP-VARs of different dimensions establishes the usefulness of the proposed approach.
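The paper's priors are not reproduced in the abstract, but the model class itself, a VAR whose coefficients drift as a random walk, can be sketched by simulation. Dimensions, variances and the VAR order below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of the model class the paper works with: a TVP-VAR(1) whose
# coefficient matrix follows a random walk,
#   y_t = B_t y_{t-1} + e_t,   vec(B_t) = vec(B_{t-1}) + u_t.
# Dimensions and variances are illustrative assumptions; the paper's priors
# themselves are not reproduced here.

rng = np.random.default_rng(0)
n, T = 2, 200                      # number of variables, sample length
B = 0.5 * np.eye(n)                # initial coefficient matrix
y = np.zeros(n)
ys, Bs = [], []

for t in range(T):
    B = B + 0.01 * rng.normal(size=(n, n))   # random-walk drift in coefficients
    y = B @ y + rng.normal(size=n)           # VAR(1) transition
    ys.append(y.copy())
    Bs.append(B.copy())

ys = np.array(ys)   # (T, n) simulated data a TVP-VAR would be fitted to
```

Estimating the path of `B_t` from `ys` alone is what makes shrinkage priors essential: the number of drifting coefficients grows with the square of the dimension.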

Abstract:

This work analyses the performance of four shared-memory multiprocessor compute nodes solving the N-body problem. The serial algorithm is parallelized and coded in C extended with OpenMP. The result is two variants that follow two different optimization criteria: minimizing memory requirements and minimizing the volume of computation. A performance analysis of the program on the compute nodes is then carried out. The performance of the sequential and parallel variants of the application, and of the compute nodes themselves, is modelled; the programs are instrumented and executed to obtain results in the form of several metrics; finally, the results are presented and interpreted, providing insights that explain performance inefficiencies and bottlenecks, as well as possible lines of improvement. The experience from this particular study has made it possible to outline an incipient methodology for performance analysis, problem identification, and tuning of algorithms to shared-memory multiprocessor compute nodes.
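The study's two optimization criteria can be illustrated in miniature (here in Python rather than the study's C/OpenMP): a memory-lean variant that recomputes every ordered pair of interactions, and a compute-lean variant that evaluates each unordered pair once and applies Newton's third law. Units (G = 1) and the particle data are illustrative assumptions.

```python
# Hedged sketch of the two optimization criteria the study contrasts:
# a memory-lean O(N^2) variant that recomputes every ordered pair, and a
# compute-lean variant that exploits force symmetry (Newton's third law)
# to halve the pair interactions. G = 1; particle data are illustrative.

def forces_recompute(pos, mass):
    """Memory-lean: loop over all ordered pairs, no intermediate storage."""
    n = len(pos)
    F = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx, dy = pos[j][0] - pos[i][0], pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            F[i][0] += mass[i] * mass[j] * dx / r3
            F[i][1] += mass[i] * mass[j] * dy / r3
    return F

def forces_symmetric(pos, mass):
    """Compute-lean: evaluate each unordered pair once, apply F_ji = -F_ij."""
    n = len(pos)
    F = [[0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            dx, dy = pos[j][0] - pos[i][0], pos[j][1] - pos[i][1]
            r3 = (dx * dx + dy * dy) ** 1.5
            fx = mass[i] * mass[j] * dx / r3
            fy = mass[i] * mass[j] * dy / r3
            F[i][0] += fx; F[i][1] += fy
            F[j][0] -= fx; F[j][1] -= fy
    return F

pos = [[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]]
mass = [1.0, 2.0, 3.0]
```

In the OpenMP setting the symmetric variant halves the arithmetic but introduces write conflicts on `F[j]`, which is exactly the kind of trade-off the study's performance models capture.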

Abstract:

The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and of their dependence on the assumptions made. The compared methodologies were: two iterative single-orientation methods minimizing the l2 or l1 total-variation norm of prior knowledge about the edges of the object; one over-determined multiple-orientation method (COSMOS); and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in-vivo high-resolution (0.65 mm isotropic) brain data acquired at 7 T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically varied in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corr_MCF = 0.95, r²_MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of an extremely fast computation time. The L-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. R2* and susceptibility maps, when calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.
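As a hedged illustration of what a closed-form QSM reconstruction does, the sketch below uses thresholded k-space division (TKD), a classic baseline dipole inversion; it is not the paper's MCF method. Grid size, threshold and the simulated inclusion are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of closed-form dipole inversion for QSM via thresholded
# k-space division (TKD), a classic baseline and NOT the paper's MCF method.
# Grid size, threshold and the simulated inclusion are illustrative assumptions.

def dipole_kernel(shape):
    """Unit dipole kernel D(k) = 1/3 - kz^2 / |k|^2 in k-space (B0 along z)."""
    kx, ky, kz = np.meshgrid(*(np.fft.fftfreq(n) for n in shape), indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(invalid="ignore", divide="ignore"):
        D = 1.0 / 3.0 - kz**2 / k2
    D[0, 0, 0] = 0.0
    return D

def tkd_inversion(field, threshold=0.1):
    """Invert field = F^-1{ D * F{chi} } by dividing in k-space, clipping
    the ill-conditioned region where |D| < threshold."""
    D = dipole_kernel(field.shape)
    D_inv = np.where(np.abs(D) < threshold,
                     np.sign(D) / threshold,
                     1.0 / np.where(D == 0, np.inf, D))
    return np.real(np.fft.ifftn(np.fft.fftn(field) * D_inv))

# Forward-simulate the field of a small susceptibility inclusion, then invert.
chi_true = np.zeros((32, 32, 32))
chi_true[14:18, 14:18, 14:18] = 1.0
D = dipole_kernel(chi_true.shape)
field = np.real(np.fft.ifftn(D * np.fft.fftn(chi_true)))
chi_est = tkd_inversion(field, threshold=0.1)
```

The clipped conical region around the magic angle is precisely the ill-posedness that the iterative and multiple-orientation methods in the study address in more principled ways.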

Abstract:

This project deals with the optimization and implementation of the acquisition stage of a GPS receiver. It also includes a brief review of the GPS system and its operating principles. The acquisition process has been studied in detail and programmed in the Matlab and Simulink environments. Implementing this stage in two different environments has been very useful both as a learning exercise and as a means of checking the results obtained. The main objective of the work is the design of a Simulink model capable of acquiring a signal captured with real hardware. In fact, two implementations were made: one using native Simulink blocks and another using blocks from the Xilinx library. The latter would later ease the transition of the model to an FPGA using the Xilinx ISE environment. The implementation of the acquisition stage is based on the parallel code-phase search method, which performs cross-correlation by means of the fast Fourier transform (FFT). This process requires two forward transforms (for the incoming signal and the reference code) and one inverse Fourier transform (for the correlation result). To optimize the design, a single FFT block is used, since three blocks would consume a large share of an FPGA's resources. Instead of replicating the FFT block, the model time-shares it through buffers and switches; as a result, the amount of resources required for an FPGA implementation could be reduced considerably.
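The parallel code-phase search method described above can be sketched in a few lines: the circular cross-correlation between the incoming signal and the local code replica is computed as IFFT(FFT(signal) * conj(FFT(code))), and the correlation peak gives the code phase. A random ±1 sequence stands in for a real GPS C/A code; the length and phase below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of parallel code-phase search: circular cross-correlation of
# the incoming signal with a local code replica via FFT,
#   corr = IFFT( FFT(signal) * conj(FFT(code)) ).
# A random +/-1 sequence stands in for a real GPS C/A code.

rng = np.random.default_rng(1)
code = rng.choice([-1.0, 1.0], size=1023)   # placeholder for a 1023-chip code
true_phase = 350
signal = np.roll(code, true_phase)          # received code, shifted in phase

# Two forward FFTs (signal and code) and one inverse FFT (correlation result),
# exactly the three transforms the time-shared FFT block must compute.
corr = np.fft.ifft(np.fft.fft(signal) * np.conj(np.fft.fft(code)))
estimated_phase = int(np.argmax(np.abs(corr)))
```

A full acquisition would repeat this over a grid of Doppler hypotheses; the FFT turns the per-hypothesis search over all 1023 code phases into a single O(N log N) operation.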

Abstract:

To describe the collective behavior of large ensembles of neurons in a neuronal network, a kinetic theory description was developed in [13, 12], where a macroscopic representation of the network dynamics was derived directly from the microscopic dynamics of individual neurons, modeled as conductance-based, linear, integrate-and-fire point neurons. A diffusion approximation then led to a nonlinear Fokker-Planck equation for the probability density function of neuronal membrane potentials and synaptic conductances. In this work, we propose a deterministic numerical scheme for a Fokker-Planck model of an excitatory-only network. Our numerical solver allows us to obtain the time evolution of the probability distribution functions and, thus, the evolution of all macroscopic quantities given by suitable moments of the probability density function. We show that this deterministic scheme is capable of capturing the bistability of stationary states observed in Monte Carlo simulations. Moreover, the transient behavior of the firing rates computed from the Fokker-Planck equation is analyzed in this bistable situation, where a bifurcation scenario (asynchronous convergence towards stationary states, periodic synchronous solutions, or damped oscillatory convergence towards stationary states) can be uncovered by increasing the strength of the excitatory coupling. Finally, the computation of moments of the probability distribution allows us to validate the applicability of a moment closure assumption used in [13] to further simplify the kinetic theory.
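In the spirit of (though far simpler than) the deterministic solver described above, the sketch below integrates a one-dimensional Fokker-Planck equation dp/dt = -d/dv[a(v)p] + D d²p/dv² with a conservative finite-difference scheme, so that total probability is preserved. The drift a(v) = -v, the diffusion constant, the grid and the time step are all illustrative assumptions.

```python
# Hedged sketch of a deterministic finite-difference scheme for a 1D
# Fokker-Planck equation dp/dt = -d/dv[a(v) p] + D d2p/dv2, much simpler than
# the paper's solver. Drift a(v) = -v (leak towards rest), D, grid and time
# step are illustrative assumptions.

D, dv, dt, nv, steps = 0.1, 0.05, 0.001, 80, 500
v = [-2.0 + dv * i for i in range(nv)]
p = [1.0 if abs(x - 1.0) < 0.1 else 0.0 for x in v]
total = sum(p) * dv
p = [x / total for x in p]               # normalize to a probability density

def step(p):
    """Conservative update: flux F = a*p - D*dp/dv at cell interfaces,
    zero-flux boundaries, so total probability is conserved exactly."""
    flux = [0.0] * (nv + 1)
    for i in range(1, nv):
        a_face = -0.5 * (v[i - 1] + v[i])          # drift a(v) = -v at the face
        p_face = 0.5 * (p[i - 1] + p[i])
        flux[i] = a_face * p_face - D * (p[i] - p[i - 1]) / dv
    return [p[i] - dt / dv * (flux[i + 1] - flux[i]) for i in range(nv)]

for _ in range(steps):
    p = step(p)

mass = sum(p) * dv   # should remain 1 up to round-off
```

Macroscopic quantities such as the mean membrane potential are then moments of `p`, which is how the paper extracts firing rates and validates the moment closure.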

Abstract:

The recent strides of democracy in Latin America have been associated with conflicting outcomes. The expectation that democracy would bring about peace and prosperity has been only partly satisfied. While political violence has been by and large eradicated from the subcontinent, poverty and social injustice still prevail and hold sway. Our study argues that democracy matters for inequality through the growing strength of center-left and left parties, and by making political leaders in general more responsive to the underprivileged. Furthermore, although the pension reforms recently enacted in the region generated overall regressive outcomes for income distribution, democratic countries still benefit from their political past: where the democratic tradition was stronger, such outcomes have been milder. Democratic tradition and the specific ideological connotations of the parties in power, on the other hand, did not play an equally crucial role in securing lower levels of political violence: during the last wave of democratization in Latin America, domestic peace was rather an outcome of political and social concessions to those in distress. In sum, together with other factors, especially economic ones, the reason why recent democratizations have provided domestic peace in most cases but have so far been unable to solve the problem of poverty and inequality is that democratic traditions in the subcontinent have been relatively weak and, more specifically, that this weakness has undermined the growth of left and progressive parties, acting as an obstacle to redistribution. Such weakness, on the other hand, has not prevented the drastic reduction of domestic political violence, since what mattered in this case was a combination of symbolic or material concessions and political agreements among powerful élites and counter-élites.

Abstract:

This article addresses the normative dilemma located within the application of 'securitization' as a method of understanding the social construction of threats and security policies. Securitization as a theoretical and practical undertaking is being used increasingly by scholars and practitioners. This article aims to provide those wishing to engage with securitization with an alternative application of the theory, one that is sensitive to, and self-reflective of, the possible normative consequences of its employment. The article argues that discussing and analyzing securitization processes has normative implications, understood here as the negative securitization of a referent. The negative securitization of a referent is asserted to be carried out through the unchallenged analysis of securitization processes that have emerged through relations of exclusion and power. The article then offers a critical understanding and application of securitization studies as a way of overcoming the identified normative dilemma. First, it examines how the Copenhagen School's formulation of securitization theory gives rise to a normative dilemma, situated in the performative and symbolic power of security as a political invocation and theoretical concept. Second, it evaluates previous attempts to overcome the normative dilemma of securitization studies, outlining the obstacles that each individual proposal faces. Third, it argues that the normative dilemma of applying securitization can be avoided, first, by deconstructing the institutional power of security actors and dominant security subjectivities and, second, by addressing countering or alternative approaches to security and incorporating different security subjectivities. Examples of the securitization of international terrorism and immigration feature prominently throughout.