994 results for Software Transaction Memory
Abstract:
Report for the scientific sojourn at the German Aerospace Center (DLR), Germany, during June and July 2006. The main objective of the two-month stay has been to apply the GPS navigation techniques for LEO (Low Earth Orbiter) satellites which DLR currently uses in real-time navigation. These techniques comprise a dynamical model that takes into account the precise Earth gravity field, together with models for the effects that perturb a LEO's motion: drag forces due to the Earth's atmosphere; solar radiation pressure, due to sunlight impacting the spacecraft; luni-solar gravity, due to the perturbation of the gravity field by the attraction of the Sun and Moon; and tidal forces, due to ocean and solid tides. A highly parameterized software package was produced in the first part of the work and used to assess what accuracy can be reached by exploring different models and complexities. The objective was to study the accuracy-versus-complexity trade-off, taking into account that LEOs at different altitudes behave differently. In this framework, several LEOs have been selected in a wide range of altitudes, and several approaches with different complexity have been chosen. Complexity is a very important issue, because processors onboard spacecraft have very limited computing and memory resources, so the algorithms must be kept simple enough for the satellite to run them on its own.
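To make the flavour of such a dynamical model concrete, the following is a minimal sketch (not DLR's software; all constants and the crude exponential atmosphere are assumptions for illustration) of how the perturbing accelerations listed above are summed for a LEO state vector.

```python
import numpy as np

# Illustrative constants (standard approximate values, not DLR's models)
MU = 3.986004418e14      # Earth's gravitational parameter [m^3/s^2]
RE = 6378137.0           # Earth equatorial radius [m]
J2 = 1.08263e-3          # Earth oblateness coefficient
CD_A_OVER_M = 0.01       # assumed ballistic coefficient Cd*A/m [m^2/kg]

def atmospheric_density(h):
    """Very crude exponential atmosphere (assumption for illustration)."""
    return 3.6e-10 * np.exp(-(h - 400e3) / 60e3)  # [kg/m^3], ~400 km reference

def acceleration(r, v):
    """Sum of modeled perturbations: two-body + J2 + drag."""
    rn = np.linalg.norm(r)
    # Two-body (central gravity) term
    a = -MU * r / rn**3
    # J2 oblateness perturbation
    z2 = (r[2] / rn)**2
    k = 1.5 * J2 * MU * RE**2 / rn**5
    a += k * np.array([r[0] * (5*z2 - 1), r[1] * (5*z2 - 1), r[2] * (5*z2 - 3)])
    # Atmospheric drag (ignores atmosphere co-rotation for simplicity)
    rho = atmospheric_density(rn - RE)
    a += -0.5 * rho * CD_A_OVER_M * np.linalg.norm(v) * v
    return a
```

Higher-fidelity terms (a full spherical-harmonic gravity field, luni-solar gravity, solar radiation pressure, tides) would be added to the same sum, which is exactly what drives the accuracy-versus-complexity trade-off studied in the report.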
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computation assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the tools varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses a non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user-friendliness, data storage capability and report generation.
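To illustrate what the Bayesian "a posteriori adjustment" involves, here is a minimal sketch of maximum a posteriori (MAP) estimation for a one-compartment model; it is not taken from any of the reviewed programs, and the priors, residual error and observed level are all invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative one-compartment IV bolus model: C(t) = (Dose/V) * exp(-CL/V * t)
def predicted_conc(theta, dose, t):
    cl, v = np.exp(theta)          # optimize in log space to keep CL, V positive
    return (dose / v) * np.exp(-(cl / v) * t)

# Invented population priors (log-normal): mean and SD of ln(CL), ln(V)
prior_mean = np.log([5.0, 50.0])   # CL ~ 5 L/h, V ~ 50 L
prior_sd = np.array([0.3, 0.2])
sigma = 0.5                        # assumed residual (assay) error SD [mg/L]

def neg_log_posterior(theta, dose, times, conc):
    resid = (conc - predicted_conc(theta, dose, times)) / sigma
    prior = (theta - prior_mean) / prior_sd
    return 0.5 * (np.sum(resid**2) + np.sum(prior**2))

# One observed level: 2.1 mg/L at 8 h after a 500 mg dose (invented data)
fit = minimize(neg_log_posterior, prior_mean,
               args=(500.0, np.array([8.0]), np.array([2.1])))
cl_map, v_map = np.exp(fit.x)
print(f"MAP estimates: CL = {cl_map:.2f} L/h, V = {v_map:.1f} L")
```

A TDM program would then simulate candidate regimens with the individualized CL and V to pick a dose that keeps the predicted concentrations inside the target range.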
Abstract:
A major achievement of new institutionalism in economics and political science is the formalisation of the idea that certain policies are more efficient when administered by a politically independent organisation. Based on this insight, several policy actors and scholars criticise the European Community for relying too much on a multi-task, collegial, and politicised organisation, the European Commission. This raises important questions, some constitutional (who should be able to change the corresponding procedural rules?) and some political-economic (is Europe truly committed to free and competitive markets?). Though acknowledging the relevance of legal and normative arguments, this paper contributes to the debate with a positive political-scientific perspective. Based on the view that institutional equilibria raise the question of equilibrium institutions, it shows that collegiality was (a) an equilibrium institution during the Paris negotiations of 1950-51; and (b) an institutional equilibrium for the following 50 years. The conclusion points to some recent changes in the way that European competition policy is implemented, and discusses how these affect the “constitutional” principle of collegial European governance.
Abstract:
This paper reports on (a) new primary-source evidence on, and (b) statistical and econometric analysis of, high-technology clusters in Scotland. It focuses on the following sectors: software, life sciences, microelectronics, optoelectronics, and digital media. Evidence from a postal and e-mailed questionnaire is presented and discussed under the headings of performance, resources, collaboration & cooperation, embeddedness, and innovation. The sampled firms are characterised as small (viz. micro-firms and SMEs), knowledge-intensive (largely graduate staff), research-intensive (mean R&D spend of GBP 842k), and internationalised (mainly selling to markets beyond Europe). Preliminary statistical evidence is presented on Gibrat's Law (independence of growth and size) and the Schumpeterian Hypothesis (scale economies in R&D). Estimates suggest a short-run equilibrium size of just 100 employees, but a long-run equilibrium size of 1,000 employees. Further, to achieve the Schumpeterian effect (of marked scale economies in R&D), estimates suggest that firms have to grow to much larger sizes, beyond 3,000 employees. We argue that the principal way of achieving the latter scale may need to be takeovers and mergers rather than internally driven growth.
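A minimal sketch of the kind of Gibrat's Law test referred to above (not the paper's actual specification; the firm data are simulated): regress log growth on initial log size, where a zero slope means growth is independent of size and a negative slope implies convergence to an equilibrium size.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated firm sizes (employees) at two dates -- invented data for illustration
size_t0 = rng.lognormal(mean=3.0, sigma=1.0, size=200)
growth = 0.3 - 0.1 * np.log(size_t0) + rng.normal(0, 0.3, size=200)  # mean reversion built in
size_t1 = size_t0 * np.exp(growth)

# Gibrat regression: log(S_t1 / S_t0) = a + b * log(S_t0) + e
X = sm.add_constant(np.log(size_t0))
res = sm.OLS(np.log(size_t1 / size_t0), X).fit()
a, b = res.params
print(f"b = {b:.3f} (s.e. {res.bse[1]:.3f}); Gibrat's Law holds if b = 0")
# b < 0 implies convergence toward an equilibrium size exp(-a/b)
print(f"implied equilibrium size: {np.exp(-a / b):.0f} employees")
```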
Abstract:
The objective of the project is to design a test platform for the simulation of flights of aerospace vehicles. The platform will make it possible to design and evaluate the navigation, guidance and control algorithms of the aerospace vehicles modelled on the simulation platform, focusing the engineers' work on vehicle modelling and on the development of digital control systems. The report covers the phases of a software engineering project, describing the project plan, the system analysis, the requirements specification and the system design.
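As an illustration of the kind of simulation loop such a test platform would host (an assumed toy example, not the project's design), the sketch below closes a guidance and control loop around a discrete-time plant model.

```python
import numpy as np

# Toy 1-D vehicle: double integrator (position, velocity), discretized at dt
dt = 0.01
state = np.array([0.0, 0.0])        # [position, velocity]
target = 100.0                      # guidance reference (invented)
kp, kd = 2.0, 3.0                   # PD gains (invented)

for step in range(int(20.0 / dt)):
    pos, vel = state
    # Control: PD law driving the vehicle toward the guidance reference
    u = kp * (target - pos) - kd * vel
    # Plant update (Euler integration of the double integrator)
    state = state + dt * np.array([vel, u])

print(f"final position: {state[0]:.2f}, final velocity: {state[1]:.3f}")
```

In the real platform the double integrator would be replaced by a full vehicle model and the PD law by the digital navigation, guidance and control algorithms under evaluation; the sense-guide-control-integrate structure of the loop stays the same.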
Abstract:
This work has two parts. The first is a compilation of topics related to e-mail and its use: its history; the elements that compose it; the services and programs it offers; the use of this tool; its importance within e-marketing; its effectiveness as a marketing tool; the attributes assigned to it; its main applications; the legislation that regulates it; and other information that can be very useful when sending out an e-mail campaign. The second part of the work contains a quantitative investigation of some elements or variables that can influence the final effectiveness of a mass e-mail campaign carried out by a company for commercial purposes.
Abstract:
This paper reviews the evidence on the effects of recessions on potential output. In contrast to the assumption in mainstream macroeconomic models that economic fluctuations do not change potential output paths, the evidence is that they do in the case of recessions. A model is proposed to explain this phenomenon, based on an analogy with water flows in porous media. Because of the discrete adjustments made by heterogeneous economic agents in such a world, potential output displays hysteresis with regard to aggregate demand shocks, and thus retains a memory of the shocks associated with recessions.
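A minimal sketch of the mechanism the paper describes (my own illustration of discrete adjustment by heterogeneous agents, not the paper's porous-media model): agents exit when demand falls below an idiosyncratic threshold but only re-enter at a strictly higher one, so the aggregate retains a memory of the recession.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Heterogeneous agents: each exits when demand falls below exit_at,
# but only re-enters when demand exceeds a strictly higher enter_at
exit_at = rng.uniform(0.5, 1.0, n)
enter_at = exit_at + rng.uniform(0.1, 0.4, n)   # re-entry threshold gap
active = np.ones(n, dtype=bool)

def step(demand):
    global active
    active = np.where(demand < exit_at, False, active)   # discrete exit
    active = np.where(demand > enter_at, True, active)   # discrete re-entry
    return active.mean()            # share of agents producing = "potential output"

# Demand dips into recession and then fully recovers
for d in [1.2, 0.8, 0.6, 0.8, 1.2]:
    print(f"demand={d:.1f} -> potential output={step(d):.2f}")
```

When demand returns to its pre-recession level, the share of active agents does not: potential output traces a loop rather than a line, which is the hysteresis the paper documents.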
Abstract:
This paper proposes a simple framework for understanding endogenous transaction costs - their composition, size and implications. In a model of diversification against risk, we distinguish between investments in institutions that facilitate exchange and the costs of conducting exchange itself. Institutional quality and market size are determined by the decisions of risk averse agents and conditions are discussed under which the efficient allocation may be decentralized. We highlight a number of differences with models where transaction costs are exogenous, including the implications for taxation and measurement issues.
Abstract:
Starting from the observation that ghosts are strikingly recurrent and prominent figures in late-twentieth-century African diasporic literature, this dissertation proposes to account for this presence by exploring its various functions. It argues that, beyond the poetic function the ghost performs as metaphor, it also does cultural, theoretical and political work that is significant to the African diaspora in its dealings with issues of history, memory and identity. Toni Morrison's Beloved (1987) serves as a guide for introducing the many forms, qualities and significations of the ghost, which are then explored and analyzed in four chapters that look at Fred D'Aguiar's Feeding the Ghosts (1998), Gloria Naylor's Mama Day (1988), Paule Marshall's Praisesong for the Widow (1983) and a selection of novels, short stories and poetry by Michelle Cliff. Moving thematically through these texts, the discussion shifts from history through memory to identity as it examines how the ghost trope allows the writers to revisit sites of trauma; revise historical narratives that are constituted and perpetuated by exclusions and invisibilities; creatively and critically repossess a past marked by violence, dislocation and alienation and reclaim the diasporic culture it contributed to shaping; and destabilize and deconstruct the hegemonic, normative categories and boundaries that delimit race or sexuality, envisioning other, less limited and limiting definitions of identity. These diverse and interrelated concerns are identified and theorized as participating in a project of "re-vision," a critical project that constitutes an epistemological as much as a political gesture. The author-based structure allows for a detailed analysis of the texts and highlights the distinctive shapes the ghost takes and the particular concerns it serves to address in each writer's literary and political project. However, using the ghost as a guide into these texts, taken collectively, also throws into relief new connections between them and sheds light on the complex ways in which the interplay of history, memory and identity positions them as products of and contributions to an African diasporic (literary) culture. While it insists on the cultural specificity of African diasporic ghosts, tracing their origins to African cultures and spiritualities, the argument also follows gothic studies' common view that ghosts in literary and cultural productions, like other related figures of the living dead, respond to particular conditions and anxieties. Considering the historical and political context in which the texts under study were produced, the dissertation makes connections between their ghosts and African diasporic people's disillusionment with the broken promises of the civil rights movement in the United States and of postcolonial independence in the Caribbean. It reads the texts' theoretical concerns and narrative qualities alongside the contestation of traditional historiography by black and postcolonial studies, as well as the broader challenge to conventional notions such as truth, reality, meaning, power or identity by poststructuralism, postcolonialism and queer theory. Drawing on these various theoretical approaches and critical tools to elucidate the ghost's deconstructive power for African diasporic writers' concerns, this work ultimately offers a contribution to "spectrality studies," currently emerging as a new field of scholarship in cultural theory.
Abstract:
Induction of cytotoxic CD8 T-cell responses is enhanced by the exclusive presentation of antigen through dendritic cells, and by innate stimuli such as toll-like receptor ligands. On the basis of these two principles, we designed a vaccine against melanoma. Specifically, we linked the melanoma-specific Melan-A/Mart-1 peptide to virus-like nanoparticles loaded with A-type CpG, a ligand for toll-like receptor 9. The Melan-A/Mart-1 peptide was cross-presented, as shown in vitro with human dendritic cells and in HLA-A2 transgenic mice. A phase I/II study in stage II-IV melanoma patients showed that the vaccine was well tolerated and that 14/22 patients generated ex vivo detectable T-cell responses, in part with multifunctional T cells capable of degranulating and producing IFN-γ, TNF-α, and IL-2. No significant influence of either the route of immunization (subcutaneous versus intradermal) or the dosing regimen (weekly versus daily clusters) was observed. Interestingly, relatively large fractions of the responding specific T cells exhibited a central memory phenotype, more than is achieved by other non-live vaccines. We conclude that vaccination with CpG-loaded virus-like nanoparticles is associated with a human CD8 T-cell response with properties of potential long-term immune protection from the disease.
Abstract:
This paper proposes a simple model for understanding transaction costs: their composition, size and policy implications. We distinguish between investments in institutions that facilitate exchange and the cost of conducting exchange itself. Institutional quality and market size are determined by the decisions of risk averse agents, and conditions are discussed under which the efficient allocation may be decentralized. We highlight a number of differences with models where transaction costs are exogenous, including the implications for taxation and measurement issues.
Abstract:
This paper develops a new test of true versus spurious long memory, based on log-periodogram estimation of the long memory parameter using skip-sampled data. A correction factor is derived to overcome the bias in this estimator due to aliasing. The procedure is designed to be used in the context of a conventional test of significance of the long memory parameter, and a composite test procedure is described that has known asymptotic size and is consistent. The test is implemented using the bootstrap, with the distribution under the null hypothesis approximated by a dependent-sample bootstrap technique that accounts for short-run dependence following fractional differencing. The properties of the test are investigated in a set of Monte Carlo experiments. The procedure is illustrated by applications to exchange rate volatility and dividend growth series.
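To fix ideas, here is a minimal sketch of the log-periodogram (GPH) regression that underlies the test, applied to a simulated fractionally integrated series at full and skip-sampled frequencies; the paper's aliasing correction factor is deliberately omitted, so the skip-sampled estimate shows the raw bias the correction addresses.

```python
import numpy as np

def gph_estimate(x, m=None):
    """Log-periodogram (GPH) estimate of the long memory parameter d."""
    n = len(x)
    m = m or int(n**0.5)                    # common bandwidth choice
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    # Periodogram at the first m Fourier frequencies
    fx = np.fft.fft(x - x.mean())
    I = (np.abs(fx[1:m + 1])**2) / (2 * np.pi * n)
    # Regress log I(lambda_j) on -2*log(lambda_j); the slope estimates d
    X = -2 * np.log(freqs)
    X = X - X.mean()
    return np.sum(X * np.log(I)) / np.sum(X**2)

# Simulated fractionally integrated series via its MA(inf) representation:
# psi_j = Gamma(j + d) / (Gamma(j + 1) * Gamma(d))
rng = np.random.default_rng(2)
d_true, n = 0.3, 4096
k = np.arange(1, n)
psi = np.cumprod((k - 1 + d_true) / k)      # recursive form of the weights
eps = rng.normal(size=2 * n)
x = np.array([eps[n + t] + psi @ eps[n + t - 1:t:-1] for t in range(n)])

print("full sample d:", round(gph_estimate(x), 3))
print("skip-sampled d:", round(gph_estimate(x[::4]), 3))  # every 4th obs; aliasing biases this
```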
Abstract:
The idea of Chineseness as a geographic, culture-specific and ethnically charged concept, and the pivotal role assumed by memory, linger throughout the writings of most authors hailing from the Chinese communities of Southeast Asia. Among these communities, the Malaysian Chinese community is the most prolific in terms of the number of writers and works produced, and this paper deals specifically with it. Its focus is on the literature produced by Malaysian Chinese authors residing in Taiwan, a topic that constitutes an important part of the first chapter, and on one of its main representatives, Ng Kim Chew, to whom chapters two and three are fully dedicated. A literary analysis of one of his short stories, Huo yu tu, will allow the reader a first-hand experience, through excerpts from the original text, of the importance of Chineseness and memory in the literary production of Ng and of many authors sharing similar life and literary experiences. I started this research from the assumption that these authors make extensive use of their own memories and those of their community in their writing as a way to re-tie themselves to the Chineseness they left behind in their places of origin. However, in the case of Ng Kim Chew, the analysis of his works led me to theorize that the identity he is imbued with, if there is one, is not Chinese, nor Malaysian, but purely and distinctively Malaysian-Chinese. This paper can also serve as an introduction for the general public to the field of Sinophone literature from Southeast Asia, and as a way to promote wider and innovative paths of research within the realm of Chinese studies that go beyond China proper.
Abstract:
Performance analysis is the task of monitoring the behavior of a program's execution. The main goal is to find the adjustments that might be made in order to improve performance, and obtaining that improvement requires identifying the different causes of overhead. We are now in the multicore era, but there is a gap between the levels of development of the two main branches of multicore technology (hardware and software). Multicore also generally means shared memory; this master's thesis addresses the issues involved in the performance analysis and tuning of applications running specifically on shared memory systems. We go one step further, taking performance analysis to another level by analyzing application structure and patterns. We also present some tools specifically addressed to the performance analysis of multithreaded OpenMP applications. Finally, we present the results of some experiments performed with a set of scientific OpenMP applications.
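As a conceptual illustration of the kind of measurement such an analysis starts from (a sketch in Python rather than the OpenMP/C codes the thesis actually targets): run the same kernel with one and several workers, then read speedup and parallel overhead off the timings.

```python
import time
from multiprocessing import Pool

def work(chunk):
    # CPU-bound kernel standing in for an OpenMP parallel region
    return sum(i * i for i in range(*chunk))

def run(nworkers, n=2_000_000):
    bounds = [(k * n // nworkers, (k + 1) * n // nworkers) for k in range(nworkers)]
    t0 = time.perf_counter()
    with Pool(nworkers) as pool:
        total = sum(pool.map(work, bounds))
    return time.perf_counter() - t0

if __name__ == "__main__":
    t1 = run(1)
    for p in (2, 4):
        tp = run(p)
        # Overhead: extra wall time versus a perfect p-way split of the serial run
        print(f"p={p}: speedup={t1 / tp:.2f}, overhead={(tp - t1 / p) * 1e3:.1f} ms")
```

In an OpenMP setting the same accounting is done per parallel region, and attributing the overhead term to causes such as synchronization, load imbalance or memory contention is the kind of tuning problem the thesis addresses.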
Abstract:
Research project carried out by a secondary school student and awarded a 2009 CIRIT Prize for fostering the scientific spirit among young people. The objective of this research project is the creation of a device in charge of centralizing all the multimedia needs of our home and distributing this content to all the terminals on the local network in a simple, automated way. The device has been designed to be connected to a high-definition television, allowing the playback and organization of all our multimedia in a comfortable, easy way. The media center manages our film, photo, music and TV series libraries transparently and automatically. In addition, the user can access all the multimedia stored on the media center from any device on the local network through protocols such as CIFS or UPnP, in an attempt to replicate cloud computing on a local scale. The device has been designed to support all kinds of formats and subtitles, ensuring full compatibility with DRM-free files. Its minimalist, silent design makes it perfect for replacing the living-room DVD player. All this without forgetting its low power consumption, around 75% lower than that of a conventional PC.
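As an illustration of how a client on the local network would discover such a media center over UPnP (standard SSDP discovery, not code from the project), the following sketch multicasts an M-SEARCH request and prints the devices that answer.

```python
import socket

# SSDP discovery: ask the local network which UPnP media servers are available
MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",
    "ST: urn:schemas-upnp-org:device:MediaServer:1",
    "", "",
]).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.settimeout(3.0)
sock.sendto(MSEARCH, ("239.255.255.250", 1900))

try:
    while True:
        data, addr = sock.recvfrom(4096)
        # First response line plus the responder's address
        print(addr[0], data.decode(errors="replace").splitlines()[0])
except socket.timeout:
    pass  # no more responses within the MX window
```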