25 results for Runs

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance:

20.00%

Publisher:

Abstract:

The turbot (Scophthalmus maximus) is a commercially valuable flatfish and one of the most promising aquaculture species in Europe. Two transcriptome 454-pyrosequencing runs were used in order to detect Single Nucleotide Polymorphisms (SNPs) in genes related to immune response and gonad differentiation. A total of 866 true SNPs were detected in 140 different contigs representing 262,093 bp as a whole. Only one true SNP was analyzed in each contig. One hundred and thirteen SNPs out of the 140 analyzed were feasible (genotyped), while Ш were polymorphic in a wild population. The transition/transversion ratio (1.354) was similar to that observed in other fish studies. Unbiased gene diversity (He) estimates ranged from 0.060 to 0.510 (mean = 0.351), minor allele frequency (MAF) from 0.030 to 0.500 (mean = 0.259), and all loci were in Hardy-Weinberg equilibrium after Bonferroni correction. A large number of SNPs (49) were located in the coding region, 33 representing synonymous and 16 non-synonymous changes. Most SNP-containing genes were related to immune response and gonad differentiation processes, and could be candidates for functional changes leading to phenotypic variation. These markers will be useful for population screening to look for adaptive variation in wild and domestic turbot.

Relevance:

10.00%

Publisher:

Abstract:

This note describes ParallelKnoppix, a bootable CD that allows econometricians with average knowledge of computers to create and begin using a high-performance computing cluster for parallel computing in very little time. The computers used may be heterogeneous machines, and clusters of up to 200 nodes are supported. When the cluster is shut down, all machines are in their original state, so their temporary use in the cluster does not interfere with their normal uses. An example shows how a Monte Carlo study of a bootstrap test procedure may be done in parallel. Using a cluster of 20 nodes, the example runs approximately 20 times faster than it does on a single computer.
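The Monte Carlo example parallelizes so well because each replication is independent of the others. As a minimal sketch of that structure only (not the note's actual MPI-based implementation; the toy test statistic, sample size, and replication counts are assumptions):

```python
# Minimal sketch of an embarrassingly parallel Monte Carlo bootstrap study.
# Each replication draws a sample, bootstraps a test statistic, and records
# whether the nominal 5% test rejects; replications are farmed out to workers.
import numpy as np
from multiprocessing import Pool

def one_replication(seed, n=100, n_boot=499):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)              # data generated under H0: mean = 0
    t_obs = np.sqrt(n) * x.mean() / x.std(ddof=1)
    centered = x - x.mean()                 # impose H0 before resampling
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(centered, size=n, replace=True)
        t_boot[b] = np.sqrt(n) * xb.mean() / xb.std(ddof=1)
    p_value = np.mean(np.abs(t_boot) >= np.abs(t_obs))
    return p_value < 0.05                   # True if the test rejects

if __name__ == "__main__":
    n_reps = 1000
    with Pool() as pool:                    # one worker per available core/node
        rejections = pool.map(one_replication, range(n_reps))
    print("empirical size:", np.mean(rejections))
```

Because the replications are independent, wall-clock time falls roughly in proportion to the number of workers, which is the near-linear 20-node speedup the note reports.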

Relevance:

10.00%

Publisher:

Abstract:

One of the main implications of the efficient market hypothesis (EMH) is that expected future returns on financial assets are not predictable if investors are risk neutral. In this paper we argue that financial time series offer more information than this hypothesis seems to imply. In particular, we postulate that runs of very large returns can be predictable over short time periods. To show this we propose a TAR(3,1)-GARCH(1,1) model that is able to describe two different types of extreme events: a first type generated by large-uncertainty regimes, where runs of extremes are not predictable, and a second type where extremes come from isolated dread/joy events. This model is new in the literature on nonlinear processes. Its novelty resides in two features that set it apart from previous TAR methodologies: the regimes are motivated by the occurrence of extreme values, and the threshold variable is defined by the shock affecting the process in the preceding period. In this way the model is able to uncover dependence and clustering of extremes in high- as well as low-volatility periods. The model is tested with data on General Motors stock prices around two crises that had a substantial impact on financial markets worldwide: the Black Monday of October 1987 and September 11th, 2001. By analyzing the periods around these crises we find evidence of statistical significance of our model, and thereby of predictability of extremes, for September 11th but not for Black Monday. These findings support the hypotheses of a big negative event producing runs of negative returns in the first case, and of the burst of a worldwide stock market bubble in the second example. JEL classification: C12; C15; C22; C51. Keywords: asymmetries, crises, extreme values, hypothesis testing, leverage effect, nonlinearities, threshold models.
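The abstract does not reproduce the model equations; purely as an indicative sketch of what a three-regime TAR(3,1) with GARCH(1,1) errors and the lagged shock as threshold variable can look like (the regime labels and the threshold c are assumptions, not the paper's estimated specification):

```latex
% Sketch of a three-regime TAR(3,1) with GARCH(1,1) errors, threshold = lagged shock
\begin{align*}
r_t &= \phi_0^{(j)} + \phi_1^{(j)} r_{t-1} + \varepsilon_t,
     \qquad j = \begin{cases}
       1 & \text{if } \varepsilon_{t-1} < -c \quad (\text{large negative shock})\\
       2 & \text{if } |\varepsilon_{t-1}| \le c \quad (\text{ordinary shock})\\
       3 & \text{if } \varepsilon_{t-1} > c \quad (\text{large positive shock})
     \end{cases}\\
\varepsilon_t &= \sigma_t z_t, \qquad z_t \sim \text{i.i.d.}(0,1), \qquad
\sigma_t^2 = \omega + \alpha \varepsilon_{t-1}^2 + \beta \sigma_{t-1}^2 .
\end{align*}
```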

Relevance:

10.00%

Publisher:

Abstract:

In the literature the outcome of contests is either interpreted as win probabilities or as shares of the prize. With this in mind, we examine two approaches to contest success functions. In the first we analyze the implications of contestants' incomplete information concerning the "type" of the contest administrator. While in the case of two contestants this approach can rationalize prominent contest success functions, we show that it runs into difficulties when there are more agents. Our second approach interprets contest success functions as sharing rules and establishes a connection to bargaining and claims problems which is independent of the number of contestants. Both approaches provide foundations for popular contest success functions and guidelines for the definition of new ones. Keywords: Endogenous Contests, Contest Success Function. JEL Classification: C72 (Noncooperative Games), D72 (Economic Models of Political Processes: Rent-Seeking, Elections), D74 (Conflict; Conflict Resolution; Alliances).
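For readers unfamiliar with the objects being axiomatized, the "prominent contest success functions" in this literature are typically of the ratio (Tullock) or logit form; with $x_i$ denoting contestant $i$'s effort, a generic additive form is (background notation only, not the paper's own derivation):

```latex
% Ratio-form (Tullock) and logit-form contest success functions:
% p_i can be read either as i's win probability or as i's share of the prize.
\[
  p_i(x_1,\dots,x_n) \;=\; \frac{f(x_i)}{\sum_{j=1}^{n} f(x_j)},
  \qquad
  f(x) = x^{r} \ \ (\text{Tullock}),
  \qquad
  f(x) = e^{\lambda x} \ \ (\text{logit form}).
\]
```

The paper's two interpretations correspond to reading $p_i$ either as $i$'s probability of winning an indivisible prize or as $i$'s share of a divisible one.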

Relevance:

10.00%

Publisher:

Abstract:

We distinguish and assess three fundamental views of the labor market regarding movements in unemployment: (i) the frictionless equilibrium view; (ii) the chain reaction theory, or prolonged adjustment view; and (iii) the hysteresis view. While the frictionless view implies a clear compartmentalization between the short run and the long run, the hysteresis view implies that all short-run fluctuations automatically turn into long-run changes in the unemployment rate. We assess the problems faced by these conceptions in explaining the diversity of labor market experiences across the OECD labor markets. We argue that the prolonged adjustment view can overcome these problems, since it implies that the short, medium, and long runs are interrelated, merging with one another along an intertemporal continuum.

Relevance:

10.00%

Publisher:

Abstract:

Current studies, mainly focused on the postwar period, are split on the impact of development on democracy. Examining panel data that runs from the early nineteenth century (a time when hardly any democracy was in place) to the end of the twentieth century, I show that income matters positively for democratization, both after controlling for country and time effects and after instrumenting for income. Since the effect of income partly varies over time, with some historical periods more favorable to democracy than others, I investigate the domestic variables (a decreasing marginal effect of growth in already developed economies) and the international factors (the strategies of great powers toward small countries) generating that result. I finally probe the underlying processes through which income shapes political institutions, showing that development produces key changes in the distribution and nature of wealth that, in turn, make democracy a stable political outcome.

Relevance:

10.00%

Publisher:

Abstract:

Resource management in multi-core processors has gained importance as applications and architectures have evolved, but this management is highly complex. For example, the same parallel application executed several times with the same input data, on a single multi-core node, can show very variable execution times. Many hardware and software factors affect performance. The way hardware resources (compute and memory) are assigned to the processes or threads, possibly belonging to several competing applications, is fundamental in determining this performance. The gap between allocating resources without knowing the application's actual needs and allocating them with a specific goal keeps growing. The best way to perform this allocation is automatically, with minimal programmer intervention. It is important to note that the way an application runs on an architecture is not necessarily the most suitable one, and this situation can be improved through proper management of the available resources. Appropriate resource management can benefit both the application developer and the computing environment where the application runs, allowing a larger number of applications to execute with the same amount of resources. Moreover, this resource management would not require changes to the application or to its operating strategy. In order to propose resource-management policies, we analyzed the behavior of compute-intensive and memory-intensive applications. This analysis was carried out by studying placement across cores, the need to use shared memory, the input workload size, the distribution of data within the processor, and the granularity of work. Our goal is to identify how these parameters influence execution efficiency, to identify bottlenecks, and to propose possible improvements. A further proposal is to adapt the strategies already used by the scheduler in order to obtain better results.
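As a minimal illustration of one of the parameters studied, placement on cores, the following Linux-only Python sketch times the same compute-bound kernel with the process pinned to different core sets; the kernel and the core sets are illustrative assumptions, not the benchmarks analyzed in the thesis:

```python
# Linux-only sketch: time a compute-bound kernel under different core pinnings.
import os
import time
import multiprocessing as mp

def kernel(n=2_000_000):
    # Simple compute-bound loop standing in for a real workload.
    s = 0.0
    for i in range(1, n):
        s += (i % 7) * 0.5
    return s

def run_pinned(core_set, n_procs=4):
    os.sched_setaffinity(0, core_set)          # pin this process to core_set
    start = time.perf_counter()
    with mp.Pool(n_procs) as pool:             # forked workers inherit the affinity mask
        pool.map(kernel, [2_000_000] * n_procs)
    return time.perf_counter() - start

if __name__ == "__main__":
    all_cores = os.sched_getaffinity(0)
    one_core = {min(all_cores)}
    print("4 workers on all cores:", run_pinned(all_cores), "s")
    print("4 workers on one core :", run_pinned(one_core), "s")
```

Comparing the two timings exposes how much the same workload suffers when its processes compete for a single core instead of being spread across the available ones.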

Relevance:

10.00%

Publisher:

Abstract:

In recent years most libraries have focused on mass digitization programs and on keeping born-digital documents, showing and organizing them in a repository. While those repositories have evolved into much more manageable systems, focusing on user expectations and introducing web 2.0 tools, digital preservation is still on the to-do list of most of them. There are quite a lot of studies focused on preservation and some complex models exist; unfortunately, very few practical systems are running, and it is quite difficult for a library to get involved in a solution already tested by others. The CBUC (Consortium of Catalan University Libraries) runs TDX, an ETD repository now holding more than 10,000 full-text theses from its 12 university members. After 10 years running TDX, a solid preservation system was needed to ensure that every thesis would be kept as it was, regardless of what happens to the repository. The perfect solution was found in the MetaArchive cooperative, the effort of many institutions to keep a copy of each other's content through a network, using the LOCKSS software as the mechanism to keep track of any change. The presentation will briefly introduce what TDX and MetaArchive are, but will, in a practical way, show how the LOCKSS network for preservation works. Finally, a summary of the benefits of the overall experience will be given.

Relevance:

10.00%

Publisher:

Abstract:

Grid is a hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational resources. Grid enables access to the resources, but it does not guarantee any quality of service. Moreover, Grid does not provide performance isolation: the job of one user can influence the performance of another user's job. A further problem with Grid is that its users belong to the scientific community and their jobs require specific and customized software environments. Providing the perfect environment to the user is very difficult in Grid because of its dispersed and heterogeneous nature. Cloud computing provides full customization and control, but there is no simple procedure for submitting user jobs as there is in Grid. Grid computing can provide customized resources and performance to the user by means of virtualization. A virtual machine can join the Grid as an execution node, and a virtual machine can also be submitted as a job with user jobs inside. Where the first method gives quality of service and performance isolation, the second method additionally provides customization and administration. In this thesis, a solution is proposed to enable virtual machine reuse, which provides performance isolation together with customization and administration; the same virtual machine can be used for several jobs. In the proposed solution, customized virtual machines join the Grid pool on user request. The solution describes two scenarios to achieve this goal. In the first scenario, users submit their customized virtual machine as a job, and the virtual machine joins the Grid pool when it is powered on. In the second scenario, user-customized virtual machines are preconfigured on the execution system and join the Grid pool on user request. Condor and VMware Server are used to deploy and test the scenarios. Condor supports virtual machine jobs: scenario 1 is deployed using the Condor VM universe, while the second scenario uses the VMware VIX API to script powering the remote virtual machines on and off. The experimental results show that, since scenario 2 does not need to transfer the virtual machine image, the image becomes live on the pool much faster. In scenario 1 the virtual machine runs as a Condor job, so it is easy to administer; the only pitfall of scenario 1 is the network traffic.
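As a rough sketch of how scenario 1 can be expressed with Condor's VM universe (the paths, memory size, and file names below are assumptions, not the thesis's actual configuration), a submit description for a VMware image can be generated and handed to condor_submit:

```python
# Sketch: build a Condor VM-universe submit description for a VMware image
# and hand it to condor_submit. Paths, memory size, and log name are assumptions.
import subprocess
import tempfile

submit_description = """\
universe                     = vm
vm_type                      = vmware
vm_memory                    = 512
vmware_dir                   = /home/user/vms/custom-node
vmware_should_transfer_files = true
log                          = custom-node.vm.log
queue
"""

with tempfile.NamedTemporaryFile("w", suffix=".sub", delete=False) as f:
    f.write(submit_description)
    submit_file = f.name

# Submitting the description makes the virtual machine itself a Condor job;
# once it boots, the guest can join the pool as an execution node.
subprocess.run(["condor_submit", submit_file], check=True)
```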

Relevance:

10.00%

Publisher:

Abstract:

This application is the evolution of an existing program, built with a desktop database with programming and user-interface creation capabilities, which requires a license to run and must work on a proprietary operating system (also licensed). It evolves towards an environment built on free software, which works on any operating system and requires no license fees, neither on the server side nor on the client side.

Relevance:

10.00%

Publisher:

Abstract:

In this work we have developed the first free software for mobile devices running the Android operating system that can preventively reduce the number of contagions of sexually transmitted infections (STIs) associated with risky behavior. This software runs in two modes. The normal mode allows the user to see the alerts and nearby health centers; the second mode enables the service to work in the background. The software reports the health risks as well as the location of the different test centers.

Relevance:

10.00%

Publisher:

Abstract:

ADSL is becoming the standard form of residential and small-business broadband Internet access due primarily to its low deployment cost. These residential ADSL lines are often deployed with 802.11 Access Points (APs) that provide wireless connectivity. Given the density of ADSL deployment, it is often possible for a residential wireless client to be in range of several other APs, belonging to neighbors, with ADSL connectivity. While the ADSL technology has shown evident limits in terms of capacity (with speeds in the range of 1-10 Mbps), short-range wireless communication can guarantee a much higher capacity (up to 20 Mbps). Furthermore, the ADSL links in the neighborhood are generally under-utilized, since ADSL subscribers do not connect 100% of the time. Therefore, it is possible for a wireless client to simultaneously connect to several APs in range and effectively aggregate their available ADSL bandwidth. In this paper, we introduce ClubADSL, a wireless client that can simultaneously connect to several APs in range on different frequencies and aggregate both their downlink and uplink capacity. ClubADSL is software that runs locally on the client side, and it requires neither modifications to the existing Internet infrastructure nor any hardware or protocol upgrades to the 802.11 local area network. We show the feasibility of ClubADSL in seamlessly transmitting TCP traffic and validate its implementation both in controlled scenarios and with current applications over real ADSL lines. In particular, we show that a ClubADSL client can greatly benefit from the aggregated download bandwidth in the case of server-client applications such as video streaming, but can also take advantage of the increased upload bandwidth, greatly reducing download times with incentive-based P2P applications such as BitTorrent.
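As a rough illustration of the aggregation idea only (ClubADSL itself does this transparently for ordinary TCP traffic, without application changes), the sketch below fetches disjoint byte ranges of one file over HTTP connections bound to different local addresses, one per associated AP; the host, local IPs, and chunk sizes are assumptions:

```python
# Illustration of bandwidth aggregation: fetch disjoint byte ranges of a file
# over connections bound to different local addresses (one per AP/ADSL line),
# then reassemble. Not ClubADSL's actual mechanism; names and sizes are assumed.
import http.client
from concurrent.futures import ThreadPoolExecutor

HOST = "example.com"
PATH = "/large-file.bin"
LOCAL_IPS = ["192.168.1.50", "192.168.2.50"]   # one address per associated AP
CHUNK = 1 << 20                                # 1 MiB per range request

def fetch_range(args):
    local_ip, start = args
    conn = http.client.HTTPConnection(HOST, 80, source_address=(local_ip, 0))
    conn.request("GET", PATH, headers={"Range": f"bytes={start}-{start + CHUNK - 1}"})
    body = conn.getresponse().read()
    conn.close()
    return start, body

if __name__ == "__main__":
    # Round-robin the chunks over the available links and reassemble in order.
    tasks = [(LOCAL_IPS[i % len(LOCAL_IPS)], i * CHUNK) for i in range(8)]
    with ThreadPoolExecutor(max_workers=len(LOCAL_IPS)) as pool:
        parts = dict(pool.map(fetch_range, tasks))
    data = b"".join(parts[i * CHUNK] for i in range(8))
    print(len(data), "bytes downloaded over", len(LOCAL_IPS), "links")
```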

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we use a unique long-run dataset of regulatory constraints on capital account openness to explain stock market correlations. Since stock returns themselves are highly volatile, any examination of what drives correlations needs to focus on long runs of data. This is particularly true since some of the short-term changes in co-movements appear to reverse themselves (Delroy Hunter 2005). We argue that changes in the co-movement of indices have not been random. Rather, they are mainly driven by greater freedom to move funds from one country to another. In related work, Geert Bekaert and Campbell Harvey (2000) show that equity correlations increase after liberalization of capital markets, using a number of case studies from emerging countries. We examine this pattern systematically for the last century, and find it to be most pronounced in the recent past. We compare the importance of capital account openness with one main alternative explanation, the growing synchronization of economic fundamentals. We conclude that greater openness has been the single most important cause of growing correlations during the last quarter of a century, though increasingly correlated economic fundamentals also matter. In the conclusion, we offer some thoughts on why the effects of greater openness appear to be so much stronger today than they were during the last era of globalization before 1914.

Relevance:

10.00%

Publisher:

Abstract:

The Spanish automobile industry had a late start. Although the country proved capable of short production runs of high-quality vehicles during the first third of the century, it never managed to build up its own industry, unlike Great Britain, France, or Italy. What, then, were the critical shortcomings that prevented the establishment of large Spanish motor manufacturers? Put another way, why did all of the companies set up during the first half-century fail to survive? This paper attempts to shed some light on these questions, employing a wide-ranging analysis of both internal and external factors affecting the industry. A feeble internal market and a lack of resources and production factors are usually adduced as reasons, as are Spain's general economic backwardness and the role played by the public authorities. However, this paper mainly focuses on the internal factors concerning company strategy and organisation. A comparison with the Italian case helps put the traditional arguments in proper perspective and highlights those covering business strategies. Finally, we argue that a broad range of factors needs to be analysed to fully understand why Spain failed to establish a motor industry.

Relevance:

10.00%

Publisher:

Abstract:

We study relative performance evaluation in executive compensation when executives have private information about their ability. We assume that the joint distribution of an individual firm's profit and market movements depends on the ability of the executive that runs the firm. In the equilibrium of the executive labor market, compensation schemes exploit this fact to sort executives of different abilities. This implies that executive compensation is increasing in own performance, but may also be increasing in industry performance, a sharp departure from standard relative performance evaluation. This result provides an explanation for the scarcity of relative performance considerations in executive compensation documented by the empirical literature.