945 results for Runs
Abstract:
Doctoral thesis (Doctoral Programme in Materials Engineering)
Abstract:
Master's dissertation in Communication, Art and Culture
Simultaneous detection of cyclopiazonic acid and aflatoxin B1 by HPLC in methanol/water mobile phase
Abstract:
A simple procedure for the simultaneous detection of cyclopiazonic acid (CPA) and aflatoxin B1 in fungal extracts is presented, using a methanol/water mobile phase and fluorescence detection. This methodology was tested with standard solutions of both mycotoxins, CPA and aflatoxin B1, and with methanolic extracts of Aspergillus section Flavi strains previously characterized for their mycotoxin production profile. Previously available methodology required two separate chromatographic runs for these mycotoxins, with distinct columns and detectors (fluorescence detection with post-column photochemical derivatization (PHRED) for aflatoxin B1, and UV detection for CPA). The proposed method detects both mycotoxins in a single run. Data from these assays will be presented and discussed.
Abstract:
Integrated master's dissertation in Industrial Engineering and Management
Abstract:
This note describes ParallelKnoppix, a bootable CD that allows econometricians with average knowledge of computers to create and begin using a high-performance computing cluster for parallel computing in very little time. The computers used may be heterogeneous machines, and clusters of up to 200 nodes are supported. When the cluster is shut down, all machines are returned to their original state, so their temporary use in the cluster does not interfere with their normal uses. An example shows how a Monte Carlo study of a bootstrap test procedure may be done in parallel. Using a cluster of 20 nodes, the example runs approximately 20 times faster than it does on a single computer.
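To make the kind of embarrassingly parallel Monte Carlo work described above concrete, here is a minimal Python sketch that farms bootstrap-test replications out to local cores with multiprocessing. The data-generating process, test statistic, and replication counts are illustrative assumptions, not the note's actual study, and the sketch uses one machine's cores rather than a cluster.

```python
# A minimal sketch of a parallel Monte Carlo study of a bootstrap test.
# The DGP (N(0,1) data, t test of mean = 0), sample size, and replication
# counts are illustrative assumptions, not the note's actual design.
import numpy as np
from multiprocessing import Pool

def one_replication(seed, n=100, n_boot=199):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)                     # data drawn under H0
    t_obs = x.mean() / (x.std(ddof=1) / np.sqrt(n))
    centered = x - x.mean()                        # impose H0 on resamples
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(centered, size=n, replace=True)
        t_boot[b] = xb.mean() / (xb.std(ddof=1) / np.sqrt(n))
    # Reject if |t_obs| exceeds the bootstrap 95% critical value.
    return abs(t_obs) > np.quantile(np.abs(t_boot), 0.95)

if __name__ == "__main__":
    with Pool() as pool:                           # one worker per core
        rejections = pool.map(one_replication, range(1000))
    print("empirical size:", np.mean(rejections))  # should be near 0.05
```

Because the replications are independent, the speedup scales almost linearly with the number of workers, which is why a 20-node cluster delivers roughly a 20-fold gain.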
Abstract:
One of the main implications of the efficient market hypothesis (EMH) is that expected future returns on financial assets are not predictable if investors are risk neutral. In this paper we argue that financial time series offer more information than this hypothesis seems to supply. In particular, we postulate that runs of very large returns can be predictable for small time periods. To demonstrate this, we propose a TAR(3,1)-GARCH(1,1) model that is able to describe two different types of extreme events: a first type generated by large-uncertainty regimes where runs of extremes are not predictable, and a second type where extremes come from isolated dread/joy events. This model is new in the literature on nonlinear processes. Its novelty resides in two features that distinguish it from previous TAR methodologies: the regimes are motivated by the occurrence of extreme values, and the threshold variable is defined by the shock affecting the process in the preceding period. In this way the model is able to uncover dependence and clustering of extremes in high- as well as low-volatility periods. The model is tested with data on General Motors stock prices covering two crises that had a substantial impact on financial markets worldwide: the Black Monday of October 1987 and September 11th, 2001. By analyzing the periods around these crises we find evidence of statistical significance for our model, and thereby of predictability of extremes, for September 11th but not for Black Monday. These findings support the hypotheses of a big negative event producing runs of negative returns in the first case, and of the burst of a worldwide stock market bubble in the second. JEL classification: C12; C15; C22; C51. Keywords and phrases: asymmetries, crises, extreme values, hypothesis testing, leverage effect, nonlinearities, threshold models
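As a reading aid, one plausible way to write down such a specification is sketched below; the regime thresholds, the first-order dynamics within each regime, and all parameter labels are illustrative notation of ours, not the authors' exact formulation.

```latex
% A sketch of a TAR(3,1)-GARCH(1,1): three regimes selected by the previous
% shock \varepsilon_{t-1} (the threshold variable), first-order dynamics in
% each regime, and GARCH(1,1) errors. Thresholds r_1 < r_2 and parameter
% names are illustrative assumptions.
\[
y_t =
\begin{cases}
\phi_0^{(1)} + \phi_1^{(1)} y_{t-1} + \varepsilon_t, & \varepsilon_{t-1} \le r_1,\\[2pt]
\phi_0^{(2)} + \phi_1^{(2)} y_{t-1} + \varepsilon_t, & r_1 < \varepsilon_{t-1} \le r_2,\\[2pt]
\phi_0^{(3)} + \phi_1^{(3)} y_{t-1} + \varepsilon_t, & \varepsilon_{t-1} > r_2,
\end{cases}
\qquad
\varepsilon_t = \sigma_t z_t,\quad
\sigma_t^2 = \omega + \alpha \varepsilon_{t-1}^2 + \beta \sigma_{t-1}^2 .
\]
```

The outer regimes capture periods following large negative or positive shocks (isolated dread/joy events), while the middle regime covers ordinary periods whose extremes arise only through high GARCH volatility.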
Abstract:
In the literature the outcome of contests is interpreted either as win probabilities or as shares of the prize. With this in mind, we examine two approaches to contest success functions. In the first we analyze the implications of contestants' incomplete information concerning the "type" of the contest administrator. While in the case of two contestants this approach can rationalize prominent contest success functions, we show that it runs into difficulties when there are more agents. Our second approach interprets contest success functions as sharing rules and establishes a connection to bargaining and claims problems which is independent of the number of contestants. Both approaches provide foundations for popular contest success functions and guidelines for the definition of new ones. Keywords: Endogenous Contests, Contest Success Function. JEL Classification: C72 (Noncooperative Games), D72 (Economic Models of Political Processes: Rent-Seeking, Elections), D74 (Conflict; Conflict Resolution; Alliances).
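For concreteness, the best-known member of the family of contest success functions that such approaches aim to rationalize is the Tullock (power) form; the notation below is standard but ours.

```latex
% Tullock (power) contest success function: contestant i's win probability
% (or prize share) given efforts x_1, ..., x_n and decisiveness r > 0.
\[
p_i(x_1,\dots,x_n) \;=\; \frac{x_i^{\,r}}{\sum_{j=1}^{n} x_j^{\,r}},
\qquad p_i = \frac{1}{n} \ \text{if } x_1 = \dots = x_n = 0 .
\]
```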
Abstract:
We distinguish and assess three fundamental views of the labor market regarding movements in unemployment: (i) the frictionless equilibrium view; (ii) the chain reaction theory, or prolonged adjustment view; and (iii) the hysteresis view. While the frictionless view implies a clear compartmentalization between the short and the long run, the hysteresis view implies that all short-run fluctuations automatically turn into long-run changes in the unemployment rate. We assess the problems faced by these conceptions in explaining the diversity of experiences across OECD labor markets. We argue that the prolonged adjustment view can overcome these problems, since it implies that the short, medium, and long runs are interrelated, merging with one another along an intertemporal continuum.
Abstract:
This paper uses a computable general equilibrium (CGE) framework to investigate the conditions under which rebound effects may occur in response to increases in energy efficiency in the UK national economy. Previous work for the UK has suggested that rebound effects will occur even where key elasticities of substitution in production are set close to zero. The research reported in this paper involves a systematic sensitivity analysis in which relative price sensitivity is gradually introduced into the system, focusing specifically on elasticities of substitution in production and trade parameters, in order to determine the conditions under which rebound effects become a likely outcome. The main result is that, while there is positive pressure for rebound effects even where (direct and indirect) demands for energy are very price inelastic, this may be partially or wholly offset by negative income, competitiveness and disinvestment effects, which also occur in response to falling energy prices. The occurrence of disinvestment effects is of particular interest. These occur where falling energy prices reduce profitability in domestic energy supply sectors, leading to a contraction of the capital stock in those sectors. This may in turn produce rebound effects that are smaller in the long run than in the short run, a result that runs contrary to the predictions of previous theoretical work in this area.
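As a reading aid, the rebound measure implicit in this discussion can be sketched as follows; the notation is ours, not the paper's.

```latex
% Rebound R compares the actual, general-equilibrium change in energy use
% with the potential (pure engineering) saving from an efficiency gain.
\[
R \;=\; 1 - \frac{\Delta E^{\mathrm{actual}}}{\Delta E^{\mathrm{potential}}},
\]
% so R = 0 means no rebound, 0 < R < 1 a partial rebound, and R > 1
% backfire; the negative income, competitiveness and disinvestment effects
% discussed above push R downwards.
```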
Abstract:
Research into the biomechanical manifestation of fatigue during exhaustive runs is increasingly popular, but a better understanding of how spring-mass behaviour adapts during strenuous, self-paced exercise is still needed to develop optimized training and injury prevention programs. This study investigated continuous changes in running mechanics and spring-mass behaviour during a 5-km run. Twelve competitive triathletes performed a 5-km running time trial (mean performance: 17 min 30 s) on a 200-m indoor track. Vertical and anterior-posterior ground reaction forces were measured every 200 m by a 5-m long force platform system and used to determine spring-mass model characteristics. After a fast start, running velocity progressively decreased (-11.6%; P<0.001) in the middle part of the race before an end spurt in the final 400-600 m. Stride length (-7.4%; P<0.001) and stride frequency (-4.1%; P=0.001) decreased over the 25 laps, while contact time (+8.9%; P<0.001) and total stride duration (+4.1%; P<0.001) progressively lengthened. Peak vertical forces (-2.0%; P<0.01) and leg compression (-4.3%; P<0.05), but not centre-of-mass vertical displacement (+3.2%; P>0.05), decreased with time. As a result, vertical stiffness decreased (-6.0%; P<0.001) during the run, whereas changes in leg stiffness were not significant (+1.3%; P>0.05). Spring-mass behaviour thus progressively changes during a 5-km time trial towards a deteriorated vertical stiffness, which alters impact and force production characteristics.
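For reference, the stiffness measures reported above are conventionally defined from the measured ground reaction forces as follows; the symbols are ours.

```latex
% Spring-mass stiffness definitions: F_max is the peak vertical ground
% reaction force, \Delta z the centre-of-mass vertical displacement, and
% \Delta L the leg compression during contact.
\[
k_{\mathrm{vert}} = \frac{F_{\max}}{\Delta z},
\qquad
k_{\mathrm{leg}} = \frac{F_{\max}}{\Delta L}.
\]
```

With the peak force and leg compression falling by similar proportions, leg stiffness stays roughly constant, while the (non-significant) rise in vertical displacement is enough to drive vertical stiffness down.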
Abstract:
This paper is an investigation into the dynamics of asset markets with adverse selection à la Akerlof (1970). The particular question asked is: can market failure at some later date precipitate market failure at an earlier date? The answer is yes: there can be "contagious illiquidity" running from the future back to the present. The mechanism works as follows. If the market is expected to break down in the future, then agents holding assets they know to be lemons (assets with low returns) will be forced to hold them for longer - they cannot quickly resell them. As a result, the effective difference in payoff between a lemon and a good asset is greater. But it is known from the static Akerlof model that the greater the payoff differential between lemons and non-lemons, the more likely the market is to break down. Hence market failure in the future is more likely to lead to market failure today. Conversely, if the market is not anticipated to break down in the future, assets can be readily sold, and hence an agent discovering that his or her asset is a lemon can quickly jettison it. In effect, there is little difference in payoff between a lemon and a good asset. The logic of the static Akerlof model then runs the other way: the small payoff differential is unlikely to lead to market breakdown today. The conclusion of the paper is that the nature of today's market - liquid or illiquid - hinges critically on the nature of tomorrow's market, which in turn depends on the next day's, and so on. The tail wags the dog.
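The static Akerlof condition the argument leans on can be stated schematically; the simple pooling setup and notation are our illustrative assumptions.

```latex
% Static lemons logic (a sketch): a fraction \lambda of assets are lemons
% worth v_L to buyers, the rest are good assets worth v_G > v_L. With a
% pooling price bounded by average value, holders of good assets (with
% reservation value r_G) sell only if
\[
r_G \;\le\; \lambda v_L + (1-\lambda)\, v_G ,
\]
% so the wider the differential v_G - v_L (for a given r_G close to v_G),
% the harder the condition is to satisfy and the likelier the breakdown.
```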
Abstract:
Bank crises, by interrupting liquidity provision, have been viewed as resulting in welfare losses. In a model of banking with moral hazard, we show that second-best bank contracts that improve on autarky ex ante require costly crises to occur with positive probability at the interim stage. When bank payoffs are partially appropriable, either directly via the imposition of fines or indirectly through the use of bank equity as collateral, we argue that an appropriately designed ex ante regime of policy intervention involving conditional monitoring can prevent bank crises.
Abstract:
Current studies, mainly focused on the postwar period, are split on the impact of development on democracy. Examining panel data that run from the early nineteenth century (a time when hardly any democracy was in place) to the end of the twentieth century, I show that income matters positively for democratization – both after controlling for country and time effects and after instrumenting for income. Since the effect of income partly varies over time, with some historical periods being more favorable to democracy than others, I investigate the domestic variables (a decreasing marginal effect of growth in already developed economies) and international factors (the strategies of great powers toward small countries) generating that result. I finally probe the underlying processes through which income shapes political institutions, showing that development produces key changes in the distribution and nature of wealth that, in turn, make democracy a stable political outcome.
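Schematically, the estimating equation described above is a two-way fixed-effects panel regression with instrumented income; the notation is illustrative, not the paper's.

```latex
% Democracy d_{it} in country i and period t regressed on instrumented
% income \hat{y}_{it}, with country fixed effects \alpha_i (the "country
% effects") and period effects \delta_t (the "time effects").
\[
d_{it} \;=\; \beta\, \hat{y}_{it} + \alpha_i + \delta_t + \varepsilon_{it}.
\]
```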
Abstract:
Resource management in multi-core processors has gained importance with the evolution of applications and architectures, but this management is very complex. For example, the same parallel application, executed multiple times with the same input data on a single multi-core node, can show highly variable execution times. Multiple hardware and software factors affect performance. The way hardware resources (compute and memory) are assigned to the processes or threads, possibly belonging to several competing applications, is fundamental in determining this performance. The gap between assigning resources without knowing the application's true needs and assigning them with a specific goal keeps growing. The best way to perform this assignment is automatically, with minimal programmer intervention. It is worth noting that the way an application runs on an architecture is not necessarily the most suitable one, and this situation can be improved through proper management of the available resources. Appropriate resource management can offer advantages both to the application developer and to the computing environment where the application runs, allowing a larger number of applications to execute with the same amount of resources. Moreover, this resource management would not require introducing changes to the application or to its operating strategy. In order to propose resource management policies, the behaviour of compute-intensive and memory-intensive applications was analyzed. This analysis was carried out by studying placement parameters across the cores, the need to use shared memory, the size of the input workload, the distribution of data within the processor, and the granularity of work. Our goal is to identify how these parameters influence execution efficiency, to identify bottlenecks, and to propose possible improvements. A further proposal is to adapt the strategies already used by the scheduler in order to obtain better results.
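As one concrete, minimal illustration of the kind of resource-assignment knob discussed above, the Linux-only Python sketch below pins compute-bound worker processes to specific cores with os.sched_setaffinity. The core IDs and workload are illustrative assumptions; a real policy would derive placements from measured application behaviour, as the analysis proposes.

```python
# A minimal, Linux-only sketch of explicit core placement for worker
# processes. Core IDs and the synthetic workload are illustrative.
import os
import time
from multiprocessing import Process

def worker(core_id, n=5_000_000):
    os.sched_setaffinity(0, {core_id})   # pin the calling process to one core
    t0 = time.perf_counter()
    s = 0
    for i in range(n):                   # synthetic compute-bound loop
        s += i * i
    print(f"core {core_id}: {time.perf_counter() - t0:.2f} s")

if __name__ == "__main__":
    # Try different placements (e.g. two sibling hyperthreads vs. two
    # separate physical cores) and compare the reported times.
    procs = [Process(target=worker, args=(c,)) for c in (0, 1)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
```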
Abstract:
In recent years most libraries have focused on mass digitization programs and on keeping born-digital documents, showing and organizing them in a repository. While those repositories have evolved into much more manageable systems, focusing on user expectations and introducing web 2.0 tools, digital preservation is still on the to-do list of most of them. There are quite a lot of studies focused on preservation and some complex models exist; unfortunately, very few practical systems are running, and it is quite difficult for a library to get involved in a solution already tested by others. The CBUC (Consortium of University Catalan Libraries) runs TDX, an ETD repository now holding more than 10,000 full-text theses from the 12 university members. After 10 years running TDX, a solid preservation system was needed to ensure every thesis would be kept as it was, regardless of what happens to the repository. The solution was found in the MetaArchive cooperative: an effort by many institutions to keep a copy of each other's content through a network, using the LOCKSS software as a mechanism to keep track of any change. The presentation will briefly introduce what TDX and MetaArchive are, but will also show, in a practical way, how the LOCKSS preservation network works. Finally, a summary of the benefits of the overall experience will be presented.