105 results for Free oil
Abstract:
This paper studies optimal monetary policy in a framework that explicitly accounts for policymakers' uncertainty about the channels through which oil prices are transmitted into the economy. More specifically, I examine the robust response to the real price of oil that US monetary authorities would have been recommended to implement over the period 1970-2009, had they used the approach proposed by Cogley and Sargent (2005b) to incorporate model uncertainty and learning into policy decisions. In this context, I investigate the extent to which policymakers' changing beliefs over different models of the economy play a role in the policy selection process. The main conclusion of this work is that, in the specific environment under analysis, one of the underlying models dominates the optimal interest rate response to oil prices. This result persists even when alternative assumptions on the models' priors change the pattern of the relative posterior probabilities, and can thus be attributed to the presence of model uncertainty itself.
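A rough sketch of the learning mechanism referred to above, with notation chosen here for illustration rather than taken from the paper: at each date the policymaker updates the probability attached to each candidate model by Bayes' rule,
\[
\pi_{t}(m) \;=\; \frac{\pi_{t-1}(m)\, f(y_{t} \mid Y^{t-1}, m)}{\sum_{m'} \pi_{t-1}(m')\, f(y_{t} \mid Y^{t-1}, m')},
\]
where $\pi_{t}(m)$ is the posterior probability of model $m$ after observing the data $y_{t}$ and $f(\cdot \mid m)$ is the predictive density implied by model $m$. The interest rate response to oil prices is then chosen by weighing the model-specific losses with these evolving probabilities.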
Abstract:
This paper addresses the issue of policy evaluation in a context in which policymakers are uncertain about the effects of oil prices on economic performance. I consider models of the economy inspired by Solow (1980), Blanchard and Gali (2007), Kim and Loungani (1992) and Hamilton (1983, 2005), which incorporate different assumptions on the channels through which oil prices have an impact on economic activity. I first study the characteristics of the model space and I analyze the likelihood of the different specifications. I show that the existence of plausible alternative representations of the economy forces the policymaker to face the problem of model uncertainty. Then, I use the Bayesian approach proposed by Brock, Durlauf and West (2003, 2007) and the minimax approach developed by Hansen and Sargent (2008) to integrate this form of uncertainty into policy evaluation. I find that, in the environment under analysis, the standard Taylor rule is outperformed under a number of criteria by alternative simple rules in which policymakers introduce persistence in the policy instrument and respond to changes in the real price of oil.
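As an illustration of how the two evaluation criteria differ (the notation is ours, not the paper's): a Bayesian policymaker ranks a candidate rule $r$ by its posterior-weighted expected loss, while a minimax policymaker guards against the worst-case model,
\[
L_{\text{Bayes}}(r) \;=\; \sum_{m} \pi(m)\, E\!\left[L(r) \mid m\right],
\qquad
L_{\text{minimax}}(r) \;=\; \max_{m}\, E\!\left[L(r) \mid m\right].
\]
Under either criterion, a simple rule with interest-rate smoothing and a response to the real price of oil can be compared directly against the standard Taylor rule.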
Abstract:
This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics so a large set of alternative good solutions can be quickly obtained in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which the traditional optimization methods, both of exact and approximate nature, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
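A minimal sketch of the biased-randomization idea, assuming a greedy constructive heuristic and a geometric-like bias; both the distribution and the function names are illustrative choices, not prescriptions from the paper:

import random

def biased_randomized_construction(candidates, greedy_score, beta=0.3):
    # Build a solution by repeatedly picking from the greedy-sorted candidate
    # list with a non-uniform (geometric-like) choice: the greedy pick stays
    # the most likely, but worse-ranked candidates keep a positive probability.
    solution = []
    remaining = sorted(candidates, key=greedy_score)  # assumes lower score is better
    while remaining:
        idx = min(int(random.expovariate(beta)), len(remaining) - 1)
        solution.append(remaining.pop(idx))
    return solution

Running this construction many times yields a diverse set of good alternative solutions without any complex configuration process.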
Abstract:
In this study I try to explain the systemic problem of the low economic competitiveness of nuclear energy for the production of electricity by carrying out a biophysical analysis of its production process. Given the fact that neither econometric approaches nor one-dimensional methods of energy analysis are effective, I introduce the concept of biophysical explanation as a quantitative analysis capable of handling the inherent ambiguity associated with the concept of energy. In particular, the quantities of energy, considered as relevant for the assessment, can only be measured and aggregated after having agreed on a pre-analytical definition of a grammar characterizing a given set of finite transformations. Using this grammar it becomes possible to provide a biophysical explanation for the low economic competitiveness of nuclear energy in the production of electricity. When comparing the various unit operations of the process of production of electricity with nuclear energy to the analogous unit operations of the process of production of fossil energy, we see that the various phases of the process are the same. The only difference is related to characteristics of the process associated with the generation of heat, which are completely different in the two systems. Since the cost of production of fossil energy provides the baseline of economic competitiveness of electricity, the (lack of) economic competitiveness of the production of electricity from nuclear energy can be studied by comparing the biophysical costs associated with the different unit operations taking place in nuclear and fossil power plants when generating process heat or net electricity. In particular, the analysis focuses on fossil-fuel requirements and labor requirements for those phases that both nuclear plants and fossil energy plants have in common: (i) mining; (ii) refining/enriching; (iii) generating heat/electricity; (iv) handling the pollution/radioactive wastes. By adopting this approach, it becomes possible to explain the systemic low economic competitiveness of nuclear energy in the production of electricity, because of: (i) its dependence on oil, limiting its possible role as a carbon-free alternative; (ii) the choices made in relation to its fuel cycle, especially whether it includes reprocessing operations or not; (iii) the unavoidable uncertainty in the definition of the characteristics of its process; (iv) its large inertia (lack of flexibility) due to issues of time scale; and (v) its low power level.
Abstract:
The decision to publish educational materials openly and under free licenses brings up the challenge of doing it in a sustainable way. Some lessons can be learned from the business models for production, maintenance and distribution of Free and Open Source Software. The Free Technology Academy (FTA) has taken on these challenges and has implemented some of these models. We briefly review the FTA educational programme, methodologies and organisation, and see to what extent these models are proving successful in the case of the FTA.
Abstract:
Open educational resource (OER) initiatives have made the shift from being a fringe activity to one that is increasingly considered a key component of both teaching and learning in higher education and of the fulfilment of universities' mission and goals. Although the reduction in the cost of materials is often cited as a potential benefit of OER, this benefit has not yet been realised in practice, necessitating thoughtful consideration of various strategies for new OER initiatives such as the OpenContent directory at the University of Cape Town (UCT) in South Africa. This paper reviews the range of sustainability strategies mentioned in the literature, plots the results of a small-scale OER sustainability survey against these strategies and explains how these findings and other papers on OER initiatives were used to inform an in-house workshop at UCT to deliberate the future strategy for the sustainability of OER at UCT.
Abstract:
This project investigates both the identification of predictors via clustering techniques and a review of free Data Mining software. The research is based on a case study; in addition to the KDD free software used by the scientific community, a new free tool for pre-processing the data is presented. The predictors are intended for the e-learning domain, as the data from which they are to be inferred are student qualifications from different e-learning environments. Through our case study, not only are clustering algorithms tested but additional goals are also proposed.
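As a minimal sketch of the kind of clustering step described, assuming k-means over a small matrix of student qualifications (the library, the toy data, and the number of clusters are our own illustrative choices):

import numpy as np
from sklearn.cluster import KMeans

# Each row is a student, each column a qualification from an e-learning course
# (toy values; the actual study uses grades from several environments).
grades = np.array([
    [7.5, 8.0, 6.5],
    [4.0, 5.5, 5.0],
    [9.0, 8.5, 9.5],
    [3.5, 4.0, 2.5],
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(grades)
print(kmeans.labels_)           # cluster memberships, usable as candidate predictors
print(kmeans.cluster_centers_)  # typical qualification profile of each cluster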
Abstract:
This article applies a comparative approach to the management of the oil sector in two oil-rich countries in the Middle East: Saudi Arabia and the United Arab Emirates (UAE). More specifically, it examines the factors that best explain the variation in the strategies they have applied in the sector since the establishment of their national oil companies (NOCs). Drawing on the expropriation/privatization literature in the field of Energy Studies, this paper proposes a theoretical framework to analyze the logic behind the different exploration and production strategies in the oil sector. Theoretically, this research aims to advance a set of analytical tools to better address and understand the real determinants of upstream decision-making. Empirically, the model is tested on the divergent cases of Saudi Arabia and the UAE, and a series of lessons are drawn. Ultimately, this work should raise awareness among scholars and policymakers of how little we know about the internal mechanisms that drive upstream strategies in the major oil-producing states.
Abstract:
This paper describes systematic research on free software solutions and techniques for the problem of computer recognition of art imagery.
Abstract:
The availability of induced pluripotent stem cells (iPSCs) has created extraordinary opportunities for modeling and perhaps treating human disease. However, all reprogramming protocols used to date involve the use of products of animal origin. Here, we set out to develop a protocol to generate and maintain human iPSC that would be entirely devoid of xenobiotics. We first developed a xeno-free cell culture media that supported the long-term propagation of human embryonic stem cells (hESCs) to a similar extent as conventional media containing animal origin products or commercially available xeno-free medium. We also derived primary cultures of human dermal fibroblasts under strict xeno-free conditions (XF-HFF), and we show that they can be used as both the cell source for iPSC generation as well as autologous feeder cells to support their growth. We also replaced other reagents of animal origin (trypsin, gelatin, matrigel) with their recombinant equivalents. Finally, we used vesicular stomatitis virus G-pseudotyped retroviral particles expressing a polycistronic construct encoding Oct4, Sox2, Klf4, and GFP to reprogram XF-HFF cells under xeno-free conditions. A total of 10 xeno-free human iPSC lines were generated, which could be continuously passaged in xeno-free conditions and maintained characteristics indistinguishable from hESCs, including colony morphology and growth behavior, expression of pluripotency-associated markers, and pluripotent differentiation ability in vitro and in teratoma assays. Overall, the results presented here demonstrate that human iPSCs can be generated and maintained under strict xeno-free conditions and provide a path to good manufacturing practice (GMP) applicability that should facilitate the clinical translation of iPSC-based therapies.
Characterization of human gene expression changes after olive oil ingestion: an exploratory approach
Abstract:
Olive oil consumption is protective against risk factors for cardiovascular and cancer diseases. A nutrigenomic approach was performed to assess whether changes in gene expression could occur in human peripheral blood mononuclear cells after olive oil ingestion at postprandial state. Six healthy male volunteers ingested, at fasting state, 50 ml of olive oil. Prior to intervention a 1-week washout period with a controlled diet and sunflower oil as the only source of fat was followed. During the 3 days before and on the intervention day, a very low-phenolic compound diet was followed. At baseline (0 h) and at post-ingestion (6 h), total RNA was isolated and gene expression (29,082 genes) was evaluated by microarray. From microarray data, nutrient-gene interactions were observed in genes related to metabolism, cellular processes, cancer, and atherosclerosis (e.g. USP48 by 2.16; OGT by 1.68-fold change) and associated processes such as inflammation (e.g. AKAP13 by 2.30; IL-10 by 1.66-fold change) and DNA damage (e.g. DCLRE1C by 1.47; POLK by 1.44-fold change). When results obtained by microarray were verified by qRT-PCR in nine genes, full concordance was achieved only in the case of up-regulated genes. Changes were observed at a real-life dose of olive oil, as it is daily consumed in some Mediterranean areas. Our results support the hypothesis that postprandial protective changes related to olive oil consumption could be mediated through gene expression changes.
Abstract:
This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-Spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
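In sketch form, and keeping the notation generic rather than the paper's own: the displacement is recovered by forward Euler integration of the non-stationary velocity field, and one standard way to obtain strain from its spatial derivatives is through the deformation gradient,
\[
\mathbf{x}(t+\Delta t) \;=\; \mathbf{x}(t) + \mathbf{v}\big(\mathbf{x}(t), t\big)\,\Delta t,
\qquad
F \;=\; \frac{\partial \mathbf{x}(t)}{\partial \mathbf{x}(0)},
\qquad
E \;=\; \tfrac{1}{2}\left(F^{\top}F - I\right),
\]
where $\mathbf{v}$ is the B-Spline-parameterized velocity field, $F$ the deformation gradient, and $E$ the (Green-Lagrange) strain tensor.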
Abstract:
This paper describes a Computer-Supported Collaborative Learning (CSCL) case study in engineering education carried out within the context of a network management course. The case study shows that the use of two computing tools developed by the authors and based on Free- and Open-Source Software (FOSS) provides significant educational benefits over traditional engineering pedagogical approaches in terms of both concepts and engineering competencies acquisition. First, the Collage authoring tool guides and supports the course teacher in the process of authoring computer-interpretable representations (using the IMS Learning Design standard notation) of effective collaborative pedagogical designs. In addition, the Gridcole system supports the enactment of that design by guiding the students throughout the prescribed sequence of learning activities. The paper introduces the goals and context of the case study, elaborates on how Collage and Gridcole were employed, describes the applied evaluation methodology, and discusses the most significant findings derived from the case study.
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process of the initial solution to randomly generate different alternative initial solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and, thus, allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic simply by incorporating our biased randomization process with a high-quality pseudo-random number generator.
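A minimal sketch of the parameter-free ILS skeleton described above; the function names, the acceptance rule, and the iteration budget are placeholders rather than the authors' actual code:

def ils_esp(initial_solution, local_search, perturb, cost, max_iter=1000):
    # Iterated Local Search with "common sense" rules: beyond the iteration
    # budget, there is nothing to fine-tune.
    best = current = local_search(initial_solution)
    for _ in range(max_iter):
        candidate = local_search(perturb(current))
        if cost(candidate) <= cost(current):   # placeholder acceptance rule
            current = candidate
            if cost(current) < cost(best):
                best = current
    return best

In the parallel setting, several such runs can be launched from different biased-randomized initial solutions (e.g., biased randomizations of a classical PFSP constructive heuristic) and the best result kept.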
Abstract:
Using a new dataset on capital account openness, we investigate why equity return correlations changed over the last century. Drawing on this long-run dataset, which covers capital account regulations in a group of 16 countries over the period 1890-2001, we show that correlations increase as financial markets are liberalized. These findings are robust to controlling for both the Forbes-Rigobon bias and global averages in equity return correlations. We test the robustness of our conclusions and show that greater synchronization of fundamentals is not the main cause of increasing correlations. These results imply that the home bias puzzle may be smaller than traditionally claimed.
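For reference, the correction behind the Forbes-Rigobon bias (as proposed by Forbes and Rigobon) adjusts a correlation measured in a high-volatility regime roughly as
\[
\rho^{*} \;=\; \frac{\rho}{\sqrt{1 + \delta\left(1 - \rho^{2}\right)}},
\]
where $\rho$ is the measured (conditional) correlation and $\delta$ is the relative increase in the variance of the conditioning market's returns; without this adjustment, volatility spikes alone would mechanically inflate measured correlations.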