38 results for lead-time structure
Abstract:
The current operational very short-term and short-term quantitative precipitation forecast (QPF) at the Meteorological Service of Catalonia (SMC) is produced by three different methodologies: advection of the radar reflectivity field (ADV); identification, tracking and forecasting of convective structures (CST); and numerical weather prediction (NWP) models using observational data assimilation (radar, satellite, etc.). These precipitation forecasts differ in characteristics, lead time and spatial resolution. The objective of this study is to combine these methods in order to obtain a single, optimized QPF at each lead time. This combination (blending) of the radar forecasts (ADV and CST) and the precipitation forecast from the NWP model is carried out by means of different methodologies according to the prediction horizon. Firstly, in order to take advantage of the rainfall location and intensity from radar observations, a phase correction technique is applied to the NWP output to derive an additional corrected forecast (MCO). To select the best precipitation estimate in the first and second hours (t+1 h and t+2 h), the information from radar advection (ADV) and the corrected model output (MCO) are mixed using different weights, which vary dynamically according to indexes that quantify the quality of these predictions. This procedure integrates the skill in rainfall location and patterns given by the advection of the radar reflectivity field with the capacity of NWP models to generate new precipitation areas. From the third hour (t+3 h), as radar-based forecasting generally has low skill, only the quantitative precipitation forecast from the model is used. This blending of different prediction sources is verified for different types of episodes (convective, moderately convective and stratiform) to obtain a robust methodology that can be implemented in an operational and dynamic way.
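A minimal sketch of the dynamic weighting step for the first two lead times (illustrative only: the abstract does not specify the SMC's quality indexes or weighting functions, so their form here is an assumption):

    import numpy as np

    def blend_qpf(adv_field, mco_field, q_adv, q_mco):
        """Blend the radar-advection forecast (ADV) and the phase-corrected
        NWP forecast (MCO) at t+1 h / t+2 h. q_adv and q_mco are assumed to
        be quality indexes in (0, 1], e.g. recent verification scores of
        each source against rain-gauge or radar observations."""
        w_adv = q_adv / (q_adv + q_mco)  # weights vary dynamically with quality
        return w_adv * adv_field + (1.0 - w_adv) * mco_field

    # From t+3 h onward only the NWP forecast would be used.
    qpf = blend_qpf(np.array([2.0, 0.5]), np.array([1.5, 1.0]), q_adv=0.7, q_mco=0.5)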
Abstract:
Driven by the need to differentiate themselves and face competition, companies have committed to developing operations that deliver value to the customer, and many of them have seen in lean tools an opportunity to improve their operations. This improvement entails reducing money, people, large equipment, inventory and space, with two objectives: eliminating waste and reducing variability. To achieve a company's strategic objectives, it is essential that they be aligned with middle management's plans and, in turn, with the work performed by employees, so that every person is aligned in the same direction at the same time. This is the philosophy of strategic planning. One objective of this project is therefore to develop a tool that facilitates the presentation of the company's objectives and their communication to all levels of the organization. Building on those objectives, and taking as a reference the need to reduce inventories in the supply chain, a study of the production of a wind turbine control component is carried out in order to level production and reduce its finished-goods inventory. The specific targets of this part are to reduce inventory by 28%, level production by reducing variability from 31% to 24%, keep a maximum stock of 24 units while guaranteeing supply under variable demand, increase inventory turnover by 10%, and establish an action plan to reduce lead time by 40-50%. All of this is made possible by mapping the present and future value stream to eliminate waste and create continuous flow, and by sizing a supermarket that keeps stock at an optimal level.
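A rough illustration of the supermarket sizing mentioned above (a sketch under assumed numbers; the project's actual demand data and service level are not given in the abstract):

    import math

    def supermarket_size(demand_mean, demand_std, lead_time_days, z=1.65):
        """Common kanban-supermarket sizing: cycle stock over the
        replenishment lead time plus a safety stock for demand
        variability. z = 1.65 targets roughly a 95% service level
        (an assumption, not a figure from the project)."""
        cycle_stock = demand_mean * lead_time_days
        safety_stock = z * demand_std * math.sqrt(lead_time_days)
        return math.ceil(cycle_stock + safety_stock)

    # Hypothetical inputs: 2 units/day, std 0.8, 8-day replenishment
    print(supermarket_size(2, 0.8, 8))  # -> 20, of the same order as the 24-unit cap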
Abstract:
The Cherenkov light flashes produced by extensive air showers are very short in time. A high-bandwidth, fast-digitizing readout can therefore minimize the influence of the background from the light of the night sky and improve the performance of Cherenkov telescopes. The time structure of the Cherenkov image can further be used in single-dish Cherenkov telescopes as an additional parameter to reduce the background from unwanted hadronic showers. We present an analysis method that makes use of the time information, and the subsequent improvement in the performance of the MAGIC telescope (especially after the upgrade with an ultra-fast 2 GSamples/s digitization system in February 2007). The use of timing information in the analysis of the new MAGIC data reduces the background by a factor of two, which in turn results in an enhancement of about a factor of 1.4 in the flux sensitivity to point-like sources, as tested on observations of the Crab Nebula.
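The quoted factors are consistent with background-limited statistics, where detection significance scales as the signal over the square root of the background (a standard back-of-the-envelope relation, not a derivation from the paper itself):

    \[ S \propto \frac{N_\gamma}{\sqrt{N_{\mathrm{bkg}}}} \qquad\Rightarrow\qquad \frac{S'}{S} = \sqrt{\frac{N_{\mathrm{bkg}}}{N_{\mathrm{bkg}}/2}} = \sqrt{2} \approx 1.4 \]

so halving the hadronic background at a fixed gamma-ray signal yields roughly the reported 1.4-fold sensitivity gain.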
Abstract:
Spatio-temporal variability in settlement and recruitment, high mortality during the first life-history stages, and selection may determine the genetic structure of cohorts of long-lived marine invertebrates at small scales. We conducted a spatial and temporal analysis of the common Mediterranean sea urchin Paracentrotus lividus to determine the genetic structure of cohorts at different scales. In Tossa de Mar (NW Mediterranean), recruitment was followed over 5 consecutive springs (2006-2010). In spring 2008, recruits and two-year-old individuals were collected at 6 locations along the east and south Iberian coasts, separated by 200 to over 1,100 km. All cohorts presented high genetic diversity based on a fragment of mtCOI. Our results showed marked genetic homogeneity in the temporal monitoring and a low degree of spatial structure in 2006. In 2008, coinciding with an anomaly in the usual circulation patterns in the area, the genetic structure of the southern populations studied changed markedly, with the arrival of many private haplotypes. This highlights the importance of point events in renewing the genetic makeup of populations, which can only be detected through analysis of cohort structure coupling temporal and spatial perspectives.
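For reference, genetic diversity from mtCOI haplotype data is commonly quantified with Nei's haplotype diversity (the abstract does not name its exact metric, so this is illustrative):

    \[ h = \frac{n}{n-1}\left(1 - \sum_i p_i^2\right) \]

where \(p_i\) is the frequency of the \(i\)-th haplotype and \(n\) the number of sequences sampled.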
Abstract:
Transketolase is an enzyme involved in a critical step of the non-oxidative branch of the pentose phosphate pathway whose inhibition could lead to new anticancer drugs. Here, we report new human transketolase inhibitors, based on the phenyl urea scaffold, found by applying structure-based virtual screening. These inhibitors are designed to cover a hot spot in the dimerization interface of the homodimer of the enzyme, providing for the first time compounds with a suggested novel binding mode not based on mimicking the thiamine pyrophosphate cofactor.
Abstract:
This paper is the first to examine the implications of switching to part-time (PT) work for women's subsequent earnings trajectories, distinguishing by their type of contract: permanent or fixed-term. Using a rich longitudinal Spanish data set from Social Security records of over 76,000 prime-aged women strongly attached to the Spanish labor market, we find that PT work aggravates the segmentation of the labor market insofar as there is a PT pay penalty, and this penalty is larger and more persistent for women with fixed-term contracts. The paper discusses problems arising in empirical estimation (including a problem not discussed in the literature up to now: the differential measurement error of the left-hand-side (LHS) variable by PT status) and how to address them. It concludes with policy implications relevant for Continental Europe and its dual structure of employment protection.
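An illustrative specification of the kind such studies estimate (the paper's exact model is not reproduced in the abstract, so the variable names and controls here are assumptions):

    \[ \ln y_{it} = \alpha_i + \beta\,PT_{it} + \gamma\,(PT_{it}\times FT_{it}) + x_{it}'\theta + \varepsilon_{it} \]

where \(\alpha_i\) is a worker fixed effect and \(FT_{it}\) indicates a fixed-term contract, so \(\beta\) captures the PT penalty under permanent contracts and \(\beta+\gamma\) under fixed-term ones; if earnings \(y_{it}\) are recorded with an error that differs by PT status, these coefficients are biased, which is the measurement problem the paper raises.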
Abstract:
In this paper the two main drawbacks of heat balance integral methods are examined. Firstly, we investigate the choice of approximating function. For a standard polynomial form it is shown that combining the heat balance and refined integral methods to determine the power of the highest order term leads to either the same or, more often, greatly improved accuracy over standard methods. Secondly, we examine thermal problems with a time-dependent boundary condition. In doing so we develop a logarithmic approximating function. This new function allows us to model moving peaks in the temperature profile, a feature that previous heat balance methods cannot capture. If the boundary temperature varies so that at some time t > 0 it equals the far-field temperature, then standard methods predict that the temperature is everywhere at this constant value. The new method predicts the correct behaviour. It is also shown that this function provides even more accurate results, when coupled with the new combined integral method (CIM), than the polynomial profile. The analysis primarily focuses on a specified constant boundary temperature and is then extended to constant-flux, Newton-cooling and time-dependent boundary conditions.
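For context, the heat balance integral method approximates the solution of the one-dimensional heat equation using a profile over a finite penetration depth \(\delta(t)\); a standard textbook sketch with a fixed boundary temperature \(T_s\) runs as follows (in the classical method the exponent \(n\) is prescribed in advance, while the combined approach determines it from an additional integral condition):

    \[ \frac{\partial T}{\partial t} = \kappa\,\frac{\partial^2 T}{\partial x^2}, \qquad T \approx T_s\left(1 - \frac{x}{\delta(t)}\right)^n \]

Integrating the heat equation over \(0 \le x \le \delta\) gives \(\frac{d}{dt}\!\left(\frac{T_s\,\delta}{n+1}\right) = \frac{\kappa\,n\,T_s}{\delta}\), hence \(\delta(t) = \sqrt{2\kappa\,n(n+1)\,t}\).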
Abstract:
The Aria with diverse variations, BWV 988, the fourth part of the Clavier-Übung by Johann Sebastian Bach, is a work that has been the subject of many studies. Its extreme beauty; the nickname by which we know it today, the Goldberg Variations (from the legend told by Forkel about a count who suffered from insomnia and his harpsichordist, Goldberg); and its internal structure, so perfectly calculated yet the source of large questions that are complex to answer, make this work an indisputable myth of the keyboard literature that has continued to fascinate us for more than 250 years and that makes it immortal. This study is a humble approach to these aspects, which will bring us closer to the inner world of this work.
Abstract:
This paper shows how instructors can use the problem-based learning method to introduce producer theory and market structure in intermediate microeconomics courses. The paper proposes a framework in which different decision problems are presented to students, who are asked to imagine that they are the managers of a firm and need to solve a problem in a particular business setting. In this setting, the instructor's role is to provide both guidance to facilitate student learning and content knowledge on a just-in-time basis.
Abstract:
One of the criticisms leveled at the model of the dispersed city found all over the world is its unarticulated, random, and undifferentiated nature. To test this idea in the Barcelona Metropolitan Region, we estimated the impact of the urban spatial structure (CBD, subcenters and transportation infrastructures) on population density and commuting distance. The results are unfavorable to the hypothesis of the increasing destructuring of cities, given that the explanatory capacity of both functions improves over time, both with and without other control variables.
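A typical specification in this literature extends the classical negative-exponential density function to subcenters and transport infrastructure (illustrative; the abstract does not give the exact functional form estimated):

    \[ \ln D(x) = \alpha - \gamma_0\,d_{CBD}(x) - \sum_k \gamma_k\,d_k(x) + \varepsilon \]

where \(D(x)\) is population density at location \(x\), \(d_{CBD}\) the distance to the CBD, and \(d_k\) distances to subcenters or transport infrastructure; an analogous regression can be written with commuting distance as the dependent variable.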
Abstract:
The classical wave-of-advance model of the Neolithic transition (i.e., the shift from hunter-gatherer to agricultural economies) is based on Fisher's reaction-diffusion equation. Here we present an extension of Einstein's approach to Fickian diffusion, incorporating reaction terms. On this basis we show that second-order terms in the reaction-diffusion equation, which have been neglected up to now, are not in fact negligible and can lead to important corrections. The resulting time-delayed model agrees quite well with observations.
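Concretely, Fisher's equation and its classical front speed are the standard results below; keeping the second-order (time-delay) terms, with \(\tau\) the generation delay, gives a hyperbolic equation whose front speed is reduced (the delayed form shown is one common way of writing this extension and is an assumption about the paper's exact notation):

    \[ \frac{\partial p}{\partial t} = D\,\nabla^2 p + a\,p\,(1-p), \qquad v_{\mathrm{Fisher}} = 2\sqrt{aD} \]

    \[ \frac{\tau}{2}\frac{\partial^2 p}{\partial t^2} + \frac{\partial p}{\partial t} = D\,\nabla^2 p + F + \frac{\tau}{2}\frac{\partial F}{\partial t} \quad\Rightarrow\quad v = \frac{2\sqrt{aD}}{1 + a\tau/2} \]

so a nonzero delay \(\tau\) slows the predicted front relative to Fisher's speed.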
Abstract:
In a number of programs for gene structure prediction in higher eukaryotic genomic sequences, exon prediction is decoupled from gene assembly: a large pool of candidate exons is predicted and scored from features located in the query DNA sequence, and candidate genes are assembled from such a pool as sequences of nonoverlapping frame-compatible exons. Genes are scored as a function of the scores of the assembled exons, and the highest scoring candidate gene is assumed to be the most likely gene encoded by the query DNA sequence. Considering additive gene scoring functions, currently available algorithms to determine such a highest scoring candidate gene run in time proportional to the square of the number of predicted exons. Here, we present an algorithm whose running time grows only linearly with the size of the set of predicted exons. Existing algorithms rely on the fact that, while scanning the set of predicted exons, the highest scoring gene ending in a given exon can be obtained by appending the exon to the highest scoring among the highest scoring genes ending at each compatible preceding exon. The algorithm here relies on the simple fact that such a highest scoring gene can be stored and updated. This requires scanning the set of predicted exons simultaneously by increasing acceptor and donor position. On the other hand, the algorithm described here does not assume an underlying gene structure model. Indeed, the definition of valid gene structures is externally defined in the so-called Gene Model. The Gene Model simply specifies which gene features are allowed immediately upstream of which other gene features in valid gene structures. This allows for great flexibility in formulating the gene identification problem. In particular, it allows for multiple-gene two-strand predictions and for considering gene features other than coding exons (such as promoter elements) in valid gene structures.
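A minimal sketch of the linear-scan idea (a deliberate simplification: one strand, additive scores, and compatibility reduced to non-overlap; the actual algorithm also enforces frame compatibility and an external Gene Model):

    def best_gene_score(exons):
        """exons: list of (acceptor, donor, score) tuples.
        Returns the score of the best assembly of compatible
        (here: non-overlapping) exons under additive scoring."""
        n = len(exons)
        by_acc = sorted(range(n), key=lambda i: exons[i][0])  # increasing acceptor
        by_don = sorted(range(n), key=lambda i: exons[i][1])  # increasing donor
        best_ending = [0.0] * n  # best gene score ending in exon i
        best_prefix = 0.0        # best gene among closed exons: stored and updated
        j, answer = 0, 0.0
        for i in by_acc:
            acc, don, score = exons[i]
            # close every exon whose donor precedes this acceptor; its best
            # gene becomes a candidate prefix exactly once, so the scan is linear
            while j < n and exons[by_don[j]][1] < acc:
                best_prefix = max(best_prefix, best_ending[by_don[j]])
                j += 1
            best_ending[i] = best_prefix + score
            answer = max(answer, best_ending[i])
        return answer

    # e.g. best_gene_score([(0, 10, 3.0), (12, 20, 2.0), (5, 15, 4.5)]) -> 5.0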
Abstract:
Recent decisions by the Spanish national competition authority (TDC) mandate payment systems to include only two costs when setting their domestic multilateral interchange fees (MIF): a fixed processing cost and a variable cost for the risk of fraud. This artificial lowering of MIFs will not lower consumer prices, because of uncompetitive retailing; it will, however, lead to higher cardholder fees and, likely, new prices for point-of-sale terminals, delaying the development of the immature Spanish card market. Also, to the extent that increased cardholder fees do not offset the fall in MIF revenue, the task of issuing new cards will be underpaid relative to the task of acquiring new merchants, causing an imbalance between the two sides of the networks. Moreover, the pricing scheme arising from the decisions will cause unbundling and underprovision of those services whose costs are excluded. Indeed, the payment guarantee and the free funding period will tend to be removed from the package of services currently provided, to be provided by third parties, provided by issuers for a separate fee, or not provided at all, especially to small and medium-sized merchants. Transaction services will also suffer the consequences of the TDC precluding their pricing in variable terms.