918 results for Optimal Sampling Time


Relevance:

30.00%

Abstract:

We address a real-world scheduling problem concerning the repair process of aircraft engines at TAP - Maintenance & Engineering (TAP-ME). TAP-ME is the maintenance, repair and overhaul organization of TAP Portugal, Portugal’s leading airline, and employs about 4000 people to provide maintenance and engineering services for aircraft, engines and components. TAP-ME aims to optimize its maintenance services, focusing on reducing engine repair turnaround time.

Relevance:

30.00%

Abstract:

Recent changes of paradigm in power systems have opened opportunities for the active participation of new players. Small and medium players gain new opportunities by participating in demand response programs. This paper explores optimal resource scheduling at two distinct levels. First, the network operator, facing large wind power variations, uses real-time pricing to induce consumers to accommodate those variations. Then, at the consumer level, each load is managed according to the consumer’s preferences. The two-level resource schedule has been implemented on a real-time simulation platform that uses hardware to control the consumer’s loads. The illustrative example addresses a large wind power deficit and focuses on a consumer with 18 loads.
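
The consumer-level part of this two-level scheme lends itself to a compact illustration. The sketch below is not the authors' implementation: it assumes a purely price-based rule in which each flexible load is shed whenever the real-time price (raised by the operator during a wind power deficit) exceeds the value the consumer assigns to that load; all load names and numbers are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Load:
    name: str
    power_kw: float        # consumption if the load is kept on
    value_per_kwh: float   # consumer's willingness to pay for this load
    curtailable: bool      # some loads must stay on regardless of price

def schedule_loads(loads, real_time_price):
    """Keep a load on only if the consumer values it at least as much as the
    current real-time price; non-curtailable loads are always kept."""
    kept, shed = [], []
    for load in loads:
        if not load.curtailable or load.value_per_kwh >= real_time_price:
            kept.append(load)
        else:
            shed.append(load)
    return kept, shed

# Hypothetical subset of a consumer's 18 loads during a wind power deficit,
# when the operator raises the real-time price to 0.30 per kWh.
loads = [
    Load("fridge", 0.2, 1.50, curtailable=False),
    Load("water_heater", 2.0, 0.25, curtailable=True),
    Load("ev_charger", 3.5, 0.18, curtailable=True),
]
kept, shed = schedule_loads(loads, real_time_price=0.30)
print("kept:", [l.name for l in kept], "| shed:", [l.name for l in shed])
```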

Relevance:

30.00%

Abstract:

In this paper, we propose the Distributed using Optimal Priority Assignment (DOPA) heuristic, which finds a feasible partitioning and priority assignment for distributed applications based on the linear transactional model. DOPA partitions the tasks and messages in the distributed system and uses the Optimal Priority Assignment (OPA) algorithm, also known as Audsley’s algorithm, to find the priorities for that partition. The experimental results show how the use of the OPA algorithm increases, on average, the number of schedulable tasks and messages in a distributed system compared with Deadline Monotonic (DM), which is usually favoured in other works. We then extend these results to the assignment of parallel/distributed applications and present a second heuristic named Parallel-DOPA (P-DOPA). In that case, we show how the partitioning process can be simplified by using the Distributed Stretch Transformation (DST), a parallel transaction transformation algorithm introduced in [1].
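
Since the abstract hinges on Audsley's OPA algorithm, a generic sketch of that procedure may help. The schedulability test is deliberately left as a caller-supplied function (the paper applies it to the linear transactional model; the toy utilisation test at the end is only an illustrative assumption).

```python
def audsley_opa(tasks, is_schedulable):
    """Optimal Priority Assignment (Audsley's algorithm), sketched generically.

    tasks: list of task identifiers.
    is_schedulable(task, higher_priority_tasks): returns True if `task` meets
        its timing constraints when every task in `higher_priority_tasks` is
        assigned a higher priority than `task`.

    Returns the tasks ordered from lowest to highest priority, or None if the
    test admits no feasible assignment.
    """
    unassigned = list(tasks)
    assignment = []                      # filled lowest priority first
    while unassigned:
        for i, task in enumerate(unassigned):
            others = unassigned[:i] + unassigned[i + 1:]
            if is_schedulable(task, others):
                assignment.append(task)  # give `task` the lowest free priority
                del unassigned[i]
                break
        else:
            return None                  # no task can take this priority level
    return assignment

# Toy usage with a hypothetical test: a task is accepted at a priority level
# if its utilisation plus that of all higher-priority tasks stays below 1.
toy_tasks = [("t1", 0.2), ("t2", 0.3), ("t3", 0.4)]
toy_test = lambda task, hp: task[1] + sum(u for _, u in hp) <= 1.0
print(audsley_opa(toy_tasks, toy_test))  # lowest-to-highest priority order
```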

Relevance:

30.00%

Abstract:

Atmospheric temperatures characterize the Earth as a slow-dynamics spatiotemporal system, revealing long memory and complex behavior. Temperature time series from 54 worldwide geographic locations are taken as representative of the Earth’s weather dynamics. These data are interpreted as the time evolution of a set of state-space variables describing a complex system, and are analyzed by means of multidimensional scaling (MDS) and the fractional state space portrait (fSSP). A centennial perspective covering the period from 1910 to 2012 allows MDS to identify similarities among different locations on Earth. Multivariate mutual information is proposed to determine the “optimal” order of the time derivative for the fSSP representation. The fSSP emerges as a valuable alternative for visualizing the system dynamics.
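
A minimal sketch of the MDS step described above, not the authors' code: it stands in synthetic series for the 54 temperature records, uses correlation distance as one possible dissimilarity measure, and embeds the locations in two dimensions with scikit-learn.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)

# Synthetic stand-in for 54 locations observed over roughly a century.
n_locations, n_samples = 54, 1200
series = rng.normal(size=(n_locations, n_samples)).cumsum(axis=1)

# Pairwise dissimilarity between locations (correlation distance, one choice).
dissimilarity = 1.0 - np.corrcoef(series)

# Metric MDS on the precomputed dissimilarity matrix: one 2-D point per
# location, so locations with similar dynamics end up close together.
embedding = MDS(n_components=2, dissimilarity="precomputed",
                random_state=0).fit_transform(dissimilarity)
print(embedding.shape)  # (54, 2)
```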

Relevance:

30.00%

Abstract:

An infinite-horizon, discrete-time model with multiple size classes, using a transition matrix, is built to assess optimal harvesting schedules for Non-Industrial Private Forest (NIPF) owners. Three model specifications accounting for forest income, the financial return on an asset, and amenity valuations are considered. Numerical simulations suggest uneven-aged forest management in which a rational forest owner adapts his or her forest policy by influencing the regeneration of trees or by adjusting consumption dynamics, depending on subjective time preference and the dynamics of the market return rate on the financial asset. Moreover, he or she does not attach significant value to the non-market benefits captured by amenity valuations relative to forest income.
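
The size-class dynamics behind such a model can be sketched as x_{t+1} = G(x_t - h_t) + r, where x_t counts trees per size class, h_t is the harvest, G the transition (growth and survival) matrix and r natural regeneration. Below is a minimal numerical sketch under entirely hypothetical parameter values, not the paper's calibration.

```python
import numpy as np

# Three size classes: small, medium, large. Column j gives where a tree in
# class j ends up after one period (stay or grow); values are assumptions.
G = np.array([
    [0.70, 0.00, 0.00],   # remain small
    [0.25, 0.80, 0.00],   # small -> medium, remain medium
    [0.00, 0.15, 0.95],   # medium -> large, remain large
])
regeneration = np.array([40.0, 0.0, 0.0])   # new small trees each period

def step(stock, harvest):
    """One period of the size-structured dynamics after harvesting."""
    remaining = np.maximum(stock - harvest, 0.0)
    return G @ remaining + regeneration

stock = np.array([200.0, 120.0, 60.0])   # initial trees per size class
harvest = np.array([0.0, 0.0, 15.0])     # uneven-aged policy: cut large trees only
for _ in range(5):
    stock = step(stock, harvest)
print(np.round(stock, 1))                # stock after five periods
```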

Relevance:

30.00%

Abstract:

Cell-based approaches in tissue engineering (TE) have been barely explored for the treatment of tendon and ligament (T/L) tissues, requiring the establishment of a widely available cell source with tenogenic potential. As T/L cells are scarce, stem cells may provide a good alternative. Understanding how resident cells behave in vitro might be useful for recapitulating the tenogenic potential of stem cells for tendon TE applications. Therefore, we propose to isolate and characterize human T/L-derived cells (hTDCs and hLDCs) and compare their regenerative potential with stem cells from adipose tissue (hASCs) and amniotic fluid (hAFSCs) (1). T/L cells were isolated using different procedures and stem cells were isolated as described elsewhere (1). Moreover, T/L cells were stimulated into the three mesenchymal lineages using standard differentiation media. Cells were characterized for typical stem cell markers as well as T/L-related markers, namely tenascin-C, collagen I and III, decorin and scleraxis, using complementary techniques such as real-time RT-PCR, immunocytochemistry and flow cytometry. No differences were observed between T/L cells in gene expression or protein deposition. T/L cells were mostly positive for stemness markers (CD73/CD90/CD105) and have the potential to differentiate towards osteogenesis, chondrogenesis and adipogenesis, as demonstrated by positive staining with Alizarin Red, Safranin O, Toluidine Blue and Oil Red. hASCs and hAFSCs exhibit positive expression of all tenogenic markers, although at lower levels than hTDCs and hLDCs. Nevertheless, stem cell availability is a key factor in TE strategies, although optimization is still required to direct their tenogenic phenotype.

Relevance:

30.00%

Abstract:

Environmental pollution by heavy metals such as chromium and by organic compounds such as phenols is a serious worldwide problem because of their toxicity and their adverse effects on humans, flora and fauna, both through their accumulation in the food chain and through their persistence in the environment. In a preliminary study carried out by our laboratory, high levels of these pollutants were detected in sediments and effluents from industrial zones in the south of Córdoba Province, which raises the need to remove them. Among the available technologies, bioremediation, which is based on the use of biological systems such as microorganisms to detoxify and degrade contaminants, appears to be a more effective and less costly alternative than conventional techniques. However, the application of this technology depends to a large extent on the particular and specific characteristics of the area to be remediated. Consequently, the sampling area will first be characterized, and native microorganisms of the region that are tolerant to chromium and phenol will be isolated and identified from soil, water and sediment samples, since they could constitute a suitable biotechnological tool, better adapted to the site to be treated. The bioremediation of chromium and phenol by these microorganisms will then be studied, analysing their capacity to biotransform, bioaccumulate or biosorb these pollutants, and the optimal conditions for the treatment will be determined. The possible physiological, biochemical and molecular mechanisms involved in the remediation will also be analysed, a crucial step in the design of an adequate and efficient strategy. Finally, this technology will be applied at reactor scale, as a first approximation to larger-scale treatment. In this way we expect to reduce the levels of these pollutants and thus minimize the environmental impact they produce on soils and aquifers. In the future, the use of the selected microorganisms, individually or as consortia, for the treatment of industrial effluents before their release into the environment, or their use in bioaugmentation, would constitute possible applications. The main scientific and technological impacts of the project will be: (a) the generation of a new biological technology for the decontamination of chromium and phenol, seeking solutions to an environmental problem that affects our region but is also common to most countries; (b) the training of new human resources in the area; and (c) collaborative work with other research groups that stand out in the field of environmental biotechnology.

Relevance:

30.00%

Abstract:

This work describes the spatial-temporal variation in the relative abundance and size of Limnoperna fortunei (Dunker, 1857) collected in the São Gonçalo Channel with a bottom trawl of 0.5 cm mesh, at depths between 3 and 6 m. The estimated mean relative abundance (catch per unit effort, CPUE) ranged from 2,425.3 individuals per drag (ind./drag) in the spring to 21,715.0 ind./drag in the fall, with an average of 9,515.3 ind./drag throughout the year. The estimated mean density of L. fortunei for the deep region of the São Gonçalo Channel ranged from 1.2 to 10.3 ind./m², and a maximum density of 84.9 ind./m² was recorded in the fall of 2008. Sampling with a bottom trawl enabled the capture of L. fortunei on the soft muddy bottom of the channel, at sizes ranging from 0.4 to 3.2 cm. This shows that the adult population of L. fortunei on the bottom of the São Gonçalo Channel is composed mostly of small individuals (<1.4 cm), which represent up to 74% of the individuals collected.
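
The abundance figures quoted above are simple ratios of catch to sampling effort; the sketch below only illustrates that arithmetic with invented catch numbers and a hypothetical swept area, not the survey's actual data.

```python
# Hypothetical catches (individuals per drag) recorded in one season.
catches = [1800, 2400, 3100, 2600]
swept_area_m2 = 1000.0  # assumed area swept by one drag (not from the study)

cpue = sum(catches) / len(catches)   # mean catch per unit effort (ind./drag)
density = cpue / swept_area_m2       # swept-area density estimate (ind./m2)

print(f"CPUE: {cpue:.1f} ind./drag, density: {density:.2f} ind./m2")
```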

Relevance:

30.00%

Abstract:

To determine the influence of feeding, lighting and time of day on the copulating behavior of Panstrongylus megistus, 480 insect pairs were divided into four groups of 120 each and tested in the following situations: without food deprivation (F.D.), with five days of F.D., with ten days of F.D., and with 20 days of F.D. The tests were performed from 9:00 a.m. to 12:00 p.m. and from 7:00 p.m. to 10:00 p.m., in the light (700-1400 lux) and in the dark (1.4-2.8 lux), and behavior was recorded by the time-sampling technique. Mating speed (MS) and duration of copulation (DC) were also calculated for each situation. The maximum frequency of copulation was observed after five days of F.D., at night, in the dark (n = 16), and the minimum was observed for recently fed pairs, at night, in the light (n = 4). Males approached females more often than females approached males. MS was lowest in pairs with twenty days of F.D., at night, in the light (X = 23.0 ± 16.0 minutes), and highest in recently fed pairs, during the day, in the light (X = 2.9 ± 2.5 minutes). DC was shortest in recently fed insects, during the day, in the dark (X = 23.5 ± 6.7 minutes), and longest in recently fed animals, at night, in the dark (X = 38.3 ± 6.9 minutes).

Relevance:

30.00%

Abstract:

The objective of this paper is to correct and improve the results obtained by Van der Ploeg (1984a, 1984b) and used in the theoretical literature on feedback stochastic optimal control that is sensitive to constant exogenous risk aversion (see Jacobson, 1973, Karp, 1987 and Whittle, 1981, 1989, 1990, among others) or set in the classic context of risk-neutral decision-makers (see Chow, 1973, 1976a, 1976b, 1977, 1978, 1981, 1993). More realistic and attractive, this new approach is placed in the context of a time-varying endogenous risk aversion that is under the control of the decision-maker. It has strong qualitative implications for the agent's optimal policy over the entire planning horizon.
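
For orientation, the constant exogenous risk-aversion benchmark in the literature cited (Jacobson, Whittle) is commonly formalised through an exponential-of-quadratic criterion. The notation below is ours, a hedged sketch of that benchmark rather than the paper's time-varying endogenous formulation.

```latex
% Quadratic cost over the planning horizon and its risk-sensitive evaluation,
% with a constant risk-aversion parameter \theta (sign convention varies by author).
\[
  C \;=\; \sum_{t=0}^{T} \left( x_t^{\top} Q\, x_t + u_t^{\top} R\, u_t \right),
  \qquad
  J_{\theta} \;=\; -\frac{2}{\theta}\,
     \log \mathbb{E}\!\left[ \exp\!\left( -\tfrac{\theta}{2}\, C \right) \right].
\]
% As \theta \to 0, J_{\theta} tends to the risk-neutral expected cost \mathbb{E}[C];
% the paper instead lets the risk-aversion parameter vary over time and be chosen
% endogenously by the decision-maker.
```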

Relevance:

30.00%

Abstract:

Introduction/objectives: Multipatient use of a single-patient CBSD occurred in an outpatient clinic during 4 to 16 months before its notification. We looked for transmission of blood-borne pathogens among exposed patients.

Methods: Exposed patients underwent serology testing for HBV, HCV and HIV. Patients with isolated anti-HBc received one dose of hepatitis B vaccine to look for a memory immune response. Possible transmissions were investigated by mapping visits and sequencing of the viral genome if needed.

Results: Of 280 exposed patients, 9 had died without suspicion of blood-borne infection, 3 could not be tested, and 5 declined investigations. Among the 263 (93%) tested patients, 218 (83%) had negative results. We confirmed a known history of HCV infection in 6 patients (1 coinfected by HIV), and also identified resolved HBV infection in 37 patients, of whom 18 were already known. 2 patients were found to have a previously unknown HCV infection. According to the time elapsed from the closest previous visit of an HCV-infected potential source patient, we could rule out nosocomial transmission in one case (14 weeks) but not in the other (1 day). In the latter, however, transmission was deemed very unlikely by 2 reference centers based on the sequences of the E1 and HVR1 regions of the virus.

Conclusion: We did not identify any transmission of blood-borne pathogens in 263 patients exposed to a single-patient CBSD, despite the presence of potential source cases. Change of needle and disinfection of the device between patients may have contributed to this outcome. Although we cannot exclude transmission of HBV, previous acquisition in endemic countries is a more likely explanation in this multi-national population.

Relevance:

30.00%

Abstract:

This paper studies the aggregate and distributional implications of Markov-perfect tax-spending policy in a neoclassical growth model with capitalists and workers. Focusing on the long run, our main findings are: (i) it is optimal for a benevolent government, which cares equally about its citizens, to tax capital heavily and to subsidise labour; (ii) a Pareto-improving means of reducing inefficiently high capital taxation under discretion is for the government to place greater weight on the welfare of capitalists; (iii) capitalists' and workers' preferences regarding the optimal amount of "capitalist bias" are not aligned, implying a conflict of interests.

Relevance:

30.00%

Abstract:

We estimate a New Keynesian DSGE model for the Euro area under alternative descriptions of monetary policy (discretion, commitment or a simple rule), after allowing for Markov switching in policy-maker preferences and shock volatilities. This reveals that there have been several changes in Euro area policy making, with a strengthening of the anti-inflation stance in the early years of the ERM, which was then lost around the time of German reunification and only recovered following the turmoil in the ERM in 1992. The ECB does not appear to have been as conservative as aggregate Euro-area policy was under Bundesbank leadership, and its response to the financial crisis has been muted. The estimates also suggest that the most appropriate description of policy is that of discretion, with no evidence of commitment in the Euro area. As a result, although both ‘good luck’ and ‘good policy’ played a role in the moderation of inflation and output volatility in the Euro area, the welfare gains would have been substantially higher had policy makers been able to commit. We consider a range of delegation schemes as devices to improve upon the discretionary outcome, and conclude that price-level targeting would have achieved welfare levels close to those attained under commitment, even after accounting for the existence of the Zero Lower Bound on nominal interest rates.

Relevance:

30.00%

Abstract:

In this paper we make three contributions to the literature on optimal Competition Law enforcement procedures. The first (which is of general interest beyond competition policy) is to clarify the concept of “legal uncertainty”, relating it to ideas in the literature on Law and Economics, but formalising the concept through various information structures which specify the probability that each firm attaches, at the time it takes an action, to the possibility of its being deemed anti-competitive were it to be investigated by a Competition Authority. We show that the existence of Type I and Type II decision errors by competition authorities is neither necessary nor sufficient for the existence of legal uncertainty, and that information structures with legal uncertainty can generate higher welfare than information structures with legal certainty, a result echoing a similar finding obtained in a completely different context and under different assumptions in the earlier Law and Economics literature (Kaplow and Shavell, 1992). Our second contribution is to revisit and significantly generalise the analysis in our previous paper, Katsoulacos and Ulph (2009), involving a welfare comparison of Per Se and Effects-Based legal standards. In that analysis we considered just a single information structure under an Effects-Based standard, and penalties were exogenously fixed. Here we allow for (a) different information structures under an Effects-Based standard and (b) endogenous penalties. We obtain two main results: (i) considering all information structures, a Per Se standard is never better than an Effects-Based standard; (ii) optimal penalties may be higher when there is legal uncertainty than when there is no legal uncertainty.
