818 results for Efficient lighting


Relevance: 20.00%

Publisher:

Abstract:

Forest fires are a serious ecological, social and economic threat to humans and nature. Predicting their behaviour by simulation still delivers unreliable results and remains a challenging task. Recent approaches try to calibrate the input variables, which are often tainted with imprecision, using optimisation techniques such as Genetic Algorithms (GAs). To converge faster towards fitter solutions, the GA is guided with knowledge obtained from historical or synthetic fires. We developed a robust and efficient knowledge storage and retrieval method. Nearest neighbour search is applied to find the fire configuration in the knowledge base that is most similar to the current configuration. To this end, a distance measure was devised and implemented in several ways. Experiments compare the different implementations with respect to occupied storage and retrieval time, with highly satisfactory results.
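As a rough sketch of the retrieval step described above, a weighted nearest neighbour search over stored fire configurations might look like the following. The parameter names, weights, and the weighted-Euclidean form of the distance are illustrative assumptions, not the measure elaborated in the paper:

```python
import math

def distance(a, b, weights):
    """Weighted Euclidean distance between two fire configurations.

    `a` and `b` are dicts of calibration parameters; the parameter
    names and weights below are hypothetical examples.
    """
    return math.sqrt(sum(w * (a[k] - b[k]) ** 2 for k, w in weights.items()))

def nearest_neighbour(query, knowledge_base, weights):
    """Linear-scan nearest neighbour search over the knowledge base."""
    return min(knowledge_base, key=lambda cfg: distance(query, cfg, weights))

# Hypothetical knowledge base of past fire configurations.
weights = {"wind_speed": 1.0, "humidity": 0.5, "slope": 2.0}
kb = [
    {"wind_speed": 10.0, "humidity": 0.30, "slope": 5.0},
    {"wind_speed": 25.0, "humidity": 0.10, "slope": 15.0},
]
query = {"wind_speed": 12.0, "humidity": 0.28, "slope": 6.0}
best = nearest_neighbour(query, kb, weights)  # closest stored configuration
```

A production implementation would replace the linear scan with a spatial index (e.g. a k-d tree) once the knowledge base grows large.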

Relevance: 20.00%

Publisher:

Abstract:

The implementation of public programs to support business R&D projects requires the establishment of a selection process. This selection process faces various difficulties, which include the measurement of the impact of the R&D projects as well as selection process optimization among projects with multiple, and sometimes incomparable, performance indicators. To this end, public agencies generally use the peer review method, which, while presenting some advantages, also demonstrates significant drawbacks. Private firms, on the other hand, tend toward more quantitative methods, such as Data Envelopment Analysis (DEA), in their pursuit of R&D investment optimization. In this paper, the performance of a public agency peer review method of project selection is compared with an alternative DEA method.
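For intuition about the DEA method mentioned above: in the special case of a single input and a single output, the CCR efficiency score reduces to each unit's output/input ratio divided by the best observed ratio. A minimal sketch with hypothetical project data (the general multi-input, multi-output model instead solves one linear program per unit):

```python
def dea_efficiency(inputs, outputs):
    """CCR efficiency scores for the single-input, single-output case:
    each unit's output/input ratio divided by the best observed ratio.
    A score of 1.0 marks a unit on the efficient frontier."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Four hypothetical R&D projects: cost as input, impact score as output.
costs = [100.0, 80.0, 120.0, 50.0]
impacts = [40.0, 40.0, 36.0, 20.0]
scores = dea_efficiency(costs, impacts)  # second project defines the frontier
```

The appeal for project selection is that the score needs no externally imposed weighting of the performance indicators, which is exactly the difficulty the peer review method struggles with.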

Relevance: 20.00%

Publisher:

Abstract:

During the last two decades there has been an increase in the use of dynamic tariffs for billing household electricity consumption. This has called into question the suitability of traditional pricing schemes, such as two-part tariffs, since they contribute to creating marked peak and off-peak demand. The aim of this paper is to assess whether two-part tariffs are an efficient pricing scheme, using Spanish household electricity microdata. An ordered probit model with instrumental variables on the determinants of power-level choice, together with non-parametric spline regressions on the electricity price distribution, allows us to distinguish between the tariff structure choice and the simultaneous demand decisions. We conclude that electricity consumption and dwellings' and individuals' characteristics are key determinants of the fixed charge paid by Spanish households. Finally, the results point to the inefficiency of the two-part tariff, as consumers who use more electricity pay a lower price than the others.

Relevance: 20.00%

Publisher:

Abstract:

This paper revisits the problem of adverse selection in the insurance market of Rothschild and Stiglitz [28]. We propose a simple extension of the game-theoretic structure in Hellwig [14] under which Nash-type strategic interaction between the informed customers and the uninformed firms always results in a particular separating equilibrium. The equilibrium allocation is unique and Pareto-efficient in the interim sense, subject to incentive compatibility and individual rationality. In fact, it is the unique neutral optimum in the sense of Myerson [22].

Relevance: 20.00%

Publisher:

Abstract:

We consider a frictional two-sided matching market in which one side uses public cheap talk announcements so as to attract the other side. We show that if the first-price auction is adopted as the trading protocol, then cheap talk can be perfectly informative, and the resulting market outcome is efficient, constrained only by search frictions. We also show that the performance of an alternative trading protocol in the cheap-talk environment depends on the level of price dispersion generated by the protocol: If a trading protocol compresses (spreads) the distribution of prices relative to the first-price auction, then an efficient fully revealing equilibrium always (never) exists. Our results identify the settings in which cheap talk can serve as an efficient competitive instrument, in the sense that the central insights from the literature on competing auctions and competitive search continue to hold unaltered even without ex ante price commitment.

Relevance: 20.00%

Publisher:

Abstract:

Innate immune responses play a central role in neuroprotection and neurotoxicity during inflammatory processes that are triggered by pathogen-associated molecular pattern-exhibiting agents such as bacterial lipopolysaccharide (LPS) and that are modulated by inflammatory cytokines such as interferon γ (IFNγ). Recent findings describing the unexpected complexity of mammalian genomes and transcriptomes have stimulated further identification of novel transcripts involved in specific physiological and pathological processes, such as the neural innate immune response that alters the expression of many genes. We developed a system for efficient subtractive cloning that employs both sense and antisense cRNA drivers, and coupled it with in-house cDNA microarray analysis. This system enabled effective direct cloning of differentially expressed transcripts, from a small amount (0.5 µg) of total RNA. We applied this system to isolation of genes activated by LPS and IFNγ in primary-cultured cortical cells that were derived from newborn mice, to investigate the mechanisms involved in neuroprotection and neurotoxicity in maternal/perinatal infections that cause various brain injuries including periventricular leukomalacia. A number of genes involved in the immune and inflammatory response were identified, showing that neonatal neuronal/glial cells are highly responsive to LPS and IFNγ. Subsequent RNA blot analysis revealed that the identified genes were activated by LPS and IFNγ in a cooperative or distinctive manner, thereby supporting the notion that these bacterial and cellular inflammatory mediators can affect the brain through direct but complicated pathways. We also identified several novel clones of apparently non-coding RNAs that potentially harbor various regulatory functions. 
Characterization of the presently identified genes will give insights into mechanisms and interventions not only for perinatal infection-induced brain damage, but also for many other innate immunity-related brain disorders.

Relevance: 20.00%

Publisher:

Abstract:

Astrocytes are now considered as key players in brain information processing because of their newly discovered roles in synapse formation and plasticity, energy metabolism and blood flow regulation. However, our understanding of astrocyte function is still fragmented compared to other brain cell types. A better appreciation of the biology of astrocytes requires the development of tools to generate animal models in which astrocyte-specific proteins and pathways can be manipulated. In addition, it is becoming increasingly evident that astrocytes are also important players in many neurological disorders. Targeted modulation of protein expression in astrocytes would be critical for the development of new therapeutic strategies. Gene transfer is valuable to target a subpopulation of cells and explore their function in experimental models. In particular, viral-mediated gene transfer provides a rapid, highly flexible and cost-effective, in vivo paradigm to study the impact of genes of interest during central nervous system development or in adult animals. We will review the different strategies that led to the recent development of efficient viral vectors that can be successfully used to selectively transduce astrocytes in the mammalian brain.

Relevance: 20.00%

Publisher:

Abstract:

We compared the influence of bug density on the capacity of Triatoma infestans and Panstrongylus megistus to obtain blood meals from non-anaesthetized mice. Regression analysis of the increase in body weight (mg) versus density (no. of bugs/mouse) showed no correlation in experiments with anaesthetized mice (AM). In experiments with non-anaesthetized mice (NAM), the weight increase was inversely proportional to density. The regression slope for blood meal size on density was less steep for T. infestans than for P. megistus (-1.9 and -3.0, respectively). The average weight increase of P. megistus nymphs in experiments with AM was higher than that of T. infestans nymphs; however, in experiments with NAM these results were inverted. Mortality of P. megistus was significantly higher than that of T. infestans with NAM, whereas in experiments with AM very low mortality was observed. Considering the mortality and the slope of the regression line for NAM, T. infestans is more efficient than P. megistus at obtaining blood meals at similar densities, possibly because it causes less irritation to the mice. The better exploitation of the blood source by T. infestans, compared with P. megistus at similar densities, favours the maintenance of a better nutritional status at higher densities. This could explain epidemiological findings in which T. infestans not only succeeds in establishing larger colonies but also dislodges P. megistus from human dwellings when it is introduced into areas where the latter species prevails.

Relevance: 20.00%

Publisher:

Abstract:

This paper discusses the use of probabilistic or randomized algorithms for solving combinatorial optimization problems. Our approach employs non-uniform probability distributions to add a biased random behavior to classical heuristics, so that a large set of alternative good solutions can be quickly obtained in a natural way and without complex configuration processes. This procedure is especially useful in problems where properties such as non-smoothness or non-convexity lead to a highly irregular solution space, for which traditional optimization methods, both exact and approximate, may fail to reach their full potential. The results obtained are promising enough to suggest that randomizing classical heuristics is a powerful method that can be successfully applied in a variety of cases.
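One common way to inject a non-uniform bias into a classical greedy heuristic is to pick from the best-first-sorted candidate list using a quasi-geometric distribution, so that better candidates are strongly favoured while every candidate keeps a positive selection probability. A minimal sketch; the distribution and the parameter value are illustrative, not necessarily those used in the paper:

```python
import math
import random

def biased_choice(sorted_candidates, beta, rng):
    """Pick from a best-first-sorted list using a quasi-geometric
    distribution: index k is chosen with probability ~ beta * (1 - beta)**k,
    capped at the end of the list (0 < beta < 1)."""
    u = rng.random()  # u in [0, 1), so 1 - u is in (0, 1]
    k = int(math.log(1.0 - u) / math.log(1.0 - beta))
    return sorted_candidates[min(k, len(sorted_candidates) - 1)]

rng = random.Random(42)
candidates = ["best", "second", "third", "worst"]
draws = [biased_choice(candidates, 0.7, rng) for _ in range(10_000)]
# With beta = 0.7 the top candidate is selected roughly 70% of the time,
# yet repeated constructions still explore alternative good solutions.
```

Wrapping each greedy selection of a constructive heuristic in `biased_choice` yields many distinct, high-quality solutions across restarts, without any problem-specific tuning beyond `beta`.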

Relevance: 20.00%

Publisher:

Abstract:

The UHPLC strategy, which combines sub-2 µm porous particles and ultra-high pressure (>1000 bar), was investigated against very high resolution criteria in both isocratic and gradient modes, with mobile phase temperatures between 30 and 90 °C. In isocratic mode, the experimental conditions to reach maximal efficiency were determined using the kinetic plot representation for ΔP_max = 1000 bar. It was first confirmed that the molecular weight (MW) of the compounds is a critical parameter that should be considered in the construction of such curves. With a MW around 1000 g mol⁻¹, efficiencies as high as 300,000 plates could theoretically be attained using UHPLC at 30 °C. By limiting the column length to 450 mm, the maximal plate count was around 100,000. In gradient mode, the longest column does not provide the maximal peak capacity for a given analysis time in UHPLC. This was attributed to the fact that peak capacity is related not only to the plate number but also to the column dead time. Therefore, a compromise should be found: a 150 mm column should preferably be selected for gradient lengths up to 60 min at 30 °C, while columns coupled in series (3 × 150 mm) were attractive only for t_grad > 250 min. Compared to 30 °C, peak capacities increased by about 20-30% for a constant gradient length at 90 °C, and the gradient time decreased two-fold for an identical peak capacity.

Relevance: 20.00%

Publisher:

Abstract:

Grid is a hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational resources. Grid enables access to the resources but does not guarantee any quality of service, nor does it provide performance isolation: one user's job can influence the performance of another user's jobs. A further problem is that Grid users typically belong to the scientific community and their jobs require specific, customized software environments, which are difficult to provide given Grid's dispersed and heterogeneous nature. Cloud computing offers full customization and control, but lacks the simple job-submission procedure available in Grid. Grid computing can provide customized resources and performance to the user through virtualization. A virtual machine can join the Grid as an execution node, or it can itself be submitted as a job with user jobs inside. The first method gives quality of service and performance isolation; the second additionally provides customization and administration. In this thesis, a solution is proposed to enable virtual machine reuse, providing performance isolation together with customization and administration: the same virtual machine can be used for several jobs. In the proposed solution, customized virtual machines join the Grid pool on user request, and two scenarios are described to achieve this goal. In the first scenario, users submit their customized virtual machines as jobs; a virtual machine joins the Grid pool when it is powered on. In the second scenario, user-customized virtual machines are preconfigured on the execution system and join the Grid pool on user request. Condor and VMware Server are used to deploy and test the scenarios. Condor supports virtual machine jobs, and scenario 1 is deployed using the Condor VM universe. The second scenario uses the VMware VIX API to script powering the remote virtual machines on and off. The experimental results show that, because scenario 2 does not need to transfer the virtual machine image, the image becomes live in the pool much faster. In scenario 1 the virtual machine runs as a Condor job, so it is easy to administer. The only pitfall of scenario 1 is the network traffic.
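The scenario 2 power-management step can be sketched with VMware's `vmrun` utility, a command-line front-end to the VIX API (the thesis scripted the VIX API directly; the paths below and the use of `vmrun` rather than the raw API are illustrative assumptions):

```python
import subprocess

VMRUN = "vmrun"  # VMware's command-line wrapper around the VIX API

def vmrun_cmd(action, vmx_path, gui=False):
    """Build the vmrun command line to power a preconfigured execution-node
    VM on or off in place, without transferring its image (scenario 2)."""
    if action not in ("start", "stop"):
        raise ValueError("action must be 'start' or 'stop'")
    cmd = [VMRUN, action, vmx_path]
    if action == "start":
        cmd.append("gui" if gui else "nogui")  # headless by default
    return cmd

def power_on(vmx_path):
    """Power the VM on so it can join the Condor pool. Executing the
    command requires a VMware host, so the call is shown commented out."""
    cmd = vmrun_cmd("start", vmx_path)
    # subprocess.run(cmd, check=True)
    return cmd
```

Because the image already resides on the execution system, the only work at request time is this power-on call, which is why scenario 2 brings a node into the pool faster than transferring a full image as a Condor VM-universe job.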

Relevance: 20.00%

Publisher:

Abstract:

The role of the Saccharomyces cerevisiae peroxisomal acyl-coenzyme A (acyl-CoA) thioesterase (Pte1p) in fatty acid beta-oxidation was studied by analyzing the in vitro kinetic activity of the purified protein, as well as by measuring the carbon flux through the beta-oxidation cycle in vivo, using the synthesis of peroxisomal polyhydroxyalkanoate (PHA) from the polymerization of 3-hydroxyacyl-CoAs as a marker. The amount of PHA synthesized from the degradation of 10-cis-heptadecenoic, tridecanoic, undecanoic, or nonanoic acids was equivalent or slightly reduced in the pte1Delta strain compared with wild type. In contrast, a strong reduction in PHA synthesized from heptanoic acid and 8-methyl-nonanoic acid was observed for the pte1Delta strain compared with wild type. The poor catabolism of 8-methyl-nonanoic acid via beta-oxidation in pte1Delta negatively impacted the degradation of 10-cis-heptadecenoic acid and reduced the ability of the cells to grow efficiently in medium containing such fatty acids. An increase in the proportion of the short-chain 3-hydroxyacid monomers was observed in PHA synthesized in pte1Delta cells grown on a variety of fatty acids, indicating a reduction in the metabolism of short-chain acyl-CoAs in these cells. A purified histidine-tagged Pte1p showed high activity toward short- and medium-chain-length acyl-CoAs, including butyryl-CoA, decanoyl-CoA and 8-methyl-nonanoyl-CoA. The kinetic parameters measured for the purified Pte1p are consistent with the involvement of this enzyme in the efficient metabolism of short straight- and branched-chain fatty acyl-CoAs by the beta-oxidation cycle.