44 results for Standard map


Relevance: 20.00%

Publisher:

Abstract:

State and regional policies, such as low carbon fuel standards (LCFSs), increasingly mandate that transportation fuels be examined according to their greenhouse gas (GHG) emissions. We investigate whether such policies benefit from determining fuel carbon intensities (FCIs) locally to account for variations in fuel production and to stimulate improvements in FCI. In this study, we examine the FCI of transportation fuels on a lifecycle basis within a specific state, Minnesota, and compare the results to FCIs based on national averages. Using data compiled from 18 refineries over an 11-year period, we find that ethanol production is highly variable, resulting in a 42% difference between carbon intensities. Historical data suggest that lower FCIs are possible through incremental improvements in refining efficiency and the use of biomass for process heat. Stochastic modeling of the corn ethanol FCI shows that gains in certainty due to knowledge of specific refinery inputs are overwhelmed by uncertainty in parameters external to the refinery, including the impacts of fertilization and land use change. The lifecycle assessment (LCA) results are incorporated into multiple policy scenarios to demonstrate the effect of policy configurations on the use of alternative fuels. These results provide a contrast between volumetric mandates and LCFSs. © 2011 Elsevier Ltd.
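The stochastic treatment described above can be illustrated with a minimal Monte Carlo sketch in Python. Every component name, mean and distribution below is an illustrative assumption rather than a value from the study; the sketch only shows how wide distributions on parameters outside the refinery (fertilizer-related N2O, land use change) can dominate the spread of the total FCI even when refinery-specific inputs are known precisely.

# Minimal Monte Carlo sketch of a corn ethanol fuel carbon intensity (FCI)
# estimate. All means and uncertainty ranges are illustrative placeholders,
# not values from the study.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # Monte Carlo draws

# Refinery-specific term (gCO2e/MJ): relatively well constrained once a
# specific refinery's energy inputs are reported.
process_energy = rng.normal(30.0, 1.5, N)

# Terms external to the refinery: fertilizer-related N2O and land use change,
# modeled here with deliberately wide distributions.
fertilizer_n2o = rng.lognormal(mean=np.log(12.0), sigma=0.4, size=N)
land_use_change = rng.triangular(left=0.0, mode=15.0, right=45.0, size=N)

fci = process_energy + fertilizer_n2o + land_use_change  # total lifecycle FCI

for name, part in [("refinery process energy", process_energy),
                   ("fertilizer N2O", fertilizer_n2o),
                   ("land use change", land_use_change),
                   ("total FCI", fci)]:
    lo, hi = np.percentile(part, [5, 95])
    print(f"{name:25s} mean {part.mean():5.1f}  90% interval [{lo:5.1f}, {hi:5.1f}]")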

Relevance: 20.00%

Publisher:

Abstract:

We discuss solvability issues of H−/H2/∞ optimal fault detection problems in the most general setting. A solution approach is presented which successively reduces the initial problem to simpler ones. The last computational step may involve the solution of a non-standard H−/H2/∞ optimization problem, for which we discuss possible solution approaches. Using an appropriate definition of the H− index, we provide a complete solution of this problem in the case of the H2-norm. Furthermore, we discuss the solvability issues in the case of the H∞-norm.
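For orientation, the quantities involved can be stated in a standard form, sketched in LaTeX below. This is an assumed, textbook-style formulation, not necessarily the exact problem statement of the paper: r = R_f f + R_d d denotes the residual driven by faults f and disturbances d, and Q parameterizes the residual generator.

% Standard formulation (an assumption; not necessarily the paper's exact statement).
% Residual model: r = R_f(s) f + R_d(s) d, with faults f and disturbances d.
\[
  \|R_f\|_{-} \;=\; \inf_{\omega}\, \underline{\sigma}\!\left(R_f(j\omega)\right)
  \qquad \text{(H$_{-}$ index: worst-case fault sensitivity)}
\]
\[
  \text{H}_{-}/\text{H}_{2/\infty}\ \text{problem:}\qquad
  \max_{Q\ \mathrm{stable}}\; \frac{\|R_f\|_{-}}{\|R_d\|_{2/\infty}}
\]
% The disturbance norm in the denominator is taken as either the H2- or the
% Hinfinity-norm, giving the two problem variants discussed in the abstract.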

Relevance: 20.00%

Publisher:

Abstract:

Introducing a "Cheaper, Faster, Better" product in today's highly competitive market is a challenging goal. To improve their performance in this area, organizations need to adopt methods such as process modelling, risk mitigation and lean principles. Following the evident success of value orientation in manufacturing, several industries and researchers have recently focused on transferring the concept to other phases of the Product Life Cycle (PLC), such as Product Development (PD). In PD, value maximization, the main objective of lean theory, has been of particular interest as an improvement concept that can enhance process flow logistics and support decision-making. This paper presents an ongoing study of the current understanding of value thinking in PD (VPD), with a focus on value dimensions and implementation benefits. The purpose of the study is to consider the current state of knowledge regarding value thinking in PD and to propose both a definition of value and a framework for analyzing value delivery. The framework, named the Value Cycle Map (VCM), is intended to facilitate understanding of value and its delivery mechanism in the context of the PLC. We suggest the VCM could serve as a foundation for future research on value modelling and measurement in PD.

Relevance: 20.00%

Publisher:

Abstract:

We discuss solvability issues of H−/H2/∞ optimal fault detection problems in the most general setting. A solution approach is presented which successively reduces the initial problem to simpler ones. The last computational step may involve the solution of a non-standard H−/H2/∞ optimization problem, for which we discuss possible solution approaches. Using an appropriate definition of the H− index, we provide a complete solution of this problem in the case of the H2-norm. Furthermore, we discuss the solvability issues in the case of the H∞-norm. © 2011 IEEE.

Relevance: 20.00%

Publisher:

Abstract:

Three-dimensional (3-D) spatial data of a transportation infrastructure contain useful information for civil engineering applications, including as-built documentation, on-site safety enhancements, and progress monitoring. Several techniques have been developed for acquiring 3-D point coordinates of infrastructure, such as laser scanning. Although the method yields accurate results, the high device costs and human effort required render the process infeasible for generic applications in the construction industry. A quick and reliable approach, which is based on the principles of stereo vision, is proposed for generating a depth map of an infrastructure. Initially, two images are captured by two similar stereo cameras at the scene of the infrastructure. A Harris feature detector is used to extract feature points from the first view, and an innovative adaptive window-matching technique is used to compute feature point correspondences in the second view. A robust algorithm computes the nonfeature point correspondences. Thus, the correspondences of all the points in the scene are obtained. After all correspondences have been obtained, the geometric principles of stereo vision are used to generate a dense depth map of the scene. The proposed algorithm has been tested on several data sets, and results illustrate its potential for stereo correspondence and depth map generation.
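A generic version of this pipeline can be sketched in Python with off-the-shelf OpenCV primitives. This is not the paper's adaptive window-matching algorithm; standard block matching stands in for it, and the file names, focal length and baseline are assumed purely for illustration.

# Generic stereo-to-depth sketch with OpenCV. It only illustrates the overall
# pipeline shape: feature detection on one view, dense disparity between the
# two views, and conversion of disparity to depth.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # assumed file names
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Harris corners on the first view (the feature points of interest).
corners = cv2.cornerHarris(np.float32(left), blockSize=2, ksize=3, k=0.04)
feature_mask = corners > 0.01 * corners.max()

# Dense disparity via simple block matching (a stand-in for the paper's
# adaptive window matching and non-feature-point propagation).
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # pixels

# Depth from disparity: Z = f * B / d, for rectified cameras with focal
# length f (pixels) and baseline B (metres) -- both assumed here.
f_px, baseline_m = 700.0, 0.12
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f_px * baseline_m / disparity[valid]

print("feature points:", int(feature_mask.sum()),
      "median depth (m):", float(np.median(depth[valid])))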


Relevance: 20.00%

Publisher:

Abstract:

Language models (LMs) are often constructed by building multiple individual component models that are combined using context independent interpolation weights. By tuning these weights, using either perplexity or discriminative approaches, it is possible to adapt LMs to a particular task. This paper investigates the use of context dependent weighting in both interpolation and test-time adaptation of language models. Depending on the previous word contexts, a discrete history weighting function is used to adjust the contribution from each component model. As this dramatically increases the number of parameters to estimate, robust weight estimation schemes are required. Several approaches are described in this paper. The first approach is based on MAP estimation, where interpolation weights of lower order contexts are used as smoothing priors. The second approach uses training data to ensure robust estimation of LM interpolation weights; this can also serve as a smoothing prior for MAP adaptation. A normalized perplexity metric is proposed to handle the bias of the standard perplexity criterion towards corpus size. A range of schemes to combine weight information obtained from training data and test data hypotheses are also proposed to improve robustness during context dependent LM adaptation. In addition, a minimum Bayes' risk (MBR) based discriminative training scheme is proposed. An efficient weighted finite state transducer (WFST) decoding algorithm for context dependent interpolation is also presented. The proposed technique was evaluated using a state-of-the-art Mandarin Chinese broadcast speech transcription task. Character error rate (CER) reductions of up to 7.3% relative were obtained, as well as consistent perplexity improvements. © 2012 Elsevier Ltd. All rights reserved.
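The core idea of context dependent weighting can be illustrated with a small Python sketch in which interpolation weights depend on the previous word and are MAP-smoothed towards the context independent weights. The component models, context classes, counts and prior strength below are toy assumptions and do not reproduce the paper's estimation schemes.

# Minimal sketch of context-dependent linear interpolation of two component
# language models. The component probabilities, contexts and counts are toy
# values; the MAP smoothing only illustrates backing off context-dependent
# weights towards context-independent ones.
from collections import defaultdict

# P_m(w | h) for two component LMs (toy values, independent of h here).
def p_comp1(w, h): return {"rain": 0.20, "sun": 0.05}.get(w, 0.01)
def p_comp2(w, h): return {"rain": 0.05, "sun": 0.25}.get(w, 0.01)

# Context-independent weights (e.g. tuned on held-out perplexity).
global_w = [0.6, 0.4]

# Component "responsibilities" accumulated per previous-word context on
# adaptation data: counts[h][m]. Toy numbers here.
counts = defaultdict(lambda: [0.0, 0.0],
                     {"heavy": [8.0, 2.0], "bright": [1.0, 9.0]})

def context_weights(h, tau=5.0):
    """MAP estimate: context counts smoothed towards the global weights,
    with prior strength tau; unseen contexts fall back to global weights."""
    c = counts[h]
    total = sum(c) + tau
    return [(c[m] + tau * global_w[m]) / total for m in range(2)]

def p_interp(w, h):
    lam = context_weights(h)
    return lam[0] * p_comp1(w, h) + lam[1] * p_comp2(w, h)

for h in ("heavy", "bright", "unseen"):
    print(h, context_weights(h), round(p_interp("rain", h), 4))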