917 results for Dynamic Emission Modeling
Abstract:
This work identifies the limitations of n-way data analysis techniques on multidimensional stream data, such as Internet chat room communications data, and establishes a link between data collection and the performance of these techniques. Its contributions are twofold. First, it extends data analysis to multiple dimensions by constructing n-way data arrays known as high-order tensors. Chat room tensors are generated by a simulator which collects and models actual communication data. The accuracy of the model is determined by the Kolmogorov-Smirnov goodness-of-fit test, which compares the simulation data with the observed (real) data. Second, a detailed computational comparison is performed to test several data analysis techniques, including SVD [1] and the multi-way techniques Tucker1, Tucker3 [2], and PARAFAC [3].
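The Kolmogorov-Smirnov acceptance check described above can be sketched in a few lines. Everything below is illustrative, not the paper's actual setup: the data are synthetic inter-arrival times, and 1.358 is the standard large-sample critical-value coefficient for alpha = 0.05.

```python
import numpy as np

def ks_statistic(x, y):
    """Two-sample Kolmogorov-Smirnov statistic: the largest absolute
    gap between the two empirical CDFs."""
    x, y = np.sort(x), np.sort(y)
    grid = np.concatenate([x, y])
    cdf_x = np.searchsorted(x, grid, side="right") / len(x)
    cdf_y = np.searchsorted(y, grid, side="right") / len(y)
    return np.max(np.abs(cdf_x - cdf_y))

def model_fits(observed, simulated, coeff=1.358):
    """Accept the simulator when D is below the alpha = 0.05 critical value."""
    n, m = len(observed), len(simulated)
    d = ks_statistic(observed, simulated)
    return d < coeff * np.sqrt((n + m) / (n * m)), d

rng = np.random.default_rng(0)
observed = rng.exponential(scale=2.0, size=500)  # e.g. real inter-arrival times
bad_sim = rng.exponential(scale=10.0, size=500)  # simulator with the wrong rate
ok_bad, d_bad = model_fits(observed, bad_sim)    # should be rejected
```

A simulator whose output passes this test at the chosen significance level would be treated as an adequate source of chat room tensors.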
Abstract:
This study explored the dynamic performance of an innovative Hybrid Composite Floor Plate System (HCFPS), composed of a Polyurethane (PU) core, outer layers of Glass-fibre Reinforced Cement (GRC), and steel laminates at tensile regions, using experimental testing and Finite Element (FE) modelling. Experimental testing included heel impact and walking tests for 3200 mm span HCFPS panels. FE models of the HCFPS were developed using the FE program ABAQUS and validated with experimental results. HCFPS is a light-weight, high-frequency floor system with an excellent damping ratio of 5% (bare floor) due to the central PU core. Parametric studies were conducted using the validated FE models to investigate the dynamic response of the HCFPS and to identify characteristics that influence acceleration response under human-induced vibration in service. This vibration performance was compared with recommended acceptable perceptibility limits. The findings of this study show that HCFPS can be used in residential and office buildings as a light-weight floor system which does not exceed the perceptible thresholds for human-induced vibrations.
Abstract:
Construction practitioners often experience unexpected results of their scheduling-related decisions. This is mainly due to a lack of understanding of the dynamic nature of construction systems. Despite its significant importance, however, little attention has been given to this issue and few empirical studies have been undertaken. This paper, therefore, analyzes the effect of aggressive scheduling, overtime, resource adding, and schedule slippage on construction performance, focusing on workers' reactions to those scheduling decisions. Survey data from 102 construction practitioners on 38 construction sites are used for the analysis. The results indicate that efforts to increase the work rate by working overtime, resource adding, and aggressive scheduling can be offset by losses in productivity and quality. Based on the research findings, practical guidelines are then discussed to help site managers effectively deal with the dynamics of scheduling and improve construction performance.
Abstract:
Several approaches have been introduced in the literature for active noise control (ANC) systems. Since the filtered-x least-mean-square (FxLMS) algorithm appears to be the best choice as a controller filter, researchers tend to improve the performance of ANC systems by enhancing and modifying this algorithm. As a first novelty, this paper proposes a new version of the FxLMS algorithm. In many ANC applications, an online secondary path modeling method using white noise as a training signal is required to ensure convergence of the system. As a second novelty, this paper proposes a new approach for online secondary path modeling on the basis of a new variable-step-size (VSS) LMS algorithm in feedforward ANC systems. The proposed algorithm is designed so that the noise injection is stopped at the optimum point, when the modeling accuracy is sufficient. In this approach, a sudden change in the secondary path during operation makes the algorithm reactivate injection of the white noise to re-adjust the secondary path estimate. Comparative simulation results shown in this paper indicate the effectiveness of the proposed approach in reducing both narrow-band and broad-band noise. In addition, the proposed ANC system is robust against sudden changes of the secondary path model.
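For readers unfamiliar with the baseline being modified, a minimal single-channel FxLMS loop can be sketched as below. This is the textbook algorithm, not the paper's enhanced version; the tonal signals, tap count, step size, and toy secondary path are all invented for illustration, and the secondary path is assumed already known rather than modeled online.

```python
import numpy as np

def fxlms(x, d, s_hat, n_taps=16, mu=0.005):
    """Minimal single-channel filtered-x LMS (FxLMS) loop.
    x     : reference noise picked up upstream
    d     : disturbance arriving at the error microphone
    s_hat : secondary-path impulse response (assumed known here)
    Returns the residual error at the error microphone."""
    w = np.zeros(n_taps)                  # adaptive controller taps
    x_buf = np.zeros(n_taps)              # reference history
    xf_buf = np.zeros(n_taps)             # filtered-reference history
    y_buf = np.zeros(len(s_hat))          # anti-noise history
    xf = np.convolve(x, s_hat)[:len(x)]   # reference filtered through s_hat
    e = np.zeros(len(x))
    for i in range(len(x)):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = x[i]
        y_buf = np.roll(y_buf, 1)
        y_buf[0] = w @ x_buf              # anti-noise sample
        e[i] = d[i] - s_hat @ y_buf       # residual after cancellation
        xf_buf = np.roll(xf_buf, 1)
        xf_buf[0] = xf[i]
        w += mu * e[i] * xf_buf           # FxLMS weight update
    return e

t = np.arange(4000)
x = np.sin(2 * np.pi * 0.05 * t)              # tonal reference
d = 0.9 * np.sin(2 * np.pi * 0.05 * t - 0.3)  # tone at the error mic
s = np.array([0.0, 0.8, 0.1])                 # toy secondary path
e = fxlms(x, d, s)
```

The paper's contributions sit on top of this loop: a modified controller update and an online, VSS-LMS-based estimate of `s_hat` that replaces the known-path assumption used in the sketch.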
Abstract:
The objective of this research was to investigate the effects of driving conditions and suspension parameters on the dynamic load-sharing of longitudinal-connected air suspensions of a tri-axle semi-trailer. A novel nonlinear model of a multi-axle semi-trailer with longitudinal-connected air suspension was formulated based on fluid mechanics and thermodynamics and was validated against test results. The effects of driving conditions and suspension parameters on dynamic load-sharing and road-friendliness of the semi-trailer were analyzed. Simulation results indicate that the road-friendliness metric DLC (dynamic load coefficient) is not always in accordance with the load-sharing metric DLSC (dynamic load-sharing coefficient). The effect of employing larger air lines and connectors on the DLSC optimization ratio gives varying results as road roughness increases and as driving speed increases. When the vehicle load decreases, or the static pressure increases, the DLSC optimization ratio declines monotonically. The results also indicate that if the air line diameter is always assumed to be larger than the connector diameter, the influence of air line diameter on load-sharing is more significant than that of the connector.
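The two metrics compared above can be computed straightforwardly from tyre-force time histories. The definitions below are the common ones from the road-friendliness literature (DLC as RMS dynamic force over static force; DLSC as RMS deviation of an axle's load share from the ideal equal share); the force signals are synthetic, not the paper's simulation output.

```python
import numpy as np

def dlc(force):
    """Dynamic load coefficient: RMS of the dynamic force fluctuation
    divided by the mean (static) force."""
    return np.std(force) / np.mean(force)

def dlsc(forces, axle=0):
    """Dynamic load-sharing coefficient (a common definition): RMS
    deviation of one axle's instantaneous share of the group load
    from the ideal equal share 1/n."""
    n = forces.shape[0]
    share = forces[axle] / forces.sum(axis=0)
    return np.sqrt(np.mean((share - 1.0 / n) ** 2))

t = np.linspace(0.0, 2 * np.pi, 1000, endpoint=False)
f1 = 20e3 * (1 + 0.10 * np.sin(t))         # tyre forces of a tri-axle group, N
f2 = 20e3 * (1 + 0.10 * np.sin(t + 2.0))
f3 = 20e3 * np.ones_like(t)
group = np.vstack([f1, f2, f3])
```

A perfectly constant force gives DLC = 0, and perfectly equal sharing gives DLSC = 0; the two can disagree in ranking suspensions, which is the discordance the abstract reports.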
Abstract:
Graphene promises many novel applications in nanoscale electronics and sustainable energy due to its distinctive electronic properties. Computational exploration of electronic functionality, and of how it varies with architecture and doping, presently runs ahead of experimental synthesis, yet it provides insights into the types of structures that may prove profitable for targeted experimental synthesis and characterization. We present here a summary of our understanding of the important aspects of dimension, band gap, defect, and interfacial engineering of graphene based on state-of-the-art ab initio approaches. Some of the most recent experimental achievements relevant to future theoretical exploration are also covered.
Abstract:
This paper considers the problem of reconstructing the motion of a 3D articulated tree from 2D point correspondences subject to some temporal prior. Hitherto, smooth motion has been encouraged using a trajectory basis, yielding a hard combinatorial problem with time complexity growing exponentially in the number of frames. Branch and bound strategies have previously attempted to curb this complexity whilst maintaining global optimality. However, they provide no guarantee of being more efficient than exhaustive search. Inspired by recent work which reconstructs general trajectories using compact high-pass filters, we develop a dynamic programming approach which scales linearly in the number of frames, leveraging the intrinsically local nature of filter interactions. Extension to affine projection enables reconstruction without estimating cameras.
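The linear-in-frames property comes from dynamic programming over per-frame candidates with costs that couple only nearby frames. The sketch below is a generic Viterbi-style illustration of that idea with invented unary and pairwise costs; the paper's actual costs come from compact high-pass filter interactions and articulation constraints, which are more specific than this.

```python
import numpy as np

def dp_path(unary, pairwise):
    """Choose one candidate per frame minimising unary cost plus a
    pairwise cost between consecutive frames.
    unary    : (T, K) per-frame candidate costs
    pairwise : (K, K) transition costs between adjacent frames
    Runs in O(T * K^2): linear in the number of frames T, unlike the
    K**T cost of exhaustive search."""
    T, K = unary.shape
    cost = unary[0].astype(float)
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        total = cost[:, None] + pairwise      # cost of every transition
        back[t] = np.argmin(total, axis=0)    # best predecessor per state
        cost = total[back[t], np.arange(K)] + unary[t]
    path = np.empty(T, dtype=int)
    path[-1] = int(np.argmin(cost))
    for t in range(T - 1, 0, -1):             # backtrack the optimum
        path[t - 1] = back[t, path[t]]
    return path, float(cost.min())

unary = np.array([[0.0, 10.0], [10.0, 0.0], [0.0, 10.0]])
pairwise = np.array([[0.0, 1.0], [1.0, 0.0]])
path, best = dp_path(unary, pairwise)
```

On this toy instance the optimum pays two unit transition costs to follow the cheap unary states, giving the path [0, 1, 0] with total cost 2.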
Abstract:
The main aim of this paper is to describe an adaptive re-planning algorithm based on an RRT and Game Theory to produce an efficient, collision-free, obstacle-adaptive Mission Path Planner for Search and Rescue (SAR) missions. This will provide UAV autopilots and flight computers with the capability to autonomously avoid static obstacles and No Fly Zones (NFZs) through dynamic adaptive path re-planning. The methods and algorithms produce optimal collision-free paths and can be integrated into a decision aid tool and UAV autopilots.
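For orientation, a bare-bones 2-D RRT of the kind such planners build on can be sketched as follows. This is a generic illustration, not the paper's algorithm: the workspace, step size, and circular no-fly zone are invented, there is no game-theoretic layer, and only node endpoints are collision-checked.

```python
import numpy as np

def rrt(start, goal, obstacles, bounds=(0.0, 10.0), step=0.5,
        goal_tol=0.5, max_iters=5000, seed=0):
    """Grow a tree from start by steering toward random samples,
    rejecting nodes inside a no-fly zone given as circles (cx, cy, r).
    Endpoint check only; a real planner checks the whole segment."""
    rng = np.random.default_rng(seed)
    goal = np.asarray(goal, float)
    nodes = [np.asarray(start, float)]
    parents = [-1]

    def clear(p):
        return all(np.hypot(p[0] - cx, p[1] - cy) > r for cx, cy, r in obstacles)

    for _ in range(max_iters):
        sample = rng.uniform(bounds[0], bounds[1], size=2)
        near = min(range(len(nodes)),
                   key=lambda i: np.linalg.norm(nodes[i] - sample))
        diff = sample - nodes[near]
        dist = np.linalg.norm(diff)
        if dist == 0.0:
            continue
        new = nodes[near] + diff * min(step, dist) / dist  # steer by one step
        if not clear(new):
            continue
        nodes.append(new)
        parents.append(near)
        if np.linalg.norm(new - goal) < goal_tol:
            path, i = [], len(nodes) - 1
            while i != -1:                 # walk back to the root
                path.append(nodes[i])
                i = parents[i]
            return path[::-1]
    return None

path = rrt((1.0, 1.0), (9.0, 9.0), [(5.0, 5.0, 1.5)])
```

The adaptive re-planner described in the abstract would re-run this kind of search online whenever a new obstacle or NFZ invalidates the current path.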
Abstract:
In March 2008, the Australian Government announced its intention to introduce a national Emissions Trading Scheme (ETS), now expected to start in 2015. This impending development provides an ideal setting to investigate the impact an ETS in Australia will have on the market valuation of Australian Securities Exchange (ASX) firms. This is the first empirical study of the pricing effects of the ETS in Australia. Primarily, we hypothesize that firm value will be negatively related to a firm's carbon intensity profile. That is, there will be a greater impact on firm value for high carbon emitters in the period (2007) prior to the introduction of the ETS, whether for reasons relating to the existence of unbooked liabilities associated with future compliance and/or abatement costs, or for reasons relating to reduced future earnings. Using a sample of 58 Australian listed firms (constrained by the current availability of emissions data), which comprises larger, more profitable and less risky listed Australian firms, we first undertake an event study focusing on five distinct information events argued to impact the probability of the proposed ETS being enacted. Here, we find direct evidence that the capital market is indeed pricing the proposed ETS. Second, using a modified version of the Ohlson (1995) valuation model, we undertake a valuation analysis designed not only to complement the event study results, but more importantly to provide insights into the capital market's assessment of the magnitude of the economic impact of the proposed ETS as reflected in market capitalization. Here, our results show that the market assigns the most carbon-intensive sample firms a market value decrement relative to other sample firms of between 7% and 10% of market capitalization. Further, based on the carbon emission profiles of the sample firms, we infer a 'future carbon permit price' of between AUD$17 per tonne and AUD$26 per tonne of carbon dioxide emitted.
This estimate is considerably more precise than industry reports, which put the carbon price anywhere between AUD$15 and AUD$74 per tonne.
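The logic of backing a permit price out of an observed value decrement can be illustrated with a back-of-envelope calculation. Every number below is hypothetical, and capitalising the permit cost as a perpetuity at a flat discount rate is a simplifying assumption of this sketch, not the paper's valuation model.

```python
# Back-of-envelope implied carbon permit price (all inputs hypothetical):
market_cap = 10e9               # firm market capitalisation, AUD
annual_emissions = 4e6          # tonnes of CO2 emitted per year
decrement = 0.08 * market_cap   # 8% value decrement attributed to the ETS
discount_rate = 0.10            # rate used to capitalise a perpetual cost

# If decrement = annual_emissions * price / discount_rate, then:
implied_price = decrement * discount_rate / annual_emissions  # AUD per tonne
```

With these invented inputs the implied price is AUD$20 per tonne; the paper's AUD$17-26 range comes from applying its actual valuation model to the sample firms' observed decrements and emission profiles.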
Abstract:
Two different morphologies of nanotextured molybdenum oxide were deposited by thermal evaporation. By measuring their field emission (FE) properties, an enhancement factor was extracted. Subsequently, these films were coated with a thin layer of Pt to form Schottky contacts. The current-voltage (I-V) characteristics showed low-magnitude reverse breakdown voltages, which we attributed to localized electric field enhancement. An enhancement factor was obtained from the I-V curves. We show that the enhancement factor extracted from the I-V curves is in good agreement with the enhancement factor extracted from the FE measurements.
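Field emission enhancement factors are conventionally extracted from the slope of a Fowler-Nordheim plot, ln(I/V^2) versus 1/V, whose slope equals -B*phi^(3/2)*d/beta. The sketch below synthesises exactly linear FN data with a known beta and recovers it from the fitted slope; the work function, gap, and beta values are illustrative assumptions, not measurements from the paper.

```python
import numpy as np

PHI = 5.3          # assumed work function, eV (illustrative)
B = 6.83e9         # Fowler-Nordheim constant, V * eV^(-3/2) * m^(-1)
d = 100e-6         # assumed anode-cathode gap, m
beta_true = 500.0  # enhancement factor used to synthesise the data

V = np.linspace(500.0, 2000.0, 50)
# FN relation: ln(I/V^2) = ln(a) - B * PHI**1.5 * d / (beta * V)
ln_I_over_V2 = -2.0 - B * PHI**1.5 * d / (beta_true * V)

# slope of the FN plot gives back the enhancement factor
slope, _ = np.polyfit(1.0 / V, ln_I_over_V2, 1)
beta_est = -B * PHI**1.5 * d / slope
```

The paper's cross-check amounts to performing this extraction on the FE data and an analogous one on the reverse-breakdown I-V data, and confirming the two beta values agree.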
Abstract:
It is widely recognised that defining trade-offs between greenhouse gas emissions using 'emission equivalence' based on global warming potentials (GWPs) referenced to carbon dioxide produces anomalous results when applied to methane. The short atmospheric lifetime of methane, compared to the timescales of CO2 uptake, leads to the greenhouse warming depending strongly on the temporal pattern of emission substitution. We argue that a more appropriate way to consider the relationship between the warming effects of methane and carbon dioxide is to define a 'mixed metric' that compares ongoing methane emissions (or reductions) to one-off emissions (or reductions) of carbon dioxide. Quantifying this approach, we propose that a one-off sequestration of 1 t of carbon would offset an ongoing methane emission in the range 0.90-1.05 kg CH4 per year. We present an example of how our approach would apply to rangeland cattle production, and consider the broader context of mitigation of climate change, noting that the reverse trade-off would raise significant challenges in managing the risk of non-compliance. Our analysis is consistent with other approaches to addressing the criticisms of GWP-based emission equivalence, but provides a simpler and more robust approach while still achieving close equivalence of climate mitigation outcomes over decadal to multi-century timescales.
Abstract:
The success of contemporary organizations depends on their ability to make appropriate decisions. Making appropriate decisions is inevitably bound to the availability and provision of relevant information. Information systems should be able to provide information in an efficient way. Thus, within information systems development a detailed analysis of information supply and information demands has to prevail. Based on Szyperski's information set and subset model, we give an epistemological foundation of information modeling in general and show why conceptual modeling in particular is capable of specifying effective and efficient information systems. Furthermore, we derive conceptual modeling requirements based on our findings. A short example illustrates the usefulness of a conceptual data modeling technique for the specification of information systems.
Abstract:
With the growing importance of IS for organizations and the continuous stream of new IT developments, the IS function in organizations becomes more critical but is also increasingly challenged. For example, how should the IS function deal with the consumerization of IT, and what is the added value of the IS function when it comes to SaaS and the Cloud? In this paper we argue that IS research is in need of a dynamic perspective on the IS function. The IS function has to become more focused on building and adapting IS capabilities in a changing environment. We argue that there has so far been an overreliance on the Resource-Based View for understanding the IS function and its capabilities, and introduce Dynamic Capabilities Theory as an additional theoretical perspective, which has received only limited attention in the IS literature. We present a first conceptualization of the dynamic IS function and discuss IS capabilities frameworks and individual IS capabilities from a dynamic perspective. These initial insights demonstrate the contribution of a dynamic perspective on the IS function itself.
Abstract:
In a business environment, making the right decisions is vital for the success of a company. Making the right decisions is inevitably bound to the availability and provision of relevant information. Information systems are supposed to provide this information in an efficient way. Thus, within information systems development a detailed analysis of information supply and information demands has to prevail. Based on Szyperski's information set and subset model, we give an epistemological foundation of information modeling in general and show why conceptual modeling in particular is capable of developing effective and efficient information systems. Furthermore, we derive conceptual modeling requirements based on our findings.
Abstract:
This paper proposes a new method for online secondary path modeling in feedback active noise control (ANC) systems. In practical cases, the secondary path is usually time varying. For these cases, online modeling of the secondary path is required to ensure convergence of the system. In the literature, secondary path estimation is usually performed offline, prior to online modeling, whereas the proposed system has no need for offline estimation. The proposed method consists of two parts: a noise controller based on the FxLMS algorithm, and a variable step size (VSS) LMS algorithm used to adapt the modeling filter to the secondary path. To achieve faster convergence and more accurate modeling, we stop the VSS-LMS algorithm at the optimum point. The computer simulation results shown in this paper indicate the effectiveness of the proposed method.
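The stop-at-the-optimum idea can be illustrated with a generic VSS-LMS identification loop: white noise excites an unknown path, the step size shrinks with the smoothed error power, and injection stops once the error power falls below a threshold. This is an illustrative sketch, not the paper's specific VSS rule; the path coefficients, smoothing constant, and thresholds are all invented.

```python
import numpy as np

def vss_lms_identify(s_true, n_taps=8, n_iter=4000,
                     mu_min=1e-4, mu_max=0.05, stop_tol=1e-6, seed=0):
    """Identify an unknown path with LMS driven by injected white noise.
    The step size shrinks with the smoothed error power p, and the
    injection stops once p falls below stop_tol (the 'optimum point')."""
    rng = np.random.default_rng(seed)
    s_hat = np.zeros(n_taps)
    x_buf = np.zeros(n_taps)
    p = 1.0                                  # smoothed error power
    for _ in range(n_iter):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = rng.standard_normal()     # injected white training noise
        e = s_true @ x_buf - s_hat @ x_buf   # modeling error
        p = 0.99 * p + 0.01 * e * e          # exponential power smoothing
        if p < stop_tol:                     # accuracy sufficient: stop injecting
            break
        mu = np.clip(0.5 * p, mu_min, mu_max)  # variable step size
        s_hat += mu * e * x_buf              # LMS update of the path model
    return s_hat, p

s_true = np.array([0.1, 0.5, -0.3, 0.2, 0.0, 0.05, -0.02, 0.01])
s_hat, p = vss_lms_identify(s_true)
```

In the full feedback ANC system, a detected rise in `p` (e.g. after a sudden secondary path change) would restart the injection loop to re-adapt `s_hat`.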