917 results for Model Construction and Estimation


Relevance: 100.00%

Abstract:

Established Monte Carlo user codes BEAMnrc and DOSXYZnrc permit the accurate and straightforward simulation of radiotherapy experiments and treatments delivered from multiple beam angles. However, when an electronic portal imaging detector (EPID) is included in these simulations, treatment delivery from non-zero beam angles becomes problematic. This study introduces CTCombine, a purpose-built code for rotating selected CT data volumes, converting CT numbers to mass densities, combining the results with model EPIDs and writing output in a form which can easily be read and used by the dose calculation code DOSXYZnrc...
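
As an illustration of the CT-number-to-mass-density conversion step mentioned above, the Python sketch below maps CT numbers to mass densities by piecewise-linear interpolation between calibration points, a common approach ahead of Monte Carlo dose calculation; the calibration values and function names are assumed placeholders, not those used by CTCombine.

    import numpy as np

    # Illustrative CT-number-to-density calibration points (assumed values,
    # not those used by CTCombine): (CT number in HU, mass density in g/cm^3).
    calibration_hu = np.array([-1000.0, 0.0, 1000.0, 3000.0])
    calibration_rho = np.array([0.001, 1.0, 1.6, 2.8])

    def ct_to_density(ct_volume):
        """Map an array of CT numbers to mass densities by piecewise-linear
        interpolation between the calibration points above."""
        return np.interp(ct_volume, calibration_hu, calibration_rho)

    # Example: a tiny dummy volume of CT numbers.
    volume = np.array([[[-1000, -700], [40, 300]]], dtype=float)
    print(ct_to_density(volume))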

Relevance: 100.00%

Abstract:

Conceptually, the management of safety at roadworks can be seen in a three-level framework. At the regulatory level, roadworks operate at the interface between the work environment, governed by workplace health and safety regulations, and the road environment, which is subject to road traffic regulations and practices. At the organizational level, national, state and local governments plan and purchase road construction and maintenance, which are then delivered in-house or tendered out to large construction companies, who often subcontract multiple smaller companies to supply services and labor. At the operational level, roadworks are difficult to isolate from the general public, hindering effective occupational health and safety controls. This study, from the State of Queensland, Australia, examines how well this tripartite framework functions. It includes reviews of organizational policy and procedure documents, interviews with 24 subject matter experts from various road construction and maintenance organizations, and on-site interviews with 66 road construction personnel. The study identified several factors influencing the translation of safety policies into practice, including the cost of safety measures in the context of competitive tendering, a lack of firm evidence of the effectiveness of safety measures, and pressure to minimize disruption to the travelling public.

Relevance: 100.00%

Abstract:

Stormwater pollution is linked to stream ecosystem degradation. Various types of modelling techniques are adopted to predict stormwater pollution. The accuracy of the predictions provided by these models depends on data quality, appropriate estimation of model parameters, and the validation undertaken. It is well understood that, unlike water quantity data, the water quality datasets available in urban areas span only relatively short time scales, which limits the applicability of the developed models in the engineering and ecological assessment of urban waterways. This paper presents the application of leave-one-out (LOO) and Monte Carlo cross validation (MCCV) procedures in a Monte Carlo framework for validation and for estimating the uncertainty associated with pollutant wash-off when models are developed using a limited dataset. It was found that applying MCCV is likely to result in a more realistic measure of the model coefficients than LOO. Most importantly, MCCV and LOO were found to be effective for model validation when dealing with a small sample size, which otherwise hinders detailed model validation and can undermine the effectiveness of stormwater quality management strategies.
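
A minimal sketch of the two validation procedures, using scikit-learn's LeaveOneOut and ShuffleSplit (Monte Carlo cross validation) on a synthetic stand-in for a small wash-off dataset; the data, model form and split settings are assumptions for illustration, not those of the study.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, ShuffleSplit, cross_val_score

    rng = np.random.default_rng(0)

    # Synthetic stand-in for a small wash-off dataset (assumed):
    # predictor = rainfall intensity, response = pollutant wash-off load.
    X = rng.uniform(5, 80, size=(20, 1))
    y = 0.4 * X[:, 0] + rng.normal(0, 3, size=20)

    model = LinearRegression()

    # Leave-one-out cross validation: n folds, each holding out one sample.
    loo_scores = cross_val_score(model, X, y, cv=LeaveOneOut(),
                                 scoring="neg_mean_squared_error")

    # Monte Carlo cross validation: repeated random train/test splits.
    mccv = ShuffleSplit(n_splits=200, test_size=0.3, random_state=0)
    mccv_scores = cross_val_score(model, X, y, cv=mccv,
                                  scoring="neg_mean_squared_error")

    print("LOO mean squared error:  ", -loo_scores.mean())
    print("MCCV mean squared error: ", -mccv_scores.mean())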

Relevance: 100.00%

Abstract:

This research falls in the area of enhancing the quality of tag-based item recommendation systems. It aims to achieve this by employing a multi-dimensional user profile approach and by analyzing the semantic aspects of tags. Tag-based recommender systems have two characteristics that need to be carefully studied in order to build a reliable system. Firstly, the multi-dimensional correlation, known as the tag assignment, should be appropriately modelled in order to create the user profiles [1]. Secondly, the semantics behind the tags should be handled properly, as the flexibility of their design can cause semantic problems such as synonymy and polysemy [2]. This research proposes to address these two challenges for building a tag-based item recommendation system by employing tensor modeling as the multi-dimensional user profile approach, and the topic model as the semantic analysis approach. The first objective is to optimize the tensor model reconstruction and to improve the model's performance in generating quality recommendations. A novel Tensor-based Recommendation using Probabilistic Ranking (TRPR) method [3] has been developed. Results show this method to be scalable to large datasets and to outperform the benchmark methods in terms of accuracy. A memory-efficient loop implements the n-mode block-striped (matrix) product for tensor reconstruction, producing an approximation of the initial tensor. The probabilistic ranking calculates the probability that users will select candidate items, using their tag preference lists based on the entries generated from the reconstructed tensor. The second objective is to analyse the tag semantics and utilize the outcome in building the tensor model. This research proposes to investigate the problem using a topic model approach, in order to preserve the nature of tags as a “social vocabulary” [4]. For the tag assignment data, topics can be generated from the occurrences of tags given for an item. However, only a limited number of tags is available to represent items as collections of topics, since an item might have been tagged with only a few tags. Consequently, the generated topics might not be able to represent the items appropriately. Furthermore, given that each tag can belong to any topic with varying probability scores, the occurrence of tags cannot simply be mapped to topics to build the tensor model. A standard weighting technique will not appropriately calculate the value of the tagging activity, since it defines the context of an item using a tag rather than a topic.
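
The TRPR implementation itself is not given in the abstract; the sketch below illustrates the general idea of reconstructing an approximation of a (user, item, tag) tensor via mode-n products, using a plain truncated HOSVD in NumPy. The toy data, ranks and function names are assumptions, and the block-striped, memory-efficient variant described above is not reproduced.

    import numpy as np

    def unfold(tensor, mode):
        """Mode-n unfolding: move `mode` to the front and flatten the rest."""
        return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

    def fold(matrix, mode, shape):
        """Inverse of unfold for a target tensor of the given shape."""
        full_shape = [shape[mode]] + [s for i, s in enumerate(shape) if i != mode]
        return np.moveaxis(matrix.reshape(full_shape), 0, mode)

    def mode_product(tensor, matrix, mode):
        """n-mode (tensor x matrix) product along `mode`."""
        new_shape = list(tensor.shape)
        new_shape[mode] = matrix.shape[0]
        return fold(matrix @ unfold(tensor, mode), mode, new_shape)

    def hosvd_approximation(tensor, ranks):
        """Truncated HOSVD: factor matrices from mode unfoldings, then rebuild."""
        factors = []
        for mode, rank in enumerate(ranks):
            u, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
            factors.append(u[:, :rank])
        core = tensor
        for mode, u in enumerate(factors):
            core = mode_product(core, u.T, mode)
        approx = core
        for mode, u in enumerate(factors):
            approx = mode_product(approx, u, mode)
        return approx

    # Toy (user x item x tag) tag-assignment tensor: 1 where a user applied a tag.
    rng = np.random.default_rng(1)
    tags = (rng.random((4, 5, 6)) > 0.8).astype(float)
    scores = hosvd_approximation(tags, ranks=(2, 3, 3))
    # Higher reconstructed entries suggest likelier (user, item, tag) triples.
    print(scores[0].round(2))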

Relevance: 100.00%

Abstract:

In this paper, new online adaptive hidden Markov model (HMM) state estimation schemes are developed, based on extended least squares (ELS) concepts and recursive prediction error (RPE) methods. The best of the new schemes exploit the idempotent nature of Markov chains and work with a least squares prediction error index, using a posteriori estimates, which is more suited to Markov models than the indices traditionally used in the identification of linear systems.
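
As background only, the sketch below implements a standard normalized HMM forward (state) filter with fixed, assumed parameters; the ELS/RPE adaptive layer described in the abstract is not reproduced here.

    import numpy as np

    # Assumed two-state HMM with Gaussian observations (illustrative values only).
    A = np.array([[0.95, 0.05],        # transition probabilities
                  [0.10, 0.90]])
    means = np.array([0.0, 2.0])       # observation mean per state
    sigma = 0.5                        # common observation noise std

    def observation_likelihoods(y):
        return np.exp(-0.5 * ((y - means) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    def hmm_filter(observations, prior=np.array([0.5, 0.5])):
        """Normalized forward recursion: returns filtered state probabilities."""
        estimates = []
        p = prior
        for y in observations:
            p = observation_likelihoods(y) * (A.T @ p)   # predict, then correct
            p = p / p.sum()                               # normalize
            estimates.append(p)
        return np.array(estimates)

    # Simulate a short observation sequence from the assumed model and filter it.
    rng = np.random.default_rng(0)
    states = [0]
    for _ in range(49):
        states.append(rng.choice(2, p=A[states[-1]]))
    obs = means[states] + rng.normal(0, sigma, size=50)
    print(hmm_filter(obs)[-5:].round(3))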

Relevance: 100.00%

Abstract:

Structural equation modeling (SEM) is a versatile multivariate statistical technique, and its applications have been increasing since its introduction in the 1980s. This paper provides a critical review of 84 articles that use SEM to address construction-related problems over the period 1998–2012, drawn from sources including, but not limited to, seven top construction research journals. A yearly publication trend analysis shows that SEM applications have been accelerating over time. However, there are inconsistencies among the various recorded applications and several recurring problems exist. The important issues that need to be considered in research design, model development and model evaluation are examined and discussed in detail with reference to current applications. A particularly important issue is construct validity. Relevant topics for efficient research design also include longitudinal versus cross-sectional studies, mediation and moderation effects, sample size issues and software selection. A guideline framework is provided to help future researchers in construction SEM applications.
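
A generic illustration of fitting a small structural equation model in Python, assuming the semopy package and its lavaan-style model syntax; the construct, variable names and synthetic data are hypothetical and unrelated to the reviewed articles.

    import numpy as np
    import pandas as pd
    import semopy  # assumed available; uses lavaan-style model descriptions

    # Hypothetical construction-management example: a latent "SafetyClimate"
    # construct measured by three survey items, predicting project performance.
    desc = """
    SafetyClimate =~ item1 + item2 + item3
    performance ~ SafetyClimate + experience
    """

    # Synthetic data standing in for survey responses (illustration only).
    rng = np.random.default_rng(0)
    n = 300
    climate = rng.normal(size=n)
    data = pd.DataFrame({
        "item1": climate + rng.normal(0, 0.5, n),
        "item2": climate + rng.normal(0, 0.5, n),
        "item3": climate + rng.normal(0, 0.5, n),
        "experience": rng.normal(size=n),
    })
    data["performance"] = 0.6 * climate + 0.3 * data["experience"] + rng.normal(0, 0.5, n)

    model = semopy.Model(desc)
    model.fit(data)
    print(model.inspect())           # parameter estimates
    print(semopy.calc_stats(model))  # fit indices such as CFI and RMSEA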

Relevance: 100.00%

Abstract:

Purpose: The purpose of this paper is to review, critique and develop a research agenda for the Elaboration Likelihood Model (ELM). The model was introduced by Petty and Cacioppo over three decades ago and has since been modified, revised and extended. Given modern communication contexts, it is appropriate to question the model’s validity and relevance. Design/methodology/approach: The authors develop a conceptual approach, based on a comprehensive and extensive review and critique of the ELM and its development since its inception. Findings: This paper focuses on major issues concerning the ELM, including the model’s assumptions and descriptive nature, continuum questions, multi-channel processing and mediating variables, before turning to the need to replicate the ELM and offering recommendations for its future development. Research limitations/implications: This paper offers a series of questions as research implications, including whether the ELM could or should be replicated or extended, whether argument quality needs greater conceptualization, how movement along the continuum and between the central and peripheral routes to persuasion can be explained, and whether new methodologies and technologies can help to better understand consumer thinking and behaviour. All of these relate to the current need to explore the relevance of the ELM in a more modern context. Practical implications: It is time to question the validity and relevance of the ELM. The diversity of online and offline media options and the variety of consumer choices raise significant issues. Originality/value: While the ELM continues to be widely cited and taught as one of the major cornerstones of persuasion, questions are raised concerning its relevance and validity in 21st-century communication contexts.

Relevance: 100.00%

Abstract:

The nature of the transport system contributes to public health outcomes in a range of ways. The clearest contribution to public health is in the area of traffic crashes, because of their direct impact on individual death and disability and their direct costs to the health system. Other papers in this conference address these issues. This paper outlines some collaborative research between the Centre for Accident Research and Road Safety - Queensland (CARRS-Q) at QUT and Chinese researchers in areas that have indirect health impacts.

Heavy vehicle dynamics: The integrity of the road surface influences crash risk, with ruts, pot-holes and other forms of road damage contributing to increased crash risk. The great majority of damage to the road surface from vehicles is caused by heavy trucks and buses rather than cars or smaller vehicles. In some cases this damage is due to deliberate overloading, but in other cases it is due to vehicle suspension characteristics that lead to occasional high loads on particular wheels. Together with a visiting researcher and his colleagues, we have used both Queensland and Chinese data to model vehicle suspension systems that reduce the level of load, and hence the level of road damage and the resulting crash risk (1-5).

Toll worker exposure to vehicle emissions: The increasing construction of highways in China has also involved the construction of a large number of toll roads. Tollbooth workers are potentially exposed to high levels of pollutants from vehicles; however, the extent of this exposure, and how it relates to exposure standards, is not well known. In work led by a visiting researcher, we modelled these levels of exposure for a tollbooth in China (6).

Noise pollution: The increasing presence of high-speed roads in China has contributed to an increase in noise levels. In this collaborative study we modelled noise levels associated with a freeway widening near a university campus, and measures to reduce the noise (7).

Along with these areas of research, there are many other areas of transport with health implications that are worthy of exploration. Traffic, noise and pollution contribute to a difficult environment for pedestrians, especially in an ageing society where there are health benefits to increasing physical activity. By building on collaborations such as those outlined, there is potential to contribute to improved public health by addressing transport issues such as vehicle factors and pollution, and by extending the research to other areas of travel activity.

1. Chen, Y., He, J., King, M., Chen, W. and Zhang, W. (2014). Stiffness-damping matching method of an ECAS system based on LQG control. Journal of Central South University, 21:439-446. DOI: 10.1007/s1177101419579
2. Chen, Y., He, J., King, M., Feng, Z. and Chang, W. (2013). Comparison of two suspension control strategies for multi-axle heavy truck. Journal of Central South University, 20(2):550-562.
3. Chen, Y., He, J., King, M., Chen, W. and Zhang, W. (2013). Effect of driving conditions and suspension parameters on dynamic load-sharing of longitudinal-connected air suspensions. Science China Technological Sciences, 56(3):666-676. DOI: 10.1007/s11431-012-5091-3
4. Chen, Y., He, J., King, M., Chen, W. and Zhang, W. (2013). Model development and dynamic load-sharing analysis of longitudinal-connected air suspensions. Strojniški Vestnik - Journal of Mechanical Engineering, 59(1):14-24.
5. Chen, Y., He, J., King, M., Liu, H. and Zhang, W. (2013). Dynamic load-sharing of longitudinal-connected air suspensions of a tri-axle semi-trailer. Proceedings of the Transportation Research Board Annual Conference, Washington DC, 13-17 January 2013, paper no. 13-1117.
6. He, J., Qi, Z., Hang, W., King, M. and Zhao, C. (2011). Numerical evaluation of pollutant dispersion at a toll plaza based on system dynamics and Computational Fluid Dynamics models. Transportation Research Part C, 19(2011):510-520.
7. Zhang, C., He, J., Wang, Z., Yin, R. and King, M. (2013). Assessment of traffic noise level before and after freeway widening using traffic microsimulation and a refined classic noise prediction method. Proceedings of the Transportation Research Board Annual Conference, Washington DC, 13-17 January 2013, paper no. 13-2016.
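
The cited suspension studies use detailed multi-axle models with LQG control; as a much simpler, hedged illustration of the modelling approach, the sketch below sets up a linear quarter-car active suspension model with assumed parameters and computes an LQR state-feedback gain with SciPy (the Kalman-filter half of LQG, and the road disturbance input, are omitted).

    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Assumed quarter-car parameters (illustrative, not from the cited studies).
    ms, mu = 400.0, 40.0        # sprung / unsprung mass (kg)
    ks, cs = 20000.0, 1500.0    # suspension stiffness (N/m), damping (N s/m)
    kt = 180000.0               # tyre stiffness (N/m)

    # States: [suspension deflection, body velocity, tyre deflection, wheel velocity]
    A = np.array([
        [0.0,      1.0,     0.0,      -1.0],
        [-ks/ms,  -cs/ms,   0.0,       cs/ms],
        [0.0,      0.0,     0.0,       1.0],
        [ks/mu,    cs/mu,  -kt/mu,    -cs/mu],
    ])
    B = np.array([[0.0], [1.0/ms], [0.0], [-1.0/mu]])   # active actuator force

    # Assumed weights on the four states and on control effort.
    Q = np.diag([1e4, 1e2, 1e5, 1e0])
    R = np.array([[1e-4]])

    # Continuous-time LQR gain: u = -K x.
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)
    print("LQR gain:", K.round(1))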

Relevance: 100.00%

Abstract:

The construction industry accounts for a significant portion of the material consumption of our industrialised societies. That material consumption comes at an environmental cost, and when buildings and infrastructure projects are demolished and discarded after their useful lifespan, that environmental cost remains largely unrecovered. The expected operational lifespan of modern buildings has become disturbingly short, as buildings are replaced for reasons of changing cultural expectations, style, serviceability, locational obsolescence and economic viability. The same buildings, however, are not always physically or structurally obsolete; the materials and components within them are very often still completely serviceable. While there is some activity in the area of recycling selected construction materials, such as steel and concrete, this almost always takes the form of downcycling or reprocessing. Very little of this material and component resource is reused in a way that more effectively captures its potential. One significant impediment to such reuse is that buildings are not designed in a way that facilitates easy recovery of materials and components; they are designed and built for speed of construction and quick economic returns, with little or no consideration of the longer-term consequences of their physical matter. This research project explores the potential for the recovery of materials and components if buildings were designed for such future recovery: a strategy of design for disassembly. This is not a new design philosophy; design for disassembly is well understood in product design and industrial design. There are also some architectural examples of design for disassembly; however, these are specialist examples and there has been no significant attempt to implement the strategy in the mainstream construction industry. This paper presents research into the analysis of the embodied energy in buildings, highlighting its significance in comparison with operational energy. Analysis at the material, component, and whole-of-building levels shows the potential benefits of strategically designing buildings for future disassembly to recover this embodied energy. Careful consideration at the early design stage can result in the deconstruction of significant portions of buildings and the recovery of their potential through higher-order reuse and upcycling.
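
The embodied energy analysis referred to above reduces, at its simplest, to summing material quantities weighted by embodied energy coefficients; the sketch below shows that arithmetic with hypothetical masses and coefficients (placeholder values only, not results from the study).

    # Hypothetical embodied energy coefficients (MJ per kg) and material masses (kg)
    # for a notional component breakdown; all values are placeholders.
    coefficients = {"steel": 35.0, "concrete": 1.1, "aluminium": 155.0, "timber": 10.0}
    component_masses = {
        "steel frame": {"steel": 1200.0},
        "floor slab":  {"concrete": 18000.0, "steel": 900.0},
        "window wall": {"aluminium": 300.0, "timber": 150.0},
    }

    def embodied_energy(masses):
        """Sum mass x coefficient over the materials in one component (MJ)."""
        return sum(m * coefficients[mat] for mat, m in masses.items())

    component_totals = {name: embodied_energy(m) for name, m in component_masses.items()}
    building_total = sum(component_totals.values())

    for name, mj in component_totals.items():
        print(f"{name:12s} {mj/1000:8.1f} GJ")
    print(f"{'building':12s} {building_total/1000:8.1f} GJ")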

Relevance: 100.00%

Abstract:

Provision of network infrastructure to meet rising network peak demand is increasing the cost of electricity. Addressing this demand is a major imperative for Australian electricity agencies. The network peak demand model reported in this paper provides a quantified decision support tool and a means of understanding the key influences and impacts on network peak demand. An investigation of the system factors impacting residential consumers’ peak demand for electricity was undertaken in Queensland, Australia. Technical factors, such as the customers’ location, housing construction and appliances, were combined with social factors, such as household demographics, culture, trust and knowledge, and Change Management Options (CMOs), such as tariffs, price, managed supply, etc., in a conceptual ‘map’ of the system. A Bayesian network was used to quantify the model and provide insights into the major influential factors and their interactions. The model was also used to examine the reduction in network peak demand under different market-based and government interventions in various customer locations of interest, and to investigate the relative importance of instituting programs that build trust and knowledge through well-designed customer-industry engagement activities. The Bayesian network was implemented via a spreadsheet with a tick-box interface, and combined available data from industry-specific and public sources with relevant expert opinion. The results revealed that the most effective intervention strategies involve combining particular CMOs with associated education and engagement activities. The model demonstrated the importance of designing interventions that take into account the interactions of the various elements of the socio-technical system. The options that provided the greatest impact on peak demand were off-peak tariffs, managed supply, and increases in the price of electricity. The reduction in peak demand differed for each of the locations and highlighted that household numbers and demographics, as well as the different climates, were significant factors. The model presented possible network peak demand reductions that would delay upgrades of networks, resulting in savings for Queensland utilities and ultimately for households. The use of this systems approach, employing Bayesian networks to assist the management of peak demand in different modelled locations in Queensland, provided insights into the most important elements of the system and the intervention strategies that could be tailored to the targeted customer segments.
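
The paper's Bayesian network is described only at a high level; the sketch below shows the general mechanics on a deliberately tiny network, assuming the pgmpy package, with hypothetical node names and made-up probability tables.

    from pgmpy.models import BayesianNetwork           # assumed pgmpy is installed
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # Tiny illustrative network: tariff choice and engagement influence peak demand.
    model = BayesianNetwork([("Tariff", "PeakDemand"), ("Engagement", "PeakDemand")])

    cpd_tariff = TabularCPD("Tariff", 2, [[0.7], [0.3]])        # 0 = flat, 1 = off-peak
    cpd_engage = TabularCPD("Engagement", 2, [[0.6], [0.4]])    # 0 = low, 1 = high
    # P(PeakDemand | Tariff, Engagement): made-up values for illustration.
    cpd_demand = TabularCPD(
        "PeakDemand", 2,
        [[0.4, 0.55, 0.6, 0.8],     # P(low demand | ...)
         [0.6, 0.45, 0.4, 0.2]],    # P(high demand | ...)
        evidence=["Tariff", "Engagement"], evidence_card=[2, 2])

    model.add_cpds(cpd_tariff, cpd_engage, cpd_demand)
    assert model.check_model()

    # Query the effect of an off-peak tariff plus high engagement on peak demand.
    infer = VariableElimination(model)
    print(infer.query(["PeakDemand"], evidence={"Tariff": 1, "Engagement": 1}))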

Relevance: 100.00%

Abstract:

This workshop comprised a diverse group of African construction experts, drawn from well beyond RSA. Each of the attendees had attended the annual ASOCSA conference and was additionally provided with a short workshop pre-brief. The aim was to develop a view of their 15-20 year vision of construction improvement in RSA and the steps necessary to get there. These included sociological, structural, technical and process changes. Whilst some suggestions are significantly challenging, none are impossible, given sufficient collaboration between government, industry, academia and NGOs. The highest priority projects (more properly, programmes) were identified and further explored. These are:
1. Information Hub (‘Open Africa’). Aim – to utilise emerging trends in Open Data to provide a force for African unity.
2. Workforce Development. Aim – to rebuild a competent, skilled construction industry for RSA projects and for export.
3. Modular DIY Building. Aim – to accelerate the development of sustainable, cost-efficient and desirable housing for African economic immigrants and others living in makeshift and slum dwellings.
Open Data is a maturing theme in different cities and governments around the world, and the workshop attendees were very keen to seize such a possibility to assist in developing an environment where Africans can share information and foster collaboration. NGOs may well be keen to follow up such an initiative. There are significant developments currently taking place in the construction sector around the world, with comparatively large savings being made for taxpayers (20% plus in the UK). Not all of these changes would be easy to transplant to RSA (even more so to much of the rest of Africa). Workforce development was a keen plea amongst the attendees, who seemed concerned that expertise has leaked away and is not being replaced with sufficient intensity. It is possible today to develop modular buildings in such a way that even unskilled residents can assist in their construction, and even in their design. These buildings can be sited almost autonomously from existing infrastructure, thus relieving pressure on cities and townships whilst providing humane accommodation for the economically disadvantaged. Development of suitable solutions could either be conducted with other similarly stressed countries or developed in-country and the expertise exported. Finally, it should be pointed out that this was very much a first step. Any opportunity to collaborate from an Australian, QUT or CIB perspective would be welcomed, whilst acknowledging that the leading roles belong to RSA, CSIR, NRF, ASOCSA and the University of KwaZulu-Natal.

Relevance: 100.00%

Abstract:

The export of sediments from coastal catchments can have detrimental impacts on estuaries and near-shore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task, due to the complex behaviour of constituents in natural streams, the variability of water flows and the often limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising for other time-specific covariates such as flow. This study investigates whether load estimates and the resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding-errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
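
The compounding-errors structure and predictor set are specific to the paper; as a simplified illustration of fitting a log-log rating curve with serially correlated (AR(1)) errors, the sketch below uses statsmodels' GLSAR on synthetic data with assumed coefficients.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)

    # Synthetic stand-in for event data (illustration only): flow, turbidity and a
    # log-log concentration relationship with AR(1) errors.
    n = 120
    log_flow = rng.normal(2.0, 0.8, n)
    log_turbidity = 0.5 * log_flow + rng.normal(0, 0.3, n)
    err = np.zeros(n)
    for t in range(1, n):
        err[t] = 0.6 * err[t - 1] + rng.normal(0, 0.25)
    log_conc = 0.2 + 0.9 * log_flow + 0.4 * log_turbidity + err

    X = sm.add_constant(np.column_stack([log_flow, log_turbidity]))
    model = sm.GLSAR(log_conc, X, rho=1)       # AR(1) error structure
    result = model.iterative_fit(maxiter=10)   # alternate rho and beta estimation
    print(result.params)                       # intercept, flow and turbidity slopes
    print("estimated AR(1) rho:", model.rho)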

Relevance: 100.00%

Abstract:

We consider estimating the total pollutant load from frequent flow data combined with less frequent concentration data. There are numerous load estimation methods available, some of which are captured in various online tools. However, most estimators are subject to large statistical biases, and their associated uncertainties are often not reported. This makes interpretation difficult and makes it impossible to assess trend estimates or to determine optimal sampling regimes. In this paper, we first propose two indices for measuring the extent of sampling bias, and then provide steps for obtaining reliable load estimates that minimize the biases and make use of informative predictive variables. The key step in this approach is the development of an appropriate predictive model for concentration. This is achieved using a generalized rating-curve approach with additional predictors that capture unique features in the flow data, such as the concept of the first flush, the location of the event on the hydrograph (e.g. rise or fall) and the discounted flow. The latter may be thought of as a measure of constituent exhaustion occurring during flood events. Incorporating this additional information can significantly improve the predictability of concentration, and ultimately the precision with which the pollutant load is estimated. We also provide a measure of the standard error of the load estimate which incorporates model, spatial and/or temporal errors. The method also has the capacity to incorporate measurement error incurred through the sampling of flow. We illustrate this approach for two rivers delivering to the Great Barrier Reef, Queensland, Australia. One is a dataset from the Burdekin River, consisting of total suspended sediment (TSS) and nitrogen oxide (NOx) concentrations and gauged flow for 1997. The other dataset is from the Tully River, for the period July 2000 to June 2008. For NOx in the Burdekin, the new estimates are very similar to the ratio estimates, even when there is no relationship between concentration and flow. However, for the Tully dataset, incorporating the additional predictive variables, namely the discounted flow and the flow phase (rising or receding), substantially improved the model fit, and thus the certainty with which the load is estimated.
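
In this setting the load over a period is estimated, in essence, by summing the product of predicted concentration and flow over regular time steps; in the (assumed) notation below, with \hat{q}(t_i) the modelled flow, \hat{c}(t_i) the rating-curve concentration prediction and \Delta t the regular time step:

    \hat{L} = \sum_{i=1}^{n} \hat{c}(t_i) \, \hat{q}(t_i) \, \Delta t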

Relevance: 100.00%

Abstract:

There are numerous load estimation methods available, some of which are captured in various online tools. However, most estimators are subject to large statistical biases, and their associated uncertainties are often not reported. This makes interpretation difficult and makes it impossible to assess trend estimates or to determine optimal sampling regimes. In this paper, we first propose two indices for measuring the extent of sampling bias, and then provide steps for obtaining reliable load estimates by minimizing the biases and making use of possible predictive variables. The load estimation procedure can be summarized in the following four steps (illustrated in the sketch below):
(i) output the flow rates at regular time intervals (e.g. 10 minutes) using a time series model that captures all the peak flows;
(ii) output the predicted flow rates, as in (i), at the concentration sampling times, if the corresponding flow rates were not collected;
(iii) establish a predictive model for the concentration data, incorporating all possible predictor variables, and output the predicted concentrations at the regular time intervals as in (i); and
(iv) sum the products of the predicted flow and the predicted concentration over the regular time intervals to obtain an estimate of the load.
The key step in this approach is the development of an appropriate predictive model for concentration. This is achieved using a generalized regression (rating-curve) approach with additional predictors that capture unique features in the flow data, namely the concept of the first flush, the location of the event on the hydrograph (e.g. rise or fall) and the cumulative discounted flow. The latter may be thought of as a measure of constituent exhaustion occurring during flood events. The model also has the capacity to accommodate autocorrelation in the model errors, which results from intensive sampling during floods. Incorporating this additional information can significantly improve the predictability of concentration, and ultimately the precision with which the pollutant load is estimated. We also provide a measure of the standard error of the load estimate which incorporates model, spatial and/or temporal errors. The method also has the capacity to incorporate measurement error incurred through the sampling of flow. We illustrate this approach using concentrations of total suspended sediment (TSS) and nitrogen oxide (NOx) and gauged flow data from the Burdekin River, a catchment delivering to the Great Barrier Reef. The sampling biases for NOx concentrations range from 2 to 10 times, indicating severe bias. As expected, the traditional averaging and extrapolation methods produce much higher estimates than those obtained when the sampling bias is taken into account.
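
A hedged sketch of the four steps above, using synthetic data, a plain log-log rating curve in place of the full predictor set, and no autocorrelation or bias correction:

    import numpy as np

    rng = np.random.default_rng(0)

    # Step (i): flow rates on a regular 10-minute grid (here, synthetic values
    # standing in for output from a time-series model of gauged flow).
    t = np.arange(0, 7 * 24 * 6) * 600.0                   # seconds, one week
    flow = 40 + 35 * np.exp(-((t / 3600 - 60) / 10) ** 2)  # baseflow plus one event

    # Step (ii): flows at the concentration sampling times (interpolated here).
    sample_t = rng.choice(t, size=30, replace=False)
    sample_flow = np.interp(sample_t, t, flow)

    # Synthetic "observed" concentrations at the sampling times.
    sample_conc = np.exp(0.5 + 0.7 * np.log(sample_flow) + rng.normal(0, 0.2, 30))

    # Step (iii): a simple log-log rating curve fitted to the samples, then used
    # to predict concentration on the full regular grid.
    slope, intercept = np.polyfit(np.log(sample_flow), np.log(sample_conc), 1)
    pred_conc = np.exp(intercept + slope * np.log(flow))

    # Step (iv): load = sum of (predicted flow x predicted concentration x dt).
    dt = 600.0
    load = np.sum(flow * pred_conc * dt)
    print(f"estimated load: {load:.3e} (concentration units x m^3)")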

Relevance: 100.00%

Abstract:

Joseph Brodsky, one of the most influential Russian intellectuals of the late Soviet period, was born in Leningrad in 1940, emigrated to the United States in 1972, received the Nobel Prize for Literature in 1987, and died in New York City in 1996. Brodsky was one of the leading public figures of Soviet emigration in the Cold War period, and his role as a model for the construction of Russian cultural identities in the last years of the Soviet Union was, and still is, extremely important. One of Joseph Brodsky’s great contributions to Russian culture of the latter half of the twentieth century is the wide geographical scope of his poetic and prose works. Brodsky was not a travel writer, but he was a traveling writer who wrote a considerable number of poems and essays relating to his trips and travels within the Soviet empire and outside it. Travel writing offered Brodsky a discursive space for negotiating his own transculturation, and also for making powerful statements about displacement, culture, history and geography, time and space, all major themes of his poetry. In this study of Joseph Brodsky’s travel writing I focus on his travel texts in poetry and prose which relate to his post-1972 trips to Mexico, Brazil, Turkey, and Venice. Questions of empire, tourism, and nostalgia are foregrounded in one way or another in Brodsky’s travel writing performed in emigration. I explore these concepts through the study of tropes, strategies of identity construction, and the politics of representation. The theoretical premises of my work draw on the literary and cultural criticism that has evolved around the study of travel and travel writing in recent years. These approaches have gained much from the scholarly experience provided by postcolonial critique. Shifting the focus away from the concept of exile, the traditional framework for scholarly discussions of Brodsky’s works, I propose to review Brodsky’s travel poetry and prose as a response not only to his exilic condition but also to the postmodern and postcolonial landscape which initially shaped the writing of these texts. Discussing Brodsky’s travel writing in this context offers previously unexplored perspectives for analyzing the geopolitical, philosophical, and linguistic premises of his poetic imagination. By situating Brodsky’s travel writing in the geopolitical landscape of postcolonial postmodernity, I attempt to show how Brodsky’s engagement with contemporary cultural practices in the West was incorporated into his Russian-language travel poetry and prose, and how this engagement thus contributed to these texts’ status as exceptional and unique literary events within late Soviet Russian cultural practices.