925 results for building information modelling
Abstract:
Daylight devices are important components of any climate-responsive façade system. However, the evolution of parametric CAD systems and digital fabrication has had an impact on architectural form, so that regular forms are shifting to complex geometries. Architectural and engineering integration of daylight devices in envelopes with complex geometries is a challenge in terms of design and performance evaluation. The purpose of this paper is to assess the daylight performance of a building with a climate-responsive envelope of complex geometry that integrates shading devices in the façade. The case study is based on the Esplanade buildings in Singapore. Climate-based daylight metrics such as Daylight Availability and Useful Daylight Illuminance are used. DIVA (daylight simulation) and Grasshopper (parametric analysis) plug-ins for Rhinoceros have been employed to examine the range of performance possibilities. Parameters such as the dimension, inclination, projected shadows and shape of the device have been varied in order to maximize Daylight Availability and Useful Daylight Illuminance while minimizing glare probability. While orientation did not have a great impact on the results, the aperture of the shading devices did: shading devices with a projection of 1.75 m to 2.00 m performed best, achieving target lighting levels without issues of glare.
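The parametric sweep described above can be sketched in Python. Here `daylight_score` is a hypothetical surrogate standing in for a real DIVA simulation, and all constants are illustrative, not values from the study:

```python
# Illustrative sweep over shading-device projection depths, mimicking the
# Grasshopper-driven parametric analysis. daylight_score is a toy
# surrogate, NOT a real daylight simulation.

def daylight_score(projection_m, inclination_deg):
    """Availability improves as shading tempers glare, but excessive
    projection starts to block useful daylight (hypothetical model)."""
    shade = projection_m * (1 + inclination_deg / 90)
    availability = min(shade / 2.0, 1.0)        # shading tempers glare
    over_shading = max(0.0, shade - 2.0) * 0.5  # too much blocks daylight
    return availability - over_shading

def best_projection(projections, inclination_deg=0):
    return max(projections, key=lambda p: daylight_score(p, inclination_deg))

candidates = [1.0, 1.25, 1.5, 1.75, 2.0, 2.25]
best = best_projection(candidates)
```

Under this toy surrogate the sweep favours projections around 2 m; a real study would substitute simulated Daylight Availability, Useful Daylight Illuminance and glare probability for the surrogate.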
Abstract:
Lyngbya majuscula is a cyanobacterium (blue-green alga) occurring naturally in tropical and subtropical coastal areas worldwide. Deception Bay, in northern Moreton Bay, Queensland, has a history of Lyngbya blooms and forms the case study for this investigation. The South East Queensland (SEQ) Healthy Waterways Partnership, a collaboration between government, industry, research and the community, was formed to address issues affecting the health of the river catchments and waterways of South East Queensland. The Partnership coordinated the Lyngbya Research and Management Program (2005-2007), which culminated in a Coastal Algal Blooms (CAB) Action Plan for harmful and nuisance algal blooms such as Lyngbya majuscula. This first phase of the project was predominantly scientific in nature and also facilitated the collection of additional data to better understand Lyngbya blooms. The second phase of the project, the SEQ Healthy Waterways Strategy 2007-2012, is now underway to implement the CAB Action Plan and as such is more management-focussed. As part of the first phase of the project, a Science model for the initiation of a Lyngbya bloom was built using Bayesian Networks (BN). The structure of the Science Bayesian Network was built by the Lyngbya Science Working Group (LSWG), which was drawn from diverse disciplines. The BN was then quantified with annual data and expert knowledge. Scenario testing confirmed the expected temporal nature of bloom initiation and it was recommended that the next version of the BN be extended to take this into account. Elicitation for this BN thus occurred at three levels: design, quantification and verification. The first level involved construction of the conceptual model itself, definition of the nodes within the model and identification of sources of information to quantify the nodes. The second level included elicitation of expert opinion and representation of this information in a form suitable for inclusion in the BN.
The third and final level concerned the specification of scenarios used to verify the model. The second phase of the project provides the opportunity to update the network with the newly collected, more detailed data obtained during the previous phase of the project. Specifically, the temporal nature of Lyngbya blooms is of interest. Management efforts need to be directed to the periods in the Bay most vulnerable to bloom initiation. To model the temporal aspects of Lyngbya we are using Object Oriented Bayesian Networks (OOBN) to create ‘time slices’ for each of the periods of interest during the summer. OOBNs provide a framework to simplify knowledge representation and facilitate reuse of nodes and network fragments. An OOBN is more hierarchical than a traditional BN, with any sub-network able to contain other sub-networks. Connectivity between OOBNs is an important feature and allows information to flow between the time slices. This study demonstrates a more sophisticated use of expert information within Bayesian networks, combining expert knowledge with data (categorized using expert-defined thresholds) within an expert-defined model structure. Based on the results of the verification process, the experts are able to target areas requiring greater precision and those exhibiting temporal behaviour. The time slices incorporate the data for that time period for each of the temporal nodes (instead of using the annual data from the previous static Science BN) and include lag effects to allow the effect from one time slice to flow to the next. We demonstrate a concurrent steady increase in the probability of initiation of a Lyngbya bloom and conclude that the inclusion of temporal aspects in the BN model is consistent with the perceptions of Lyngbya behaviour held by the stakeholders.
This extended model provides a more accurate representation of the increased risk of algal blooms in the summer months and shows that the opinions elicited to inform a static BN can be readily extended to a dynamic OOBN, providing more comprehensive information for decision makers.
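The 'time slice with lag' idea behind the dynamic OOBN can be illustrated with a minimal sketch. The per-period probabilities and the lag weight below are illustrative, not the elicited values from the study:

```python
# Minimal sketch of time slices with a lag effect, as in the dynamic
# OOBN. Each slice carries P(conditions favour initiation); the lag term
# lets risk from one time slice flow into the next.

def bloom_risk(slices, lag_weight=0.3):
    """slices: P(conditions favour bloom initiation) per summer period."""
    p_prev = 0.0
    risks = []
    for p_cond in slices:
        # current conditions, plus carried-over risk from the last slice
        p = p_cond + lag_weight * p_prev * (1 - p_cond)
        risks.append(p)
        p_prev = p
    return risks

# Early to late summer: conditions steadily more favourable to a bloom.
risks = bloom_risk([0.1, 0.2, 0.35, 0.5])
```

With the lag term, risk accumulates monotonically across the summer slices, matching the steady increase in bloom initiation probability reported above.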
Abstract:
It is well-known that the use of off-site manufacture (OSM) techniques can assist in timely completion of a construction project, though the utilisation of such techniques may have other disadvantages. Currently, OSM uptake within the Australian construction industry is limited. To successfully incorporate OSM practices within a construction project, it is crucial to understand the impact of OSM adoption on the processes used during a construction project. This paper shows how a systematic process-oriented approach may be able to support OSM utilisation within a construction project. Process modelling, analysis and automation techniques which are well-known within the Business Process Management (BPM) discipline have been applied to develop a collection of construction process models that represent the end-to-end generic construction value chain. The construction value chain enables researchers to identify key activities, resources, data, and stakeholders involved in construction processes in each defined construction phase. The collection of construction process models is then used as a basis for identification of potential OSM intervention points in collaboration with domain experts from the Australian construction industry. This ensures that the resulting changes reflect the needs of various stakeholders within the construction industry and have relevance in practice. Based on the input from the domain experts, these process models are further refined and operational requirements are taken into account to develop a prototype process automation (workflow) system that can support and coordinate OSM-related process activities. The resulting workflow system also has the potential to integrate with other IT solutions used within the construction industry (e.g., BIM, Aconex). As such, the paper illustrates the role that process-oriented thinking can play in assisting OSM adoption within the industry.
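The coordination role of such a workflow system can be sketched as a dependency-ordered set of activities. The activity names below are illustrative, not the paper's actual process models:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Sketch of a workflow coordinating construction activities in dependency
# order. Activity names are illustrative placeholders.

# activity -> set of activities that must complete first
process = {
    "design": set(),
    "osm_fabrication": {"design"},              # off-site manufacture
    "site_preparation": {"design"},
    "osm_delivery": {"osm_fabrication"},
    "assembly": {"osm_delivery", "site_preparation"},
}

# A workflow engine would dispatch work items in a topological order,
# letting osm_fabrication and site_preparation proceed in parallel.
order = list(TopologicalSorter(process).static_order())
```

A real BPM system would add stakeholders, resources and data to each activity; the ordering constraint is the part that identifies where OSM intervention points can slot into the value chain.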
Abstract:
This study considered the problem of predicting survival, based on three alternative models: a single Weibull, a mixture of Weibulls and a cure model. Instead of the common procedure of choosing a single “best” model, where “best” is defined in terms of goodness of fit to the data, a Bayesian model averaging (BMA) approach was adopted to account for model uncertainty. This was illustrated using a case study in which the aim was the description of lymphoma cancer survival with covariates given by phenotypes and gene expression. The results of this study indicate that if the sample size is sufficiently large, one of the three models emerges as having the highest probability given the data, as indicated by the goodness-of-fit measure, the Bayesian information criterion (BIC). However, when the sample size was reduced, no single model was revealed as “best”, suggesting that a BMA approach would be appropriate. Although a BMA approach can compromise goodness of fit to the data (when compared to the true model), it can provide robust predictions and facilitate more detailed investigation of the relationships between gene expression and patient survival.
Keywords: Bayesian modelling; Bayesian model averaging; Cure model; Markov chain Monte Carlo; Mixture model; Survival analysis; Weibull distribution
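The BIC-based model weights used in Bayesian model averaging follow the standard approximation w_i ∝ exp(-BIC_i / 2), normalised over the candidate models. The BIC values below are illustrative, not the study's results:

```python
import math

# BIC-based posterior model weights for Bayesian model averaging.

def bma_weights(bics):
    best = min(bics)  # subtract the minimum for numerical stability
    raw = [math.exp(-(b - best) / 2) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

# Single Weibull, mixture of Weibulls, cure model (hypothetical BICs)
weights = bma_weights([1020.4, 1018.1, 1019.0])
```

When the BICs are closely spaced, as with the small samples described above, the weights are spread across models, which is precisely the regime where averaging is preferable to picking a single "best" model.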
Abstract:
Topic modelling, such as Latent Dirichlet Allocation (LDA), was proposed to generate statistical models that represent multiple topics in a collection of documents, and has been widely utilized in fields such as machine learning and information retrieval. However, little is known about its effectiveness in information filtering. Patterns are generally considered more representative than single terms for representing documents. In this paper, a novel information filtering model, the Pattern-Based Topic Model (PBTM), is proposed to represent text documents not only by their topic distributions at a general level but also by semantic pattern representations at a more specific level, both of which contribute to accurate document representation and document relevance ranking. Extensive experiments are conducted to evaluate the effectiveness of PBTM using the TREC data collection Reuters Corpus Volume 1. The results show that the proposed model achieves outstanding performance.
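The two-level representation can be illustrated with a minimal relevance score that combines a topic-level component with a pattern-level component. The combination rule, weights and example patterns below are hypothetical, not the paper's actual formulation:

```python
# Illustrative two-level relevance score in the spirit of PBTM: a
# topic-level score from topic distributions plus a pattern-level score
# from matched term patterns. All weights and patterns are hypothetical.

def pbtm_score(doc_topics, query_topics, doc_terms, patterns, alpha=0.5):
    """doc_topics/query_topics: distributions over the same topics;
    patterns: term sets expected to co-occur in relevant documents."""
    # overlap of the two topic distributions (general level)
    topic_score = sum(min(d, q) for d, q in zip(doc_topics, query_topics))
    # fraction of patterns fully matched by the document (specific level)
    matched = sum(1 for p in patterns if p <= set(doc_terms))
    pattern_score = matched / len(patterns) if patterns else 0.0
    return alpha * topic_score + (1 - alpha) * pattern_score

patterns = [{"topic", "model"}, {"information", "filtering"}]
score = pbtm_score([0.7, 0.3], [0.6, 0.4],
                   ["topic", "model", "information", "filtering"], patterns)
```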
Abstract:
Road traffic injuries are one of the major public health burdens worldwide. The United Nations Decade of Action for Road Safety (2011-2020) implores all nations to work to reduce this burden. This decade represents a unique and historic period of time in the field of road safety. Information exchange and co-operation between nations is an important step in achieving the goal. The burden of road crashes, fatalities and injuries is not equally distributed. We know that low- and middle-income countries experience the majority of the road trauma burden. Therefore, it is imperative that these countries learn from the successes of others that have developed and implemented road safety laws, public education campaigns and countermeasures over many years and have achieved significant road trauma reductions as a result. China is one of the countries experiencing a large road trauma burden. Vulnerable road users such as pedestrians and cyclists make up a large proportion of fatalities and injuries in China. Speeding, impaired/drug driving, distracted driving, vehicle overloading, inadequate road infrastructure, limited use of safety restraints and helmets, and limited road safety training have all been identified as contributing to the problem. Some important steps have been taken to strengthen China’s approach, including increased penalties for drunk driving in May 2011 and increased attention to school bus safety in 2011/12. However, there is still a large amount of work needed to improve the current road safety position in China. This paper provides details of a program to assist with road safety knowledge exchange between China and Australia, funded by the Australian Government and undertaken in the latter part of 2012. The four month program provided the opportunity for the first author to work closely with key agencies in Australia that are responsible for policy development and implementation of a broad range of road safety initiatives.
In doing so, an in-depth understanding was gained about key road safety strategies in Australia and processes for developing and implementing them. Insights were also gained into the mechanisms used for road safety policy development, implementation and evaluation in several Australian jurisdictions. Road traffic law and enforcement issues were explored with the relevant jurisdictional transport and police agencies to provide a greater understanding of how Chinese laws and practices could be enhanced. Working with agencies responsible for public education and awareness campaigns about road safety in Australia also provided relevant information about how to promote road safety at the broader community level in China. Finally, the program provided opportunities to work closely with several world-renowned Australian research centres and key expert researchers to enhance opportunities for ongoing road safety research in China. The overall program provided the opportunity for the first author to develop knowledge in key areas of road safety strategy development, implementation and management which are directly relevant to the current situation in China. This paper describes some main observations and findings from participation in the program.
Abstract:
This paper proposes techniques to improve the performance of i-vector based speaker verification systems when only short utterances are available. Short-length utterance i-vectors vary with speaker, session variations, and the phonetic content of the utterance. Well-established methods such as linear discriminant analysis (LDA), source-normalized LDA (SN-LDA) and within-class covariance normalisation (WCCN) exist for compensating the session variation, but we have identified the variability introduced by phonetic content due to utterance variation as an additional source of degradation when short-duration utterances are used. To compensate for utterance variations in short i-vector speaker verification systems using cosine similarity scoring (CSS), we have introduced a short utterance variance normalization (SUVN) technique and a short utterance variance (SUV) modelling approach at the i-vector feature level. A combination of SUVN with LDA and SN-LDA is proposed to compensate for the session and utterance variations and is shown to provide an improvement in performance over the traditional approach of using LDA and/or SN-LDA followed by WCCN. An alternative approach is also introduced, using probabilistic linear discriminant analysis (PLDA) to directly model the SUV. The combination of SUVN, LDA and SN-LDA followed by SUV PLDA modelling provides an improvement over the baseline PLDA approach. We also show that for this combination of techniques, the utterance variation information needs to be artificially added to full-length i-vectors for PLDA modelling.
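Cosine similarity scoring between i-vectors, the scoring back-end referred to above, reduces to a normalised dot product. The vectors here are toy values, not real several-hundred-dimensional i-vectors:

```python
import math

# Cosine similarity scoring (CSS) between a target and a test i-vector.
# Compensation (LDA, SN-LDA, WCCN or the proposed SUVN) would be applied
# to the i-vectors before this scoring step.

def css(w_target, w_test):
    dot = sum(a * b for a, b in zip(w_target, w_test))
    norm_target = math.sqrt(sum(a * a for a in w_target))
    norm_test = math.sqrt(sum(b * b for b in w_test))
    return dot / (norm_target * norm_test)

# Toy 3-dimensional "i-vectors" for illustration only
same_speaker_score = css([0.9, 0.1, 0.4], [0.8, 0.2, 0.5])
```

A verification decision then compares the score against a threshold tuned on development data.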
Abstract:
The interest generated by business process re-engineering and information technology reveals the emergence of the process-management paradigm. Although many studies have been published on alternative process modelling tools and techniques, little attention has been paid to the post-hoc evaluation of process modelling activities or to establishing guidelines on how to conduct process modelling effectively. The present study aims to fill this gap. We present the results of a detailed case study, conducted in a leading Australian organisation, with the goal of building a model of process modelling success.
Abstract:
Object classification is plagued by the issue of session variation. Session variation describes any variation that makes one instance of an object look different to another, for instance due to pose or illumination variation. Recent work in the challenging task of face verification has shown that session variability modelling provides a mechanism to overcome some of these limitations. However, for computer vision purposes, it has only been applied in the limited setting of face verification. In this paper we propose a local region-based intersession variability (ISV) modelling approach, termed Local ISV, so that local session variations can be modelled, and apply it to challenging real-world data. We demonstrate the efficacy of this technique on a challenging real-world fish image database which includes images taken underwater, providing significant real-world session variations. This Local ISV approach provides a relative performance improvement of, on average, 23% on the challenging MOBIO, Multi-PIE and SCface face databases. It also provides a relative performance improvement of 35% on our challenging fish image dataset.
Abstract:
Electricity network investment and asset management require accurate estimation of future demand in energy consumption within specified service areas. For this purpose, simple models are typically developed to predict future trends in electricity consumption using various methods and assumptions. This paper presents a statistical model to predict electricity consumption in the residential sector at the Census Collection District (CCD) level over the state of New South Wales, Australia, based on spatial building and household characteristics. Residential household demographic and building data from the Australian Bureau of Statistics (ABS) and actual electricity consumption data from electricity companies are merged for 74 % of the 12,000 CCDs in the state. Eighty percent of the merged dataset is randomly set aside to establish the model using regression analysis, and the remaining 20 % is used to independently test the accuracy of model prediction against actual consumption. In 90 % of cases, the predicted consumption is shown to be within 5 kWh per dwelling per day of actual values, with an overall state-wide deviation of -1.15 %. Given a future scenario with a shift in climate zone and a growth in population, the model is used to identify the geographical or service areas that are most likely to have increased electricity consumption. Such geographical representation can be of great benefit when assessing alternatives to the centralised generation of energy; having such a model gives a quantifiable method for selecting the 'most' appropriate system when a review or upgrade of the network infrastructure is required.
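The fit-and-validate step can be sketched with a hand-rolled least-squares fit: train on one part of the data, then count held-out predictions within 5 kWh/dwelling/day of the actual values. The data below are synthetic, not the ABS/utility data used in the study:

```python
# Sketch of the regression validation step with synthetic data.

def fit_line(xs, ys):
    """Ordinary least squares for a single predictor."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def share_within(xs, ys, slope, intercept, tol=5.0):
    """Fraction of held-out cases predicted within tol kWh/dwelling/day."""
    hits = sum(1 for x, y in zip(xs, ys)
               if abs(slope * x + intercept - y) <= tol)
    return hits / len(xs)

# x: average household size; y: kWh per dwelling per day (synthetic)
slope, intercept = fit_line([1, 2, 3, 4], [8.0, 12.5, 16.0, 21.0])
coverage = share_within([2.5, 5], [14.0, 26.0], slope, intercept)
```

The study's model uses many spatial building and household predictors rather than one, but the validation logic is the same.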
Abstract:
Security models for two-party authenticated key exchange (AKE) protocols have developed over time to prove the security of AKE protocols even when the adversary learns certain secret values. In this work, we address more granular leakage: partial leakage of long-term secrets of protocol principals, even after the session key is established. We introduce a generic key exchange security model, which can be instantiated allowing bounded or continuous leakage, even when the adversary learns certain ephemeral secrets or session keys. Our model is the strongest known partial-leakage-based security model for key exchange protocols. We propose a generic construction of a two-pass leakage-resilient key exchange protocol that is secure in the proposed model, by introducing a new concept: the leakage-resilient NAXOS trick. We identify a special property for public-key cryptosystems: pair generation indistinguishability, and show how to obtain the leakage-resilient NAXOS trick from a pair generation indistinguishable leakage-resilient public-key cryptosystem.
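The baseline (non-leakage-resilient) NAXOS trick that the paper generalises can be sketched with a hash function: the value actually used in the exchange is derived from both the ephemeral and the long-term secret, so leaking the ephemeral secret alone reveals nothing. The hash choice and key sizes are illustrative:

```python
import hashlib, os

# Sketch of the baseline NAXOS trick: the exchange uses H(esk, lsk)
# rather than esk directly, so an adversary who learns the ephemeral
# secret esk still cannot compute the value without the long-term
# secret lsk. The paper's contribution is a *leakage-resilient* variant;
# this sketch only illustrates the baseline idea.

def naxos_exponent(esk: bytes, lsk: bytes) -> bytes:
    return hashlib.sha256(esk + lsk).digest()

lsk = os.urandom(32)   # long-term secret key
esk = os.urandom(32)   # per-session ephemeral secret
exponent = naxos_exponent(esk, lsk)
```

In a full protocol this value would serve as the (pseudo)ephemeral exponent in the Diffie-Hellman-style exchange; the leakage-resilient version replaces the plain hash computation with one built from a pair generation indistinguishable, leakage-resilient public-key cryptosystem.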
Abstract:
For clinical use, in electrocardiogram (ECG) signal analysis it is important to detect not only the centre of the P wave, the QRS complex and the T wave, but also time intervals such as the ST segment. Much research has focused entirely on QRS complex detection, via methods such as wavelet transforms, spline fitting and neural networks. However, drawbacks include the false classification of a severe noise spike as a QRS complex, possibly requiring manual editing, or the omission of information contained in other regions of the ECG signal. While some attempts have been made to develop algorithms to detect additional signal characteristics, such as P and T waves, the reported success rates vary from person to person and beat to beat. To address this variability we propose the use of Markov chain Monte Carlo statistical modelling to extract the key features of an ECG signal, and we report on a feasibility study to investigate the utility of the approach. The modelling approach is examined with reference to a realistic computer-generated ECG signal, where details such as wave morphology and noise levels are variable.
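The MCMC approach can be illustrated with a toy Metropolis sampler that recovers the amplitude of a Gaussian-shaped wave (e.g. a P wave) from a noisy synthetic trace. The signal, implicit flat prior and step size are illustrative, not the study's model:

```python
import math, random

# Toy Metropolis sampler: estimate the amplitude of a Gaussian-shaped
# wave buried in noise, mimicking MCMC feature extraction from an ECG.

random.seed(1)
TRUE_AMP, CENTRE, WIDTH, NOISE = 1.0, 50, 5.0, 0.05

def wave(t, amp):
    return amp * math.exp(-((t - CENTRE) / WIDTH) ** 2 / 2)

# Synthetic "ECG" trace: one wave plus Gaussian noise
signal = [wave(t, TRUE_AMP) + random.gauss(0, NOISE) for t in range(100)]

def log_likelihood(amp):
    sse = sum((s - wave(t, amp)) ** 2 for t, s in enumerate(signal))
    return -sse / (2 * NOISE ** 2)

amp, samples = 0.5, []
for _ in range(2000):
    proposal = amp + random.gauss(0, 0.05)  # random-walk proposal
    if math.log(random.random()) < log_likelihood(proposal) - log_likelihood(amp):
        amp = proposal                      # Metropolis accept step
    samples.append(amp)

estimate = sum(samples[500:]) / len(samples[500:])  # discard burn-in
```

A real model would jointly sample the position, width and amplitude of the P wave, QRS complex and T wave, but the accept/reject machinery is the same.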
Abstract:
Spatially explicit modelling of grassland classes is important for site-specific planning aimed at improving grassland and environmental management over large areas. In this study, a climate-based grassland classification model, the Comprehensive and Sequential Classification System (CSCS), was integrated with spatially interpolated climate data to classify grassland in Gansu province, China. The study area is characterized by complex topographic features imposed by plateaus, high mountains, basins and deserts. To improve the quality of the interpolated climate data and of the spatial classification over this complex topography, three linear regression methods for interpolating climate variables were evaluated: an analytic method based on multiple regression and residues (AMMRR); a modification of AMMRR that adds the effect of slope and aspect to the interpolation analysis (M-AMMRR); and a method that replaces the inverse distance weighting (IDW) approach for residue interpolation in M-AMMRR with an ordinary kriging approach (I-AMMRR). The interpolation outcomes from the best interpolation method were then used in the CSCS model to classify the grassland in the study area. The climate variables interpolated were the annual cumulative temperature and annual total precipitation. The results indicated that the AMMRR and M-AMMRR methods generated acceptable climate surfaces, but the best model fit and cross-validation result were achieved by the I-AMMRR method. Twenty-six grassland classes were identified for the study area. The four grassland vegetation classes that covered more than half of the total study area were "cool temperate-arid temperate zonal semi-desert", "cool temperate-humid forest steppe and deciduous broad-leaved forest", "temperate-extra-arid temperate zonal desert", and "frigid per-humid rain tundra and alpine meadow".
The vegetation classification map generated in this study provides spatial information on the locations and extents of the different grassland classes. This information can be used to facilitate government agencies' decision-making in land-use planning and environmental management, and for vegetation and biodiversity conservation. The information can also be used to assist land managers in the estimation of safe carrying capacities which will help to prevent overgrazing and land degradation.
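The IDW residue interpolator used in M-AMMRR (and replaced by ordinary kriging in I-AMMRR) can be sketched as follows; station coordinates and values are synthetic:

```python
# Minimal inverse distance weighting (IDW) sketch: each station's value
# is weighted by 1/d^power, so nearer stations dominate the estimate.

def idw(stations, target, power=2):
    """stations: list of ((x, y), value); target: (x, y) to estimate."""
    num = den = 0.0
    for (x, y), value in stations:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0:
            return value  # target coincides with a station
        w = 1.0 / d2 ** (power / 2)
        num += w * value
        den += w
    return num / den

# Annual precipitation residues (mm) at three stations (synthetic)
stations = [((0, 0), 300.0), ((10, 0), 500.0), ((0, 10), 400.0)]
estimate = idw(stations, (5, 5))
```

Ordinary kriging replaces these fixed distance-based weights with weights derived from a fitted variogram of the data, which is the change the study found gave the best cross-validation result.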
Abstract:
Management of the industrial nations' hazardous waste is a pressing and rapidly growing global threat. Improved environmental information must be obtained and managed concerning the current status, temporal dynamics and potential future status of these critical sites. To test the application of spatial environmental techniques to the problem of hazardous waste sites, a Superfund (CERCLA) test site was chosen in an industrial/urban valley experiencing severe TCE, PCE and CTC ground water contamination. A paradigm is presented for investigating the spatial/environmental tools available for the mapping, monitoring and modelling of the environment and its toxic contaminant plumes. This model incorporates a range of technical issues concerning the collection of data as augmented by remote sensing tools, the formatting and storage of data utilizing geographic information systems, and the analysis and modelling of the environment through the use of advanced GIS analysis algorithms and geophysical models of hydrologic transport, including statistical surface generation. This spatially based approach is evaluated against current government/industry standards of operation. Advantages of the spatial approach and lessons learned are discussed.
Abstract:
In this paper, we provide the results of a field study of a Ubicomp system called CAM (Cooperative Artefact Memory) in a Product Design studio. CAM is a mobile-tagging based messaging system that allows designers to store relevant information onto their design artefacts in the form of messages, annotations and external web links. From our field study results, we observe that the use of CAM adds another shared ‘space’ onto these design artefacts, which in their natural settings are themselves boundary objects. In the paper, we provide several examples from the field illustrating how CAM helps in the design process.