720 results for thermal models
Abstract:
Identifying railway capacity is an important task that can establish, in principle, whether the network can handle an intended traffic flow and whether there is any free capacity left for additional train services. Capacity determination techniques can also be used to identify how best to improve an existing network at least cost. In this article an optimization approach has been applied to a case study of the Iran national railway, in order to identify its current capacity and to optimally expand it given a variety of technical conditions. This railway is very important in Iran and will be upgraded extensively in the coming years; hence the conclusions in this article may help in that endeavor. A sensitivity analysis is recommended to evaluate a wider range of possible scenarios, so that more useful lower and upper bounds can be provided for the performance of the system.
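The abstract's optimization model is not reproduced here, but its core idea, that a corridor's capacity is set by its most restrictive section and that expansion should target the bottleneck at least cost, can be sketched as follows. All section names, capacities and costs are hypothetical illustrations, not data from the article.

```python
# Sketch: line capacity as the bottleneck of section capacities, and
# the cheapest single upgrade that raises it. All numbers below are
# hypothetical, not from the case study.

def line_capacity(sections):
    """Trains/day the corridor can carry = its most restrictive section."""
    return min(cap for _, cap in sections)

def cheapest_upgrade(sections, upgrade_cost):
    """Among the bottleneck sections, pick the one cheapest to upgrade."""
    bottleneck = line_capacity(sections)
    candidates = [name for name, cap in sections if cap == bottleneck]
    return min(candidates, key=lambda name: upgrade_cost[name])

sections = [("Tehran-Qom", 90), ("Qom-Kashan", 60), ("Kashan-Yazd", 60)]
costs = {"Qom-Kashan": 5.0, "Kashan-Yazd": 3.2}  # arbitrary cost units

print(line_capacity(sections))            # 60
print(cheapest_upgrade(sections, costs))  # Kashan-Yazd
```

A real model would optimize several simultaneous upgrades under a budget, but the bottleneck structure above is the building block.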
Abstract:
The thermal decomposition process of kaolinite–potassium acetate intercalation complex has been studied using simultaneous thermogravimetry coupled with Fourier-transform infrared spectroscopy and mass spectrometry (TG-FTIR-MS). The results showed that the thermal decomposition of the complex took place in four temperature ranges, namely 50–100, 260–320, 320–550, and 650–780 °C. The maximum mass loss rates for the thermal decomposition of the kaolinite–potassium acetate intercalation complex were observed at 81, 296, 378, 411, 486, and 733 °C, and were attributed to (a) loss of the adsorbed water, (b) thermal decomposition of surface-adsorbed potassium acetate (KAc), (c) the loss of the water coordinated to potassium acetate in the intercalated kaolinite, (d) the thermal decomposition of intercalated KAc in the interlayer of kaolinite and the removal of inner surface hydroxyls, (e) the loss of the inner hydroxyls, and (f) the thermal decomposition of carbonate derived from the decomposition of KAc. The thermal decomposition of intercalated potassium acetate started in the range 320–550 °C, accompanied by the release of water, acetone, carbon dioxide, and acetic acid. The identification of pyrolysis fragment ions provided insight into the thermal decomposition mechanism. The results showed that the main decomposition fragment ions of the kaolinite–KAc intercalation complex were water, acetone, carbon dioxide, and acetic acid. TG-FTIR-MS was demonstrated to be a powerful tool for the investigation of kaolinite intercalation complexes. It delivers a detailed insight into the thermal decomposition processes of the kaolinite intercalation complexes as characterized by mass loss and the evolved gases.
Abstract:
Approximately half of prostate cancers (PCa) carry TMPRSS2-ERG translocations; however, the clinical impact of this genomic alteration remains enigmatic. Expression of v-ets erythroblastosis virus E26 oncogene like (avian) gene (ERG) promotes prostatic epithelial dysplasia in transgenic mice and acquisition of epithelial-to-mesenchymal transition (EMT) characteristics in human prostatic epithelial cells (PrECs). To explore whether ERG-induced EMT in PrECs was associated with therapeutically targetable transformation characteristics, we established stable populations of BPH-1, PNT1B and RWPE-1 immortalized human PrEC lines that constitutively express flag-tagged ERG3 (fERG). All fERG-expressing populations exhibited characteristics of in vitro and in vivo transformation. Microarray analysis revealed >2000 commonly dysregulated genes in the fERG-PrEC lines. Functional analysis revealed evidence that fERG cells underwent EMT and acquired invasive characteristics. The fERG-induced EMT transcript signature was exemplified by suppressed expression of E-cadherin and keratins 5, 8, 14 and 18; elevated expression of N-cadherin, N-cadherin 2 and vimentin, and of the EMT transcriptional regulators Snail, Zeb1 and Zeb2, and lymphoid enhancer-binding factor-1 (LEF-1). In BPH-1 and RWPE-1-fERG cells, fERG expression is correlated with increased expression of integrin-linked kinase (ILK) and its downstream effectors Snail and LEF-1. Interfering RNA suppression of ERG decreased expression of ILK, Snail and LEF-1, whereas small interfering RNA suppression of ILK did not alter fERG expression. Interfering RNA suppression of ERG or ILK impaired fERG-PrEC Matrigel invasion. Treating fERG-BPH-1 cells with the small molecule ILK inhibitor, QLT-0267, resulted in dose-dependent suppression of Snail and LEF-1 expression, Matrigel invasion and reversion of anchorage-independent growth. 
These results suggest that ILK is a therapeutically targetable mediator of ERG-induced EMT and transformation in PCa.
Abstract:
Computational models in physiology often integrate functional and structural information from a large range of spatio-temporal scales from the ionic to the whole organ level. Their sophistication raises both expectations and scepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace and refine animal experiments. A fundamental requirement to fulfil these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present study aims at informing strategies for validation by elucidating the complex interrelations between experiments, models and simulations in cardiac electrophysiology. We describe the processes, data and knowledge involved in the construction of whole ventricular multiscale models of cardiac electrophysiology. Our analysis reveals that models, simulations, and experiments are intertwined, in an assemblage that is a system itself, namely the model-simulation-experiment (MSE) system. Validation must therefore take into account the complex interplay between models, simulations and experiments. Key points for developing strategies for validation are: 1) understanding sources of bio-variability is crucial to the comparison between simulation and experimental results; 2) robustness of techniques and tools is a pre-requisite to conducting physiological investigations using the MSE system; 3) definition and adoption of standards facilitates interoperability of experiments, models and simulations; 4) physiological validation must be understood as an iterative process that defines the specific aspects of electrophysiology the MSE system targets, and is driven by advancements in experimental and computational methods and the combination of both.
Abstract:
Spatial data are now prevalent in a wide range of fields including environmental and health science. This has led to the development of a range of approaches for analysing patterns in these data. In this paper, we compare several Bayesian hierarchical models for analysing point-based data based on the discretization of the study region, resulting in grid-based spatial data. The approaches considered include two parametric models and a semiparametric model. We highlight the methodology and computation for each approach. Two simulation studies are undertaken to compare the performance of these models for various structures of simulated point-based data which resemble environmental data. A case study of a real dataset is also conducted to demonstrate a practical application of the modelling approaches. Goodness-of-fit statistics are computed to compare estimates of the intensity functions. The deviance information criterion is also considered as an alternative model evaluation criterion. The results suggest that the adaptive Gaussian Markov random field model performs well for highly sparse point-based data where there are large variations or clustering across the space, whereas the discretized log Gaussian Cox process produces good fit in dense and clustered point-based data. One should generally consider the nature and structure of the point-based data in order to choose an appropriate method for modelling discretized spatial point-based data.
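All of the grid-based approaches compared in the paper start from the same preprocessing step: discretizing the study region and counting points per cell. A minimal sketch of that step, with made-up point coordinates and grid size rather than the paper's case-study data:

```python
# Sketch of the discretization step behind grid-based spatial models:
# bin point locations into a regular grid of cell counts. Points and
# grid dimensions are illustrative, not from the paper's case study.
from collections import Counter

def grid_counts(points, nx, ny, xmax=1.0, ymax=1.0):
    """Return {(i, j): count} for points in [0, xmax) x [0, ymax)."""
    counts = Counter()
    for x, y in points:
        i = min(int(x / xmax * nx), nx - 1)  # clamp boundary points
        j = min(int(y / ymax * ny), ny - 1)
        counts[(i, j)] += 1
    return counts

points = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.9)]
counts = grid_counts(points, nx=2, ny=2)
print(counts)  # two points fall in cell (0, 0), one in (1, 1)
```

The resulting cell counts are what the Bayesian hierarchical models (e.g. the Gaussian Markov random field prior on cell intensities) are then fitted to.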
Abstract:
Existing crowd counting algorithms rely on holistic, local or histogram based features to capture crowd properties. Regression is then employed to estimate the crowd size. Insufficient testing across multiple datasets has made it difficult to compare and contrast different methodologies. This paper presents an evaluation across multiple datasets to compare holistic, local and histogram based methods, and to compare various image features and regression models. A K-fold cross validation protocol is followed to evaluate the performance across five public datasets: UCSD, PETS 2009, Fudan, Mall and Grand Central. Image features are categorised into five types: size, shape, edges, keypoints and textures. The regression models evaluated are: Gaussian process regression (GPR), linear regression, K nearest neighbours (KNN) and neural networks (NN). The results demonstrate that local features outperform equivalent holistic and histogram based features; optimal performance is observed using all image features except for textures; and that GPR outperforms linear, KNN and NN regression.
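The regression step the paper evaluates can be illustrated with the simplest of its four models, KNN: predict a frame's crowd count as the average count of the training frames with the most similar feature vectors. The feature vectors and counts below are made-up placeholders, not values from any of the five datasets.

```python
# Toy K-nearest-neighbours regression of crowd size from image
# features, mirroring the paper's regression step. All training
# pairs below are invented for illustration.

def knn_predict(train, query, k=3):
    """Average the crowd counts of the k nearest training frames."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(train, key=lambda fc: dist(fc[0], query))[:k]
    return sum(count for _, count in nearest) / k

# (feature vector, crowd count) pairs; features might be, e.g.,
# [normalised foreground area, edge pixel count]
train = [([0.2, 10], 5), ([0.4, 18], 11), ([0.5, 22], 14), ([0.9, 40], 30)]
print(knn_predict(train, [0.45, 20], k=3))  # 10.0
```

GPR, the best performer in the paper's results, replaces this local averaging with a kernel-weighted posterior mean, but the feature-to-count regression structure is the same.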
Abstract:
Finite element (FE) model studies have made important contributions to our understanding of functional biomechanics of the lumbar spine. However, if a model is used to answer clinical and biomechanical questions over a certain population, their inherently large inter-subject variability has to be considered. Current FE model studies, however, generally account only for a single distinct spinal geometry with one set of material properties. This raises questions concerning their predictive power, their range of results and on their agreement with in vitro and in vivo values. Eight well-established FE models of the lumbar spine (L1-5) of different research centres around the globe were subjected to pure and combined loading modes and compared to in vitro and in vivo measurements for intervertebral rotations, disc pressures and facet joint forces. Under pure moment loading, the predicted L1-5 rotations of almost all models fell within the reported in vitro ranges, and their median values differed on average by only 2° for flexion-extension, 1° for lateral bending and 5° for axial rotation. Predicted median facet joint forces and disc pressures were also in good agreement with published median in vitro values. However, the ranges of predictions were larger and exceeded those reported in vitro, especially for the facet joint forces. For all combined loading modes, except for flexion, predicted median segmental intervertebral rotations and disc pressures were in good agreement with measured in vivo values. In light of high inter-subject variability, the generalization of results of a single model to a population remains a concern. This study demonstrated that the pooled median of individual model results, similar to a probabilistic approach, can be used as an improved predictive tool in order to estimate the response of the lumbar spine.
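The pooling idea the study proposes, taking the median of several independent models' predictions as a population-level estimate rather than trusting any single geometry, is simple to sketch. The eight per-model rotation values below are invented for illustration, not the study's measurements.

```python
# Sketch of the pooled-median idea: the median prediction across
# independent FE models serves as the population-level estimate.
# The eight values below are hypothetical, not from the study.
from statistics import median

def pooled_estimate(model_predictions):
    """Median across models, one prediction per model."""
    return median(model_predictions)

# e.g. predicted L1-5 flexion-extension rotations (degrees) from 8 models
rotations = [28.0, 31.5, 25.0, 30.2, 33.1, 27.4, 29.8, 36.0]
print(pooled_estimate(rotations))  # 30.0
```

The median is preferred over the mean here because it is robust to a single model with outlying geometry or material properties.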
Abstract:
It is often said that Australia is a world leader in rates of copyright infringement for entertainment goods. In 2012, the hit television show, Game of Thrones, was the most downloaded television show over BitTorrent, and estimates suggest that Australians accounted for a plurality of nearly 10% of the 3-4 million downloads each week. The season finale of 2013 was downloaded over a million times within 24 hours of its release, and again Australians were the largest block of illicit downloaders over BitTorrent, despite our relatively small population. This trend has led the former US Ambassador to Australia to implore Australians to stop 'stealing' digital content, and rightsholders to push for increasing sanctions on copyright infringers. The Australian Government is looking to respond by requiring Internet Service Providers to issue warnings and potentially punish consumers who are alleged by industry groups to have infringed copyright. This is the logical next step in deterring infringement, given that the operators of infringing networks (like The Pirate Bay, for example) are out of regulatory reach. This steady ratcheting up of the strength of copyright, however, comes at a significant cost to user privacy and autonomy, and while the decentralisation of enforcement reduces costs, it also reduces the due process safeguards provided by the judicial process. This article presents qualitative evidence that substantiates a common intuition: one of the major reasons that Australians seek out illicit downloads of content like Game of Thrones in such numbers is that it is more difficult to access legitimately in Australia. The geographically segmented way in which copyright is exploited at an international level has given rise to a ‘tyranny of digital distance’, where Australians have less access to copyright goods than consumers in other countries.
Compared to consumers in the US and the EU, Australians pay more for digital goods, have less choice in distribution channels, are exposed to substantial delays in access, and are sometimes denied access completely. In this article we focus our analysis on premium film and television offerings, like Game of Thrones, and through semi-structured interviews, explore how choices in distribution impact on the willingness of Australian consumers to seek out infringing copies of copyright material. Game of Thrones provides an excellent case study through which to frame this analysis: it is both one of the least legally accessible television offerings and one of the most downloaded through filesharing networks of recent times. Our analysis shows that at the same time as rightsholder groups, particularly in the film and television industries, are lobbying for stronger laws to counter illicit distribution, the business practices of their member organisations are counter-productively increasing incentives for consumers to infringe. The lack of accessibility and high prices of copyright goods in Australia leads to substantial economic waste. The unmet consumer demand means that Australian consumers are harmed by lower access to information and entertainment goods than consumers in other jurisdictions. The higher rates of infringement that fulfil some of this unmet demand increase enforcement costs for copyright owners and impose burdens either on our judicial system or on private entities – like ISPs – who may be tasked with enforcing the rights of third parties. Most worryingly, the lack of convenient and cheap legitimate digital distribution channels risks undermining public support for copyright law. Our research shows that consumers blame rightsholders for failing to meet market demand, and this encourages a social norm that infringing copyright, while illegal, is not morally wrongful.
The implications are as simple as they are profound: Australia should not take steps to increase the strength of copyright law at this time. The interests of the public and those of rightsholders align better when there is effective competition in distribution channels and consumers can legitimately get access to content. While foreign rightsholders are seeking enhanced protection for their interests, increasing enforcement is likely to increase their ability to engage in lucrative geographical price-discrimination, particularly for premium content. This is only likely to increase the degree to which Australian consumers feel that their interests are not being met and, consequently, to further undermine the legitimacy of copyright law. If consumers are to respect copyright law, increasing sanctions for infringement without enhancing access and competition in legitimate distribution channels could be dangerously counter-productive. We suggest that rightsholders’ best strategy for addressing infringement in Australia at this time is to ensure that Australians can access copyright goods in a timely, affordable, convenient, and fair lawful manner.
Abstract:
Agent-based modeling and simulation (ABMS) may fit well with entrepreneurship research and practice because the core concepts and basic premises of entrepreneurship coincide with the characteristics of ABMS. However, it is difficult to find cases where ABMS is applied to entrepreneurship research. To apply ABMS to entrepreneurship and organization studies, designing a conceptual model is important; thus, to effectively design a conceptual model, various mixed method approaches are being attempted. As a new mixed method approach to ABMS, this study proposes a bibliometric approach to designing agent-based models, which establishes and analyzes a domain corpus. This study presents an example of the venture creation process using the bibliometric approach. This example shows us that the results of the multi-agent simulations of the venturing process based on the bibliometric approach are close to each nation’s surveyed data on venturing activities. In conclusion, with the bibliometric approach proposed in this study, all the agents and the agents’ behaviors related to a phenomenon can be extracted effectively, and a conceptual model for ABMS can be designed with the agents and their behaviors. This study contributes to entrepreneurship and organization studies by promoting the application of ABMS.
Abstract:
This digital poster (which was on display at "The Cube", Queensland University of Technology) demonstrates how specification parameters can be extracted from a product library repository for use in augmenting the information contents of the objects in a local BIM tool (Revit in this instance).
Abstract:
Cold-formed steel members are widely used in load bearing Light gauge steel frame (LSF) wall systems with plasterboard linings on both sides. However, these thin-walled steel sections heat up quickly and lose their strength under fire conditions despite the protection provided by plasterboards. Hence there is a need for simple fire design rules to predict their load capacities and fire resistance ratings. During fire events, the LSF wall studs are subjected to non-uniform temperature distributions that cause thermal bowing, neutral axis shift and magnification effects, thus resulting in a combined axial compression and bending action on the LSF wall studs. In this research a series of full scale fire tests was conducted first to evaluate the performance of LSF wall systems with eight different wall configurations under standard fire conditions. Finite element models of LSF walls were then developed, analysed under transient and steady state conditions, and validated using full scale fire tests. Using the results from fire tests and finite element analyses, a detailed investigation was undertaken into the prediction of axial compression strength and failure times of LSF wall studs in standard fires using the available fire design rules based on Australian, American and European standards. The results from both fire tests and finite element analyses were used to investigate the ability of these fire design rules to include the complex effects of non-uniform temperature distributions and their accuracy in predicting the axial compression strengths of wall studs and the failure times. Suitable modifications were then proposed to the fire design rules. This paper presents the details and results of this investigation into the accuracy of currently available fire design rules for LSF walls.
Abstract:
Light Gauge Steel Framing (LSF) walls made of cold-formed and thin-walled steel lipped channel studs with plasterboard linings on both sides are commonly used in commercial, industrial and residential buildings. However, there is limited data about their structural and thermal performance under fire conditions. Recent research at the Queensland University of Technology has investigated the structural and thermal behaviour of load bearing LSF wall systems. In this research a series of full scale fire tests was conducted first to evaluate the performance of LSF wall systems with eight different wall configurations under standard fire conditions. Finite element models of LSF walls were then developed, analysed under transient and steady state conditions, and validated using full scale fire tests. This paper presents the details of an investigation into the fire performance of LSF wall panels based on an extensive parametric study using finite element analysis. The LSF wall panels with eight different plasterboard-insulation configurations were considered under standard fire conditions. Effects of varying steel grades, steel thicknesses, screw spacing, plasterboard restraint, insulation materials and load ratio on the fire performance of LSF walls were investigated, and the results of extensive fire performance data are presented in the form of load ratio versus time and critical hot flange (failure) temperature curves.
Abstract:
The purpose of this book by two Australian authors is to: introduce the audience to the full complement of contextual elements found within program theory; offer practical suggestions to engage with theories of change, theories of action and logic models; and provide substantial evidence for this approach through scholarly literature, practice case studies together with the authors' combined experience of 60 years.
Abstract:
Building information models have created a paradigm shift in how buildings are built and managed by providing a dynamic repository for building data that is useful in many new operational scenarios. This change has also created an opportunity to use building information models as an integral part of security operations and especially as a tool to facilitate fine-grained access control to building spaces in smart buildings and critical infrastructure environments. In this paper, we identify the requirements for a security policy model for such an access control system and discuss why the existing policy models are not suitable for this application. We propose a new policy language extension to XACML, with BIM specific data types and functions based on the IFC specification, which we call BIM-XACML.
Abstract:
Building information models are increasingly being utilised for facility management of large facilities such as critical infrastructures. In such environments, it is valuable to utilise the vast amount of data contained within the building information models to improve access control administration. The use of building information models in access control scenarios can provide 3D visualisation of buildings as well as many other advantages such as automation of essential tasks including path finding, consistency detection, and accessibility verification. However, there is no mathematical model for building information models that can be used to describe and compute these functions. In this paper, we show how graph theory can be utilised as a representation language of building information models and the proposed security related functions. This graph-theoretic representation allows for mathematically representing building information models and performing computations using these functions.
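The graph-theoretic representation the paper proposes can be sketched in miniature: spaces become nodes, doorways become edges, and functions such as path finding reduce to standard graph algorithms. The room names and connectivity below are hypothetical, not drawn from any actual building model.

```python
# Sketch: a building information model reduced to a graph of spaces
# (nodes) and doorways (edges), with breadth-first path finding as
# one of the security-related functions mentioned in the abstract.
# Room names and layout are hypothetical.
from collections import deque

def shortest_path(adjacency, start, goal):
    """BFS over the space-connectivity graph; returns a room list or None."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adjacency[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # goal unreachable from start

building = {
    "Lobby": ["Corridor"],
    "Corridor": ["Lobby", "Office", "ServerRoom"],
    "Office": ["Corridor"],
    "ServerRoom": ["Corridor"],
}
print(shortest_path(building, "Lobby", "ServerRoom"))
```

Consistency detection and accessibility verification can be framed the same way, e.g. as connectivity checks over the subgraph of spaces a given credential may enter.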