Abstract:
Fire safety has become an important part of structural design due to the ever-increasing losses of property and life in fires. Conventionally, the fire rating of load-bearing wall systems made of Light gauge Steel Frames (LSF) is determined using fire tests based on the standard time-temperature curve in ISO 834 [1]. However, modern commercial and residential buildings make extensive use of thermoplastic materials, which means considerably higher fuel loads. Hence a detailed research study into the fire performance of LSF walls was undertaken using realistic design fire curves developed from the Eurocode parametric [2] and Barnett's BFD [3] curves, based on both full-scale fire tests and numerical studies. It included LSF walls without cavity insulation as well as the recently developed externally insulated composite panel system. This paper presents the details of finite element models developed to simulate the full-scale fire tests of LSF wall panels under realistic design fires. The models were analysed under both transient and steady state fire conditions using the measured stud time-temperature curves: transient state analyses simulated the fire test conditions, while steady state analyses produced the load ratio versus time and failure temperature curves of LSF walls. Details of the developed finite element models and the results, including the axial deformation and lateral deflection versus time curves and the stud failure modes and times, are presented. Comparison with fire test results demonstrates the ability of the developed finite element models to predict the performance and fire resistance ratings of LSF walls under realistic design fires.
Abstract:
Computational models represent a highly suitable framework not only for testing biological hypotheses and generating new ones, but also for optimising experimental strategies. As one surveys the literature devoted to cancer modelling, it is obvious that immense progress has been made in applying simulation techniques to the study of cancer biology, although the full impact has yet to be realised. For example, there are excellent models to describe cancer incidence rates or factors for early disease detection, but these predictions are unable to explain the functional and molecular changes that are associated with tumour progression. In addition, it is crucial that interactions between mechanical effects and intracellular and intercellular signalling are incorporated in order to understand cancer growth, its interaction with the extracellular microenvironment and the invasion of secondary sites. There is a compelling need to tailor new, physiologically relevant in silico models specialised for particular types of cancer, such as ovarian cancer with its unique route of metastasis; such models should be capable of investigating anti-cancer therapies and of generating both qualitative and quantitative predictions. This Commentary focuses on how computational simulation approaches can advance our understanding of ovarian cancer progression and treatment, in particular with the help of multicellular cancer spheroids, and thus inform biological hypotheses and experimental design.
Abstract:
Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited, and such uncertainty is often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining the epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling approach. The study results confirmed that WLSR, assuming probability-distributed data, provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly adopted in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, which are primarily influenced by surface characteristics and rainfall intensity.
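As a purely illustrative aside (not the authors' model), the sketch below fits a weighted linear regression by Gibbs sampling with conjugate priors, the general approach the abstract describes; the wash-off data, covariate, weights, and priors are all assumed for demonstration.

```python
import numpy as np

# Hypothetical wash-off data: y = log pollutant load, x = rainfall intensity.
rng = np.random.default_rng(0)
n = 50
x = rng.uniform(10, 80, n)                      # rainfall intensity (mm/h), assumed
X = np.column_stack([np.ones(n), x])            # design matrix with intercept
beta_true = np.array([0.5, 0.03])
w = 1.0 / x                                     # assumed WLSR weights (illustrative)
y = X @ beta_true + rng.normal(0, 0.2 / np.sqrt(w))   # noise variance = sigma^2 / w_i

# Gibbs sampler for weighted linear regression with vague conjugate priors:
#   y_i ~ N(x_i' beta, sigma^2 / w_i),  beta ~ N(0, 100 I),  sigma^2 ~ IG(a0, b0)
W = np.diag(w)
a0, b0 = 0.01, 0.01
B0_inv = np.eye(2) / 100.0
sigma2 = 1.0
draws = []
for it in range(5000):
    # Full conditional for beta | sigma^2, y: multivariate normal.
    V = np.linalg.inv(B0_inv + X.T @ W @ X / sigma2)
    m = V @ (X.T @ W @ y / sigma2)
    beta = rng.multivariate_normal(m, V)
    # Full conditional for sigma^2 | beta, y: inverse-gamma.
    resid = y - X @ beta
    a = a0 + n / 2.0
    b = b0 + 0.5 * (w * resid**2).sum()
    sigma2 = 1.0 / rng.gamma(a, 1.0 / b)
    if it >= 1000:                              # discard burn-in
        draws.append(np.r_[beta, sigma2])

draws = np.array(draws)
print("posterior means (b0, b1, sigma^2):", draws.mean(axis=0))
```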
Abstract:
This research investigated the metaphor of biological evolution as a way of solving architectural design problems. Drawing on fields such as language grammars, algorithms and cellular biology, the thesis examined ways of encoding design information for processing. The aim of this work is to help build software that supports the architectural design process and allows designers to examine more variations.
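As a generic, hedged illustration of the evolutionary metaphor (not the encoding scheme developed in the thesis), the following toy genetic algorithm evolves a bit-string "design genome" against an assumed fitness function:

```python
import random

# Toy genetic algorithm: evolve a bit-string "design genome" toward an
# assumed target pattern. The encoding and fitness function stand in,
# purely for illustration, for encoded architectural design information.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]

def fitness(genome):
    # Count positions that match the target.
    return sum(g == t for g, t in zip(genome, TARGET))

def crossover(a, b):
    # Single-point crossover between two parent genomes.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.05):
    # Flip each bit with a small probability.
    return [1 - g if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break
    parents = population[:10]                   # truncation selection
    offspring = [mutate(crossover(random.choice(parents),
                                  random.choice(parents)))
                 for _ in range(10)]
    population = parents + offspring

print("generation:", generation, "best genome:", population[0])
```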
Abstract:
Electricity network investment and asset management require accurate estimation of future energy consumption within specified service areas. For this purpose, simple models are typically developed to predict future trends in electricity consumption using various methods and assumptions. This paper presents a statistical model to predict electricity consumption in the residential sector at the Census Collection District (CCD) level across the state of New South Wales, Australia, based on spatial building and household characteristics. Residential household demographic and building data from the Australian Bureau of Statistics (ABS) and actual electricity consumption data from electricity companies are merged for 74% of the 12,000 CCDs in the state. Eighty percent of the merged dataset is randomly set aside to establish the model using regression analysis, and the remaining 20% is used to independently test the accuracy of the model prediction against actual consumption. In 90% of the cases, the predicted consumption is shown to be within 5 kWh per dwelling per day of the actual values, with an overall state-wide error of -1.15%. Given a future scenario with a shift in climate zone and a growth in population, the model is used to identify the geographical or service areas most likely to see increased electricity consumption. Such geographical representation can be of great benefit when assessing alternatives to centralised energy generation; the model provides a quantifiable method for selecting the 'most' appropriate system when a review or upgrade of the network infrastructure is required.
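The workflow described above (an 80/20 split, then checking how many predictions fall within 5 kWh per dwelling per day) might be sketched as follows; the dataset, predictors, and coefficients are stand-ins, not the study's data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Hypothetical stand-in for the merged ABS/consumption dataset:
# rows are CCDs, columns are assumed building/household predictors.
rng = np.random.default_rng(1)
n = 9000                                        # roughly 74% of 12,000 CCDs
X = rng.normal(size=(n, 4))                     # e.g. dwellings, occupancy, ...
y = 18 + X @ np.array([2.0, -1.0, 0.5, 1.5]) + rng.normal(0, 2, n)

# 80% to fit the regression, 20% held out for independent testing.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
pred = model.predict(X_te)

# Share of CCDs predicted within 5 kWh/dwelling/day, and overall error.
within_5 = np.mean(np.abs(pred - y_te) <= 5.0)
overall_error = (pred.sum() - y_te.sum()) / y_te.sum() * 100
print(f"within +/-5 kWh: {within_5:.1%}, overall error: {overall_error:+.2f}%")
```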
Abstract:
Security models for two-party authenticated key exchange (AKE) protocols have developed over time to prove the security of AKE protocols even when the adversary learns certain secret values. In this work, we address a more granular form of leakage: partial leakage of the long-term secrets of protocol principals, even after the session key is established. We introduce a generic key exchange security model that can be instantiated to allow bounded or continuous leakage, even when the adversary learns certain ephemeral secrets or session keys. Our model is the strongest known partial-leakage-based security model for key exchange protocols. We propose a generic construction of a two-pass leakage-resilient key exchange protocol that is secure in the proposed model by introducing a new concept: the leakage-resilient NAXOS trick. We identify a special property of public-key cryptosystems, pair generation indistinguishability, and show how to obtain the leakage-resilient NAXOS trick from a pair-generation-indistinguishable leakage-resilient public-key cryptosystem.
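For background, the classic NAXOS trick, which the paper extends to a leakage-resilient form, derives the ephemeral Diffie-Hellman exponent by hashing the ephemeral and long-term secrets together; the toy sketch below illustrates only this classic trick, with all group parameters assumed and deliberately undersized:

```python
import hashlib
import secrets

# Toy illustration of the classic NAXOS trick (not the paper's
# leakage-resilient variant): the ephemeral DH exponent is derived as
# H1(esk || lsk), so exposing the stored ephemeral secret esk alone
# does not reveal the exponent actually used. The modulus here is a
# small Mersenne prime chosen for demonstration only -- NOT a group
# suitable for real-world use.
p = 2**127 - 1
g = 5

def naxos_exponent(esk: bytes, lsk: bytes) -> int:
    """Derive the real ephemeral exponent as H1(esk || lsk)."""
    digest = hashlib.sha256(b"H1" + esk + lsk).digest()
    return int.from_bytes(digest, "big") % p

lsk = secrets.token_bytes(32)        # long-term secret key
esk = secrets.token_bytes(32)        # ephemeral secret, stored per-session
x = naxos_exponent(esk, lsk)         # the exponent itself is never stored
X = pow(g, x, p)                     # ephemeral public value sent to the peer
print(hex(X))
```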
Abstract:
The use of graphics processing unit (GPU) parallel processing is becoming part of mainstream statistical practice. The reliance of Bayesian statistics on Markov chain Monte Carlo (MCMC) methods makes the applicability of parallel processing not immediately obvious. It is illustrated that there are substantial gains in computational time for MCMC and other methods of evaluation by computing the likelihood using GPU parallel processing. Examples use data from the Global Terrorism Database to model terrorist activity in Colombia from 2000 through 2010, with a likelihood based on the explicit convolution of two negative-binomial processes. Results show decreases in computational time by a factor of over 200. Factors influencing these improvements and guidelines for programming parallel implementations of the likelihood are discussed.
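A CPU-only sketch of a convolved negative-binomial likelihood may clarify why it parallelises well: each observation's convolution sum is an independent reduction, exactly the structure a GPU exploits. The parameters and data below are assumed for illustration:

```python
import numpy as np
from scipy.stats import nbinom

def convolved_nb_loglik(y, r1, p1, r2, p2):
    """Log-likelihood of counts y under Y = Y1 + Y2 with
    Y1 ~ NB(r1, p1) and Y2 ~ NB(r2, p2), via explicit convolution:
    P(Y = y) = sum_{j=0}^{y} P(Y1 = j) * P(Y2 = y - j).
    Each observation's sum over j is an independent, embarrassingly
    parallel reduction -- the structure exploited on a GPU."""
    ll = 0.0
    for yi in np.asarray(y, dtype=int):
        j = np.arange(yi + 1)
        terms = nbinom.pmf(j, r1, p1) * nbinom.pmf(yi - j, r2, p2)
        ll += np.log(terms.sum())
    return ll

# Assumed toy data standing in for weekly incident counts.
rng = np.random.default_rng(2)
y = (rng.negative_binomial(3, 0.4, size=100)
     + rng.negative_binomial(2, 0.6, size=100))
print(convolved_nb_loglik(y, 3, 0.4, 2, 0.6))
```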
Abstract:
Insulated rail joints are critical for train safety as they control electrical signalling systems; unfortunately, they exhibit excessive ratchetting of the railhead near the endpost insulators. This paper reports a three-dimensional global model of these joints under wheel–rail contact pressure loading, and a sub-model examining the ratchetting failures of the railhead. The sub-model employs a non-linear isotropic–kinematic elastic–plastic material model and predicts the stress/strain levels in the localised railhead zone adjacent to the endpost, which is placed in the air gap between the two rail ends at the insulated rail joint. The equivalent plastic strain plot is used to adequately capture the progressive railhead damage. Associated field and laboratory test results on damage to the railhead material suggest that the simulation results are reasonable.
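For reference, the equivalent plastic strain used to track such progressive damage is conventionally defined as the accumulated measure below (this is the standard definition, not a formula given in the abstract):

```latex
\bar{\varepsilon}^{\,pl} = \int_{0}^{t}
\sqrt{\tfrac{2}{3}\,\dot{\boldsymbol{\varepsilon}}^{pl} : \dot{\boldsymbol{\varepsilon}}^{pl}}\; dt
```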
Abstract:
For clinical use, it is important in electrocardiogram (ECG) signal analysis to detect not only the centres of the P wave, the QRS complex and the T wave, but also the time intervals between them, such as the ST segment. Much research has focused entirely on QRS complex detection, via methods such as wavelet transforms, spline fitting and neural networks. However, drawbacks include the false classification of a severe noise spike as a QRS complex, possibly requiring manual editing, and the omission of information contained in other regions of the ECG signal. While some attempts have been made to develop algorithms that detect additional signal characteristics, such as P and T waves, the reported success rates vary from person to person and from beat to beat. To address this variability, we propose the use of Markov chain Monte Carlo (MCMC) statistical modelling to extract the key features of an ECG signal, and we report on a feasibility study investigating the utility of the approach. The modelling approach is examined with reference to a realistic computer-generated ECG signal in which details such as wave morphology and noise levels are variable.
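As a minimal sketch of the idea, the code below uses random-walk Metropolis (a standard MCMC variant) to estimate the centre and width of a single Gaussian-shaped wave in a noisy synthetic trace; a full ECG model would use several such components, and everything here is assumed for illustration:

```python
import numpy as np

# Synthetic trace: one Gaussian-shaped wave (standing in for, say, a
# P wave) plus noise. All parameters are assumed for demonstration.
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 200)
true_mu, true_s = 0.3, 0.04
y = np.exp(-0.5 * ((t - true_mu) / true_s) ** 2) + rng.normal(0, 0.05, t.size)

def log_post(mu, s, sigma=0.05):
    # Gaussian likelihood with flat, bounded priors on (mu, s).
    if not (0 < mu < 1 and 0.005 < s < 0.2):
        return -np.inf
    model = np.exp(-0.5 * ((t - mu) / s) ** 2)
    return -0.5 * np.sum((y - model) ** 2) / sigma**2

mu, s = 0.5, 0.05
lp = log_post(mu, s)
samples = []
for it in range(20000):
    # Random-walk proposal, then Metropolis accept/reject.
    mu_p, s_p = mu + rng.normal(0, 0.01), s + rng.normal(0, 0.002)
    lp_p = log_post(mu_p, s_p)
    if np.log(rng.uniform()) < lp_p - lp:
        mu, s, lp = mu_p, s_p, lp_p
    if it >= 5000:                              # discard burn-in
        samples.append((mu, s))

print("posterior mean (mu, s):", np.mean(samples, axis=0))
```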
Abstract:
Spatially explicit modelling of grassland classes is important for site-specific planning aimed at improving grassland and environmental management over large areas. In this study, a climate-based grassland classification model, the Comprehensive and Sequential Classification System (CSCS), was integrated with spatially interpolated climate data to classify grassland in Gansu province, China. The study area is characterised by complex topographic features imposed by plateaus, high mountains, basins and deserts. To improve the quality of the interpolated climate data and of the spatial classification over this complex topography, three regression-based methods for interpolating climate variables were evaluated: an analytic method based on multiple regression and residues (AMMRR); a modification of AMMRR that adds the effects of slope and aspect to the interpolation analysis (M-AMMRR); and a method that replaces the inverse distance weighting (IDW) approach for residue interpolation in M-AMMRR with ordinary kriging (I-AMMRR). The outcomes from the best interpolation method were then used in the CSCS model to classify the grassland in the study area. The interpolated climate variables were the annual cumulative temperature and the annual total precipitation. The results indicated that AMMRR and M-AMMRR generated acceptable climate surfaces, but the best model fit and cross-validation results were achieved by I-AMMRR. Twenty-six grassland classes were identified for the study area. The four grassland vegetation classes covering more than half of the total study area were "cool temperate-arid temperate zonal semi-desert", "cool temperate-humid forest steppe and deciduous broad-leaved forest", "temperate-extra-arid temperate zonal desert", and "frigid per-humid rain tundra and alpine meadow". The vegetation classification map generated in this study provides spatial information on the locations and extents of the different grassland classes. This information can be used to facilitate government agencies' decision-making in land-use planning and environmental management, and for vegetation and biodiversity conservation. It can also assist land managers in estimating safe carrying capacities, helping to prevent overgrazing and land degradation.
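The regression-plus-residual-interpolation idea behind the AMMRR family can be sketched as follows, using inverse distance weighting for the residual step (I-AMMRR would substitute ordinary kriging); the station data and covariates are invented for illustration:

```python
import numpy as np

# Sketch of regression plus residual interpolation: regress a climate
# variable on topographic covariates, then spatially interpolate the
# regression residuals and add them back to the trend. Station data,
# extent, and coefficients are assumed for illustration only.
rng = np.random.default_rng(4)
n = 60
lon, lat = rng.uniform(93, 109, n), rng.uniform(32, 43, n)   # rough Gansu extent
elev = rng.uniform(800, 4500, n)                             # station elevation (m)
temp = 25 - 0.0065 * elev - 0.5 * (lat - 32) + rng.normal(0, 0.8, n)

# Step 1: multiple regression on covariates (lon, lat, elevation).
X = np.column_stack([np.ones(n), lon, lat, elev])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
resid = temp - X @ beta

# Step 2: IDW interpolation of the residuals at a target location.
def idw(lon0, lat0, power=2.0):
    d = np.hypot(lon - lon0, lat - lat0)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return np.sum(w * resid) / np.sum(w)

# Step 3: prediction = regression trend + interpolated residual.
x0 = np.array([1.0, 100.0, 36.0, 2000.0])
print(x0 @ beta + idw(100.0, 36.0))
```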
Abstract:
This chapter addresses data modelling as a means of promoting statistical literacy in the early grades. Consideration is first given to the importance of increasing young children's exposure to statistical reasoning experiences and to how data modelling can be a rich means of doing so. Selected components of data modelling are then reviewed, followed by a report on findings from the third year of a three-year longitudinal study across grades one through three.
Abstract:
Management of the industrial nations' hazardous waste is a pressing and rapidly growing global threat. Improved environmental information must be obtained and managed concerning the current status, temporal dynamics and potential future status of these critical sites. To test the application of spatial environmental techniques to the problem of hazardous waste sites, a Superfund (CERCLA) test site was chosen in an industrial/urban valley experiencing severe TCE, PCE and CTC groundwater contamination. A paradigm is presented for investigating the spatial/environmental tools available for mapping, monitoring and modelling the environment and its toxic contaminant plumes. This model incorporates a range of technical issues concerning the collection of data as augmented by remote sensing tools, the format and storage of data using geographic information systems, and the analysis and modelling of the environment through advanced GIS analysis algorithms and geophysical models of hydrologic transport, including statistical surface generation. This spatially based approach is evaluated against current government/industry standards of operation. Advantages of the spatial approach and lessons learned are discussed.
Abstract:
Non-profit organizations (NPOs) are major providers of services in many fields of endeavour and often receive financial support from government. This article investigates different forms of government/non-profit funding relationships, mainly, though not exclusively, from the perspective of the non-profit agencies. While a number of typologies of government/NPO relations exist, these are dated and in need of further empirical analysis and testing. The article advances an empirically derived extension to current models of government/NPO relations. A future research agenda is outlined based on the constructs that underpin typologies, rather than on discrete categorization of relationships.
Abstract:
Background: Many studies have found associations between climatic conditions and dengue transmission. However, there is debate about the future impacts of climate change on dengue transmission. This paper reviews the epidemiological evidence on the relationship between climate and dengue, with a focus on quantitative methods for assessing the potential impacts of climate change on global dengue transmission.

Methods: A literature search was conducted in October 2012 using the electronic databases PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search focused on peer-reviewed journal articles published in English from January 1991 through October 2012.

Results: Sixteen studies met the inclusion criteria. Most showed that the transmission of dengue is highly sensitive to climatic conditions, especially temperature, rainfall and relative humidity. Studies on the potential impacts of climate change on dengue indicate increased climatic suitability for transmission and an expansion of the geographic regions at risk during this century. A variety of quantitative modelling approaches were used, and several key methodological issues and current knowledge gaps were identified through this review.

Conclusions: It is important to assemble spatio-temporal patterns of dengue transmission compatible with long-term data on climate and other socio-ecological changes, as this would advance projections of dengue risks associated with climate change.

Keywords: Climate; Dengue; Models; Projection; Scenarios
Abstract:
A numerical investigation of the behaviour of fuel injection through a porous surface in an inlet-fuelled, radical-farming scramjet is presented. The performance of porous fuel injection is compared with discrete porthole injection at an equivalence ratio of φ ≈ 0.4 for both cases. The comparison is performed at a Mach 6.5 flow condition with a total specific enthalpy of 4.3 MJ/kg. The numerical results are compared with experiments performed in the T4 shock tunnel where available. The results demonstrate, for the first time, that porous fuel injection has the potential to outperform porthole injectors in scramjet engines in terms of fuel-air mixing, ignition delays and achievable combustion efficiencies, despite reduced fuel penetration heights.
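For reference, the equivalence ratio quoted above is the fuel-to-air mass ratio normalised by its stoichiometric value:

```latex
\phi = \frac{\dot{m}_{\mathrm{fuel}} / \dot{m}_{\mathrm{air}}}
{\left(\dot{m}_{\mathrm{fuel}} / \dot{m}_{\mathrm{air}}\right)_{\mathrm{stoich}}}
```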