975 results for product modelling
Abstract:
Fire safety has become an important consideration in structural design due to the ever-increasing loss of property and life during fires. Conventionally, the fire rating of load-bearing wall systems made of light gauge steel frames (LSF) is determined using fire tests based on the standard time-temperature curve in ISO 834 [1]. However, modern commercial and residential buildings make use of thermoplastic materials, which means considerably higher fuel loads. Hence a detailed research study into the fire performance of LSF walls was undertaken using realistic design fire curves developed from the Eurocode parametric [2] and Barnett's BFD [3] curves, based on both full scale fire tests and numerical studies. It included LSF walls without cavity insulation as well as the recently developed externally insulated composite panel system. This paper presents the details of finite element models developed to simulate the full scale fire tests of LSF wall panels under realistic design fires. The models were analysed under both transient and steady state fire conditions using the measured stud time-temperature curves: transient state analyses were performed to simulate fire test conditions, while steady state analyses were performed to obtain the load ratio versus time and failure temperature curves of LSF walls. Details of the developed finite element models and the results, including the axial deformation and lateral deflection versus time curves and the stud failure modes and times, are presented in this paper. Comparison with fire test results demonstrates the ability of the developed finite element models to predict the performance and fire resistance ratings of LSF walls under realistic design fires.
Abstract:
Computational models represent a highly suitable framework, not only for testing biological hypotheses and generating new ones but also for optimising experimental strategies. As one surveys the literature devoted to cancer modelling, it is obvious that immense progress has been made in applying simulation techniques to the study of cancer biology, although the full impact has yet to be realised. For example, there are excellent models to describe cancer incidence rates or factors for early disease detection, but these predictions are unable to explain the functional and molecular changes associated with tumour progression. In addition, it is crucial that interactions between mechanical effects and intracellular and intercellular signalling are incorporated in order to understand cancer growth, its interaction with the extracellular microenvironment, and invasion of secondary sites. There is a compelling need to tailor new, physiologically relevant in silico models specialised for particular types of cancer (such as ovarian cancer, owing to its unique route of metastasis) that are capable of investigating anti-cancer therapies and generating both qualitative and quantitative predictions. This Commentary focuses on how computational simulation approaches can advance our understanding of ovarian cancer progression and treatment, in particular with the help of multicellular cancer spheroids, and thus inform biological hypotheses and experimental design.
Abstract:
Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited, and such uncertainty is often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining the epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling approach. The study results confirmed that WLSR, assuming probability-distributed data, provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly adopted in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, primarily influenced by surface characteristics and rainfall intensity.
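To make the regression framework above concrete, the sketch below implements a minimal Gibbs sampler for Bayesian weighted least squares under conjugate normal/inverse-gamma priors. The priors, weights and variable names are illustrative assumptions; the paper's actual wash-off model (with coefficient k) is not reproduced here. Setting all weights to one recovers the OLSR case.

```python
import numpy as np

def gibbs_wlsr(X, y, w, n_iter=5000, a0=1e-3, b0=1e-3, seed=0):
    """Gibbs sampler for y = X @ beta + eps, eps_i ~ N(0, sigma2 / w_i).

    Conjugate normal / inverse-gamma updates; w = np.ones(n) gives OLSR.
    Hypothetical formulation for illustration, not the paper's model.
    """
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    XtWX = X.T @ (w[:, None] * X)
    XtWy = X.T @ (w * y)
    mu = np.linalg.solve(XtWX, XtWy)      # weighted least squares mean
    XtWX_inv = np.linalg.inv(XtWX)
    sigma2 = 1.0
    betas, sigma2s = [], []
    for _ in range(n_iter):
        # beta | sigma2, y ~ N(mu, sigma2 * (X'WX)^-1)
        beta = rng.multivariate_normal(mu, sigma2 * XtWX_inv)
        # sigma2 | beta, y ~ Inv-Gamma(a0 + n/2, b0 + weighted SSE / 2)
        resid = y - X @ beta
        sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + np.sum(w * resid**2) / 2))
        betas.append(beta)
        sigma2s.append(sigma2)
    return np.array(betas), np.array(sigma2s)
```

The spread of the sampled coefficients then provides the uncertainty bands around predicted wash-off values; under heteroscedastic data, the weights let WLSR attribute more of that spread to the poorly observed events.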
Abstract:
This research examined the metaphor of biological evolution as a way of solving architectural design problems. Drawing from fields such as language grammars, algorithms and cellular biology, the thesis investigated ways of encoding design information for processing. The aim of this work was to help in building software that supports the architectural design process and allows designers to examine more variations.
Abstract:
Past approaches adopted by scholars in comparing international news have tended to concentrate on political and economic perspectives, while the role that culture plays in determining news has been somewhat neglected until recently. This article examines the role of culture in the development of journalistic practices and how a value systems approach can be applied to understanding journalism practices across cultures. Specifically, the article compares German and Anglo-American journalism practices with a view to locating differences between these traditions. The study demonstrates that using value systems as developed by Dutch anthropologist Geert Hofstede can be immensely useful in comparing the differences between the two traditions, as well as in understanding how journalists in these traditions report on the world.
Abstract:
Electricity network investment and asset management require accurate estimation of future demand for energy consumption within specified service areas. For this purpose, simple models are typically developed to predict future trends in electricity consumption using various methods and assumptions. This paper presents a statistical model to predict electricity consumption in the residential sector at the Census Collection District (CCD) level across the state of New South Wales, Australia, based on spatial building and household characteristics. Residential household demographic and building data from the Australian Bureau of Statistics (ABS) and actual electricity consumption data from electricity companies are merged for 74% of the 12,000 CCDs in the state. Eighty percent of the merged dataset is randomly set aside to establish the model using regression analysis, and the remaining 20% is used to independently test the accuracy of model prediction against actual consumption. In 90% of the cases, the predicted consumption is shown to be within 5 kWh per dwelling per day of actual values, with an overall state accuracy of -1.15%. Given a future scenario with a shift in climate zone and growth in population, the model is used to identify the geographical or service areas most likely to have increased electricity consumption. Such geographical representation can be of great benefit when assessing alternatives to the centralised generation of energy; having such a model gives a quantifiable method for selecting the 'most' appropriate system when a review or upgrade of the network infrastructure is required.
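The 80/20 establish-and-test procedure described above is a standard hold-out validation. The following is a minimal sketch of that workflow using scikit-learn; the predictors are synthetic stand-ins for the ABS census and building attributes, since the actual variables and coefficients are not given in the abstract.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Hypothetical predictors standing in for the ABS census / building
# attributes used in the paper (dwelling size, household size, etc.).
rng = np.random.default_rng(0)
X = rng.normal(size=(12000, 4))
y = 18 + X @ np.array([2.0, 1.5, -0.5, 0.8]) + rng.normal(scale=2.5, size=12000)

# 80% of CCDs establish the model; 20% are held out for independent testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = LinearRegression().fit(X_train, y_train)
pred = model.predict(X_test)

# Fraction of held-out CCDs predicted within 5 kWh/dwelling/day of actual,
# mirroring the accuracy metric reported in the abstract.
within_5 = np.mean(np.abs(pred - y_test) <= 5.0)
print(f"within 5 kWh/dwelling/day: {within_5:.1%}")
```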
Abstract:
Security models for two-party authenticated key exchange (AKE) protocols have developed over time to prove security even when the adversary learns certain secret values. In this work, we address more granular leakage: partial leakage of long-term secrets of protocol principals, even after the session key is established. We introduce a generic key exchange security model, which can be instantiated allowing bounded or continuous leakage, even when the adversary learns certain ephemeral secrets or session keys. Our model is the strongest known partial-leakage-based security model for key exchange protocols. We propose a generic construction of a two-pass leakage-resilient key exchange protocol that is secure in the proposed model by introducing a new concept: the leakage-resilient NAXOS trick. We identify a special property of public-key cryptosystems, pair generation indistinguishability, and show how to obtain the leakage-resilient NAXOS trick from a pair-generation-indistinguishable leakage-resilient public-key cryptosystem.
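For context, the original NAXOS trick, which the paper's leakage-resilient variant generalises (the variant itself is not reproduced here), derives the exponent used in the exchange from both the ephemeral and the long-term secret, so exposure of the ephemeral secret alone reveals nothing about it. A minimal sketch, with a hypothetical prime modulus standing in for the group order:

```python
import hashlib
import secrets

# Hypothetical prime modulus standing in for the group order; a real
# protocol would use the order of its chosen Diffie-Hellman group.
q = 2**255 - 19

def naxos_exponent(ephemeral_sk: bytes, longterm_sk: bytes) -> int:
    """Standard NAXOS trick: the effective ephemeral exponent is
    H(esk, lsk), so leaking esk alone does not expose it."""
    h = hashlib.sha256(ephemeral_sk + longterm_sk).digest()
    return int.from_bytes(h, "big") % q

esk = secrets.token_bytes(32)   # per-session ephemeral secret key
lsk = secrets.token_bytes(32)   # long-term secret key
x = naxos_exponent(esk, lsk)    # g^x becomes the ephemeral public value
```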
Abstract:
The car has arguably had more influence on our lifestyle and urban environment than any other consumer product, allowing unprecedented freedom for living, working and recreation where and when we choose. However, problems of pollution, congestion, road trauma, inefficient land use and social inequality are associated with car use. Despite 100 years of design and technology refinements, these problems remain significant and persistent, and many argue that resolving them requires a fundamental redesign of the car. Redesigned vehicles have been proposed, such as the MIT CityCar, and others, such as the Renault Twizy, commercialized. None, however, have successfully brought about significant change, and the study of disruptive innovation offers an explanation for this. Disruptive innovation, by definition, disrupts a market; it also disrupts the product ecosystem. The existing product ecosystem has co-evolved to support the conventional car and is not optimized for the new design, which will require a redesigned ecosystem to support it. A literature review identifies a lack of methodology for identifying the components of product ecosystems and the changes required to implement disruptive innovation. This paper proposes such a methodology based on Design Thinking, Actor Network Theory, Disruptive Innovation and the CityCar scenarios.
Abstract:
The use of graphical processing unit (GPU) parallel processing is becoming a part of mainstream statistical practice. The reliance of Bayesian statistics on Markov chain Monte Carlo (MCMC) methods makes the applicability of parallel processing not immediately obvious. It is illustrated that computing the likelihood using GPU parallel processing yields substantial gains in computational time for MCMC and other methods of evaluation. Examples use data from the Global Terrorism Database to model terrorist activity in Colombia from 2000 through 2010, with a likelihood based on the explicit convolution of two negative-binomial processes. Results show decreases in computational time by a factor of over 200. Factors influencing these improvements and guidelines for programming parallel implementations of the likelihood are discussed.
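As an illustration of the kind of likelihood involved, the sketch below evaluates the log-likelihood of a sum of two independent negative-binomial variables by explicit convolution, vectorised with JAX so that every term of every per-observation convolution can run in parallel on a GPU backend. The parameterisation, truncation point and data are illustrative assumptions; the paper's implementation is not reproduced here.

```python
import jax
import jax.numpy as jnp
from jax.scipy.special import gammaln, logsumexp

def nb_logpmf(k, r, p):
    """log P(X = k) for a negative binomial with size r, success prob p."""
    return (gammaln(k + r) - gammaln(r) - gammaln(k + 1.0)
            + r * jnp.log(p) + k * jnp.log1p(-p))

def conv_nb_logpmf(y, r1, p1, r2, p2, k_max=200):
    """log P(Y = y) for Y = X1 + X2 via explicit convolution over k."""
    k = jnp.arange(k_max + 1.0)
    km = jnp.where(k <= y, y - k, 0.0)   # clamp to avoid NaNs off-support
    terms = nb_logpmf(k, r1, p1) + nb_logpmf(km, r2, p2)
    terms = jnp.where(k <= y, terms, -jnp.inf)  # only k <= y contributes
    return logsumexp(terms)

@jax.jit
def loglik(counts, r1, p1, r2, p2):
    # vmap maps the convolution over all observations, so the whole
    # likelihood is evaluated as one parallel computation on the GPU.
    per_obs = jax.vmap(conv_nb_logpmf, in_axes=(0, None, None, None, None))
    return jnp.sum(per_obs(counts, r1, p1, r2, p2))
```

Speed-ups of this kind arise because each MCMC proposal re-evaluates this single fused parallel computation rather than looping over observations on the CPU.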
Abstract:
Capability development is at the heart of creating competitive advantage. This thesis conceptualises Strategic Capability Development as a renewal of an organisation's existing capability in line with the requirements of the market. It followed and compared four product innovation projects within Iran Khodro Company (IKCO), an exemplar of capability development within the Iranian auto industry. Findings show that the maturation of strategic capability at the organisational level occurred through a sequence of product innovation projects, and by dynamically shaping the learning and knowledge integration processes in accordance with the emergence of a new structure within the industry. Accordingly, Strategic Capability Development is conceptualised in an interpretive model. Such findings are useful for the development of an explanatory model and a practical capability development framework for managing learning and knowledge across different product innovation projects.
Abstract:
The ultraviolet photodissociation of gas-phase N-methylpyridinium ions is studied at room temperature using laser photodissociation mass spectrometry and structurally diagnostic ion-molecule reaction kinetics. The C5H5N-CH3+ (m/z 94), C5H5N-CD3+ (m/z 97) and C5D5N-CH3+ (m/z 99) isotopologues are investigated, and it is shown that the N-methylpyridinium ion photodissociates by loss of methane in the 36,000-43,000 cm⁻¹ (280-230 nm) region. The dissociation likely occurs on the ground state surface following internal conversion from the S1 state. For each isotopologue, by monitoring the photofragmentation yield as a function of photon wavenumber, a broad vibronically featured band is recorded, with origin (0-0) transitions assigned at 38,130, 38,140 and 38,320 cm⁻¹ for C5H5N-CH3+, C5H5N-CD3+ and C5D5N-CH3+, respectively. With the aid of quantum chemical calculations (CASSCF(6,6)/aug-cc-pVDZ), most of the observed vibronic detail is assigned to two in-plane ring deformation modes. Finally, using ion-molecule reactions, the ionic co-product of methane loss at m/z 78 is confirmed as the 2-pyridinylium ion.
Abstract:
Ozone-induced dissociation (OzID) is an alternative ion activation method that relies on the gas-phase ion-molecule reaction between a mass-selected target ion and ozone in an ion trap mass spectrometer. Herein, we evaluated the performance of OzID for both the structural elucidation and the selective detection of conjugated carbon-carbon double bond motifs within lipids. The relative reactivity trends for [M + X]+ ions (where X = Li, Na, K) formed via electrospray ionization (ESI) of conjugated versus nonconjugated fatty acid methyl esters (FAMEs) were examined using two different OzID-enabled linear ion-trap mass spectrometers. Compared with nonconjugated analogues, FAMEs derived from conjugated linoleic acids were found to react up to 200 times faster and to yield characteristic radical cations. The significantly enhanced reactivity of conjugated isomers means that OzID product ions can be observed without invoking a reaction delay in the experimental sequence (i.e., trapping of the ions in the presence of ozone is not required). This possibility has been exploited to undertake neutral-loss scans on a triple quadrupole mass spectrometer targeting characteristic OzID transitions; such analyses reveal the presence of conjugated double bonds in lipids extracted from selected foodstuffs. Finally, by benchmarking the absolute ozone concentration inside the ion trap, second-order rate constants for the gas-phase reactions between unsaturated organic ions and ozone were obtained. These results demonstrate a significant influence of the adducting metal on reaction rate constants, in the order Li > Na > K.
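The kinetics step in the final sentences reduces to a pseudo-first-order treatment: with ozone in large excess, the mass-selected ion signal decays exponentially with trapping time, and dividing the fitted decay rate by the absolute ozone concentration gives the second-order rate constant. A worked sketch with illustrative numbers (not values from the paper):

```python
import numpy as np

# Pseudo-first-order treatment: I(t) = I0 * exp(-k1 * t) with ozone in
# large excess, and k2 = k1 / [O3]. All numbers below are hypothetical.
t = np.array([0.0, 0.05, 0.10, 0.20, 0.40])           # trapping time, s
intensity = np.array([1.00, 0.78, 0.61, 0.37, 0.14])  # normalised ion signal

k1 = -np.polyfit(t, np.log(intensity), 1)[0]  # pseudo-first-order rate, s^-1
o3 = 2.0e11                                   # [O3], molecules cm^-3 (assumed)
k2 = k1 / o3                                  # cm^3 molecule^-1 s^-1
print(f"k1 = {k1:.2f} s^-1, k2 = {k2:.2e} cm^3 molecule^-1 s^-1")
```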
Abstract:
Insulated rail joints are critical for train safety as they control electrical signalling systems; unfortunately, they exhibit excessive ratchetting of the railhead near the endpost insulators. This paper reports a three-dimensional global model of these joints under wheel–rail contact pressure loading, and a sub-model examining the ratchetting failures of the railhead. The sub-model employs a non-linear isotropic–kinematic elastic–plastic material model and predicts stress/strain levels in the localised railhead zone adjacent to the endpost, which is placed in the air gap between the two rail ends at the insulated rail joint. The equivalent plastic strain plot is used to capture the progressive railhead damage. Associated field and laboratory testing of damage to the railhead material suggests that the simulation results are reasonable.
Abstract:
For clinical use, it is important in electrocardiogram (ECG) signal analysis to detect not only the centre of the P wave, the QRS complex and the T wave, but also time intervals such as the ST segment. Much research has focused entirely on QRS complex detection, via methods such as wavelet transforms, spline fitting and neural networks. However, drawbacks include the false classification of a severe noise spike as a QRS complex, possibly requiring manual editing, and the omission of information contained in other regions of the ECG signal. While some attempts have been made to develop algorithms to detect additional signal characteristics, such as P and T waves, the reported success rates vary from person to person and beat to beat. To address this variability we propose the use of Markov chain Monte Carlo statistical modelling to extract the key features of an ECG signal, and we report on a feasibility study investigating the utility of the approach. The modelling approach is examined with reference to a realistic computer-generated ECG signal in which details such as wave morphology and noise levels are variable.
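As an indication of how such an MCMC feature extraction might look, the sketch below fits a toy ECG beat, modelled as three Gaussian bumps for the P wave, QRS complex and T wave, with a random-walk Metropolis sampler. The bump model, flat priors and step sizes are illustrative assumptions, not the study's actual formulation.

```python
import numpy as np

def ecg_model(t, params):
    """Toy ECG beat: three Gaussian bumps (P, QRS, T), each
    parameterised as (amplitude, centre, width)."""
    y = np.zeros_like(t)
    for a, c, w in params.reshape(3, 3):
        y += a * np.exp(-0.5 * ((t - c) / w) ** 2)
    return y

def metropolis(t, signal, init, n_iter=20000, step=0.005, noise_sd=0.05, seed=0):
    """Random-walk Metropolis under an i.i.d. Gaussian noise likelihood
    (flat priors); returns the chain of bump parameters."""
    rng = np.random.default_rng(seed)
    cur = init.copy()
    cur_ll = -0.5 * np.sum((signal - ecg_model(t, cur)) ** 2) / noise_sd**2
    chain = []
    for _ in range(n_iter):
        prop = cur + rng.normal(scale=step, size=cur.size)
        prop_ll = -0.5 * np.sum((signal - ecg_model(t, prop)) ** 2) / noise_sd**2
        if np.log(rng.uniform()) < prop_ll - cur_ll:   # accept/reject
            cur, cur_ll = prop, prop_ll
        chain.append(cur.copy())
    return np.array(chain)

# Synthetic beat: P at 0.20 s, QRS at 0.50 s, T at 0.75 s (arbitrary units).
t = np.linspace(0.0, 1.0, 200)
true = np.array([0.15, 0.20, 0.03, 1.0, 0.50, 0.015, 0.30, 0.75, 0.05])
signal = ecg_model(t, true) + np.random.default_rng(1).normal(scale=0.05, size=t.size)

chain = metropolis(t, signal, init=true + 0.02)
# Posterior means of the centre parameters locate the waves; posterior
# spreads quantify the person-to-person / beat-to-beat variability.
print(chain[5000:, [1, 4, 7]].mean(axis=0))
```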
Abstract:
Contribution to a curated book of Australian 45 rpm single covers. This contribution discusses the possibilities afforded by the technologies available in the late 1970s for the production of music-related artwork: record covers, posters, handbills, etc.