728 results for post-deformation softening modelling


Relevance: 20.00%

Abstract:

During the evolution of the music industry, developments in the media environment have required music firms to adapt in order to survive. Changes in broadcast radio programming during the 1950s; the Compact Cassette during the 1970s; and the deregulation of media ownership during the 1990s are all examples of changes which have heavily affected the music industry. This study explores similar contemporary dynamics, examines how decision makers in the music industry perceive and make sense of the developments, and reveals how they revise their business strategies based on their mental models of the media environment. A qualitative system dynamics model is developed in order to support the reasoning brought forward by the study. The model is empirically grounded, but is also based on previous music industry research and a theoretical platform constituted by concepts from evolutionary economics and the sociology of culture. The empirical data primarily consist of 36 personal interviews with decision makers in the American, British and Swedish music industry ecosystems. The study argues that the proposed model explains contemporary music industry dynamics more effectively than the music industry models presented by previous research initiatives. Supported by the model, the study is able to show how “new” media outlets make old music business models obsolete and challenge the industry’s traditional power structures. It is no longer possible to expose music at one outlet (usually broadcast radio) in the hope that it will lead to sales of the same music at another (e.g. a compact disc). The study shows that many music industry decision makers still have not embraced the new logic and have not yet challenged their traditional mental models of the media environment. Rather, they remain focused on preserving the pivotal role held by the CD and other physical distribution technologies. Further, the study shows that while many music firms remain attached to the old models, other firms, primarily music publishers, have accepted the transformation and have reluctantly recognised the realities of a virtualised environment.

Relevance: 20.00%

Abstract:

The world is increasingly moving towards more open models of publishing and communication. The UK government has demonstrated a firm commitment to ensuring that academic research outputs are made available to all who might benefit from access to them, and its open access policy attempts to make academic publications freely available to readers, rather than locked behind paywalls or available only to researchers with access to well-funded university libraries. Open access policies have an important role to play in fostering an open innovation ecosystem and ensuring that maximum value is derived from investments in university-based research. But are we ready to embrace this change?

Relevance: 20.00%

Abstract:

The aim of this study was to validate the Children’s Eating Behaviour Questionnaire (CEBQ) in three ethnically and culturally diverse samples of mothers in Australia. Confirmatory factor analysis utilising structural equation modelling examined whether the established 8-factor model of the CEBQ was supported in our three populations: (i) a community sample of first-time mothers allocated to the control group of the NOURISH trial (mean child age = 24 months [SD = 1]; N = 244); (ii) a sample of immigrant Indian mothers of children aged 1–5 years (mean age = 34 months [SD = 14]; N = 203), and (iii) a sample of immigrant Chinese mothers of children aged 1–4 years (mean age = 36 months [SD = 14]; N = 216). The original 8-factor model provided an acceptable fit to the data in the NOURISH sample with minor post hoc re-specifications (two error covariances on Satiety Responsiveness and an item-factor covariance to account for a cross-loading of an item (Fussiness) on Satiety Responsiveness). The re-specified model showed reasonable fit in both the Indian and Chinese samples. Cronbach’s α estimates ranged from .73 to .91 in the Australian sample and .61–.88 in the immigrant samples. This study supports the appropriateness of the CEBQ in the multicultural Australian context.
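
The subscale reliabilities reported above are Cronbach's α values; as a minimal illustration of how such an estimate is computed from item-level responses, the sketch below uses a small made-up score matrix (the item counts and values are assumptions, not CEBQ data).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items in the subscale
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative example: 6 respondents answering a 4-item subscale on a 1-5 scale.
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [1, 2, 1, 2],
    [4, 4, 4, 3],
    [3, 3, 2, 3],
])
print(round(cronbach_alpha(scores), 2))
```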

Relevance: 20.00%

Abstract:

Computational models represent a highly suitable framework, not only for testing biological hypotheses and generating new ones but also for optimising experimental strategies. As one surveys the literature devoted to cancer modelling, it is obvious that immense progress has been made in applying simulation techniques to the study of cancer biology, although the full impact has yet to be realised. For example, there are excellent models to describe cancer incidence rates or factors for early disease detection, but these predictions are unable to explain the functional and molecular changes that are associated with tumour progression. In addition, it is crucial that interactions between mechanical effects, and intracellular and intercellular signalling are incorporated in order to understand cancer growth, its interaction with the extracellular microenvironment and invasion of secondary sites. There is a compelling need to tailor new, physiologically relevant in silico models that are specialised for particular types of cancer (such as ovarian cancer, owing to its unique route of metastasis), that are capable of investigating anti-cancer therapies, and that generate both qualitative and quantitative predictions. This Commentary will focus on how computational simulation approaches can advance our understanding of ovarian cancer progression and treatment, in particular with the help of multicellular cancer spheroids, and thus can inform biological hypotheses and experimental design.
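
As a purely illustrative sketch of the kind of rule-based in silico experiment discussed here (and not any model from the Commentary), the following lattice simulation grows an avascular "spheroid" in which only cells on the rim, i.e. those with at least one free neighbouring site, can divide; the lattice size, division probability and step count are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 61                       # lattice size (illustrative)
grid = np.zeros((N, N), dtype=bool)
grid[N // 2, N // 2] = True  # seed cell at the centre

NEIGHBOURS = [(-1, 0), (1, 0), (0, -1), (0, 1)]

def step(grid, p_divide=0.5):
    """One round of rim-limited proliferation (Eden-type growth).

    Only cells with at least one empty neighbour (the proliferating rim)
    may divide; interior cells are effectively quiescent, which is enough
    to reproduce the outward expansion of a compact avascular spheroid.
    """
    new = grid.copy()
    for i, j in np.argwhere(grid):
        empty = [(i + di, j + dj) for di, dj in NEIGHBOURS
                 if 0 <= i + di < N and 0 <= j + dj < N and not grid[i + di, j + dj]]
        if empty and rng.random() < p_divide:
            ni, nj = empty[rng.integers(len(empty))]
            new[ni, nj] = True   # daughter cell placed in a free site
    return new

for t in range(30):
    grid = step(grid)

print("cells after 30 steps:", int(grid.sum()))
```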

Relevance: 20.00%

Abstract:

This paper uses finite element techniques to investigate the performance of buried tunnels subjected to surface blasts, incorporating fully coupled fluid–structure interaction and appropriate material models that simulate strain rate effects. Modelling techniques are first validated against existing experimental results and then used to treat blast-induced shock wave propagation and tunnel response in dry and saturated sands. Results show that the tunnel buried in saturated sand responds earlier than that in dry sand. Tunnel deformations decrease with distance from the explosive in both sands, as expected. In the vicinity of the explosive, the tunnel buried in saturated sand suffered permanent deformation in both the axial and circumferential directions, whereas the tunnel buried in dry sand recovered from most of the axial deformation. Overall, the response of the tunnel in saturated sand is more severe for a given blast event and shows the detrimental effect of pore water on the blast response of buried tunnels. The validated modelling techniques developed in this paper can be used to investigate the blast response of tunnels buried in dry and saturated sands.
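
The abstract does not specify which strain-rate-dependent material law was used; as one commonly assumed option, the Cowper–Symonds relation scales the quasi-static yield stress with strain rate, sketched below with textbook-style parameters for mild steel (illustrative values only, not the paper's calibration).

```python
def cowper_symonds(sigma_static, strain_rate, D=40.4, q=5.0):
    """Dynamic yield stress under the Cowper-Symonds strain-rate law.

    sigma_static : quasi-static yield stress (Pa)
    strain_rate  : equivalent plastic strain rate (1/s)
    D, q         : material constants (values commonly quoted for mild
                   steel; used here purely for illustration)
    """
    return sigma_static * (1.0 + (strain_rate / D) ** (1.0 / q))

# Example: enhancement of a 250 MPa yield stress at blast-like strain rates.
for rate in (1.0, 100.0, 10_000.0):
    print(f"{rate:>8.0f} 1/s -> {cowper_symonds(250e6, rate) / 1e6:.0f} MPa")
```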

Relevance: 20.00%

Abstract:

Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited, and this uncertainty is often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling statistical approach. The study results confirmed that WLSR, assuming probability-distributed data, provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly adopted in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, which are primarily influenced by surface characteristics and rainfall intensity.
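
A minimal numpy sketch of the OLSR/WLSR contrast for a wash-off-style relationship is given below; the synthetic intensity-load data and the inverse-variance weights are assumptions for illustration and do not reproduce the study's dataset or its Bayesian/Gibbs sampling implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic wash-off-style data (illustrative): pollutant load vs rainfall intensity,
# with measurement noise that grows at higher intensities (heteroscedastic).
intensity = np.linspace(5.0, 120.0, 40)          # mm/h, assumed range
noise_sd = 0.05 * intensity
load = 2.0 + 0.15 * intensity + rng.normal(0.0, noise_sd)

X = np.column_stack([np.ones_like(intensity), intensity])

# Ordinary least squares: all observations weighted equally.
beta_ols, *_ = np.linalg.lstsq(X, load, rcond=None)

# Weighted least squares: inverse-variance weights down-weight the noisier points.
W = np.diag(1.0 / noise_sd**2)
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ load)

print("OLS intercept/slope:", beta_ols)
print("WLS intercept/slope:", beta_wls)
```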

Relevance: 20.00%

Abstract:

This research looked at using the metaphor of biological evolution as a way of solving architectural design problems. Drawing from fields such as language grammars, algorithms and cellular biology, the thesis examined ways of encoding design information for processing. The aim of this work is to help build software that supports the architectural design process and allows designers to examine more design variations.
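
A toy version of the evolutionary loop implied here (encode a candidate design as a string of genes, then repeatedly select, recombine and mutate) is sketched below; the bit-string encoding and the placeholder fitness function are assumptions, not the thesis's design grammar.

```python
import random

random.seed(0)
GENES = "01"
LENGTH = 20    # a candidate "design" is a fixed-length bit string (placeholder encoding)

def fitness(genome: str) -> int:
    # Placeholder objective: count of 1s. A real design problem would score
    # floor-plan area, daylight, structural feasibility, and so on.
    return genome.count("1")

def mutate(genome: str, rate: float = 0.05) -> str:
    return "".join(random.choice(GENES) if random.random() < rate else g for g in genome)

def crossover(a: str, b: str) -> str:
    cut = random.randrange(1, LENGTH)
    return a[:cut] + b[cut:]

population = ["".join(random.choice(GENES) for _ in range(LENGTH)) for _ in range(30)]

for generation in range(40):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                       # truncation selection
    population = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                            for _ in range(20)]

print("best design:", max(population, key=fitness))
```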

Relevance: 20.00%

Abstract:

Electricity network investment and asset management require accurate estimation of future demand in energy consumption within specified service areas. For this purpose, simple models are typically developed to predict future trends in electricity consumption using various methods and assumptions. This paper presents a statistical model to predict electricity consumption in the residential sector at the Census Collection District (CCD) level over the state of New South Wales, Australia, based on spatial building and household characteristics. Residential household demographic and building data from the Australian Bureau of Statistics (ABS) and actual electricity consumption data from electricity companies are merged for 74% of the 12,000 CCDs in the state. Eighty percent of the merged dataset is randomly set aside to establish the model using regression analysis, and the remaining 20% is used to independently test the accuracy of model prediction against actual consumption. In 90% of cases, the predicted consumption is shown to be within 5 kWh per dwelling per day of actual values, with an overall state accuracy of -1.15%. Given a future scenario with a shift in climate zone and a growth in population, the model is used to identify the geographical or service areas that are most likely to have increased electricity consumption. Such geographical representation can be of great benefit when assessing alternatives to the centralised generation of energy; having such a model gives a quantifiable method for selecting the 'most' appropriate system when a review or upgrade of the network infrastructure is required.
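
The fit-then-hold-out workflow described above can be sketched roughly as follows; the predictor names, synthetic records and coefficients are assumptions for illustration, not the ABS or utility data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000  # pretend CCD-level records (synthetic)

# Assumed predictors: persons per dwelling, bedrooms per dwelling, share of detached houses.
X = np.column_stack([
    rng.normal(2.6, 0.5, n),
    rng.normal(3.1, 0.6, n),
    rng.uniform(0.2, 0.9, n),
])
kwh_per_dwelling = 4.0 + 2.5 * X[:, 0] + 1.8 * X[:, 1] + 3.0 * X[:, 2] + rng.normal(0, 1.5, n)

# 80/20 random split, mirroring the paper's model-building / validation design.
idx = rng.permutation(n)
train, test = idx[: int(0.8 * n)], idx[int(0.8 * n):]

A = np.column_stack([np.ones(n), X])              # design matrix with intercept
coef, *_ = np.linalg.lstsq(A[train], kwh_per_dwelling[train], rcond=None)

pred = A[test] @ coef
err = pred - kwh_per_dwelling[test]
print("share of test CCDs within 5 kWh/dwelling/day:", np.mean(np.abs(err) < 5.0))
```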

Relevance: 20.00%

Abstract:

Security models for two-party authenticated key exchange (AKE) protocols have developed over time to prove the security of AKE protocols even when the adversary learns certain secret values. In this work, we address more granular leakage: partial leakage of long-term secrets of protocol principals, even after the session key is established. We introduce a generic key exchange security model, which can be instantiated allowing bounded or continuous leakage, even when the adversary learns certain ephemeral secrets or session keys. Our model is the strongest known partial-leakage-based security model for key exchange protocols. We propose a generic construction of a two-pass leakage-resilient key exchange protocol that is secure in the proposed model, by introducing a new concept: the leakage-resilient NAXOS trick. We identify a special property for public-key cryptosystems: pair generation indistinguishability, and show how to obtain the leakage-resilient NAXOS trick from a pair generation indistinguishable leakage-resilient public-key cryptosystem.
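
For orientation, the original NAXOS trick (which the paper's leakage-resilient variant builds on) derives the ephemeral Diffie-Hellman exponent from a hash of both the ephemeral randomness and the long-term secret, so that leaking either value alone does not reveal the exponent. The sketch below uses toy group parameters and a plain hash-to-exponent purely for illustration; it is not the paper's construction.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters for illustration only (not a secure group choice).
P = 2**127 - 1
G = 3

def h2exp(*parts: bytes) -> int:
    """Hash arbitrary byte strings to an exponent modulo the group order (sketch)."""
    return int.from_bytes(hashlib.sha256(b"".join(parts)).digest(), "big") % (P - 1)

# Long-term key pair of one protocol principal.
lsk = secrets.randbelow(P - 1)
lpk = pow(G, lsk, P)

# Per-session ephemeral randomness.
esk = secrets.token_bytes(32)

# NAXOS trick: the exponent actually used on the wire is H(esk, lsk), so an
# adversary who learns esk alone (or lsk alone) still cannot compute it.
x = h2exp(esk, lsk.to_bytes(16, "big"))
ephemeral_public = pow(G, x, P)
print(hex(ephemeral_public)[:20], "...")
```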

Relevance: 20.00%

Abstract:

The use of graphical processing unit (GPU) parallel processing is becoming a part of mainstream statistical practice. The reliance of Bayesian statistics on Markov Chain Monte Carlo (MCMC) methods makes the applicability of parallel processing not immediately obvious. It is illustrated that computing the likelihood with GPU parallel processing yields substantial reductions in computational time for MCMC and other methods of evaluation. Examples use data from the Global Terrorism Database to model terrorist activity in Colombia from 2000 through 2010 and a likelihood based on the explicit convolution of two negative-binomial processes. Results show decreases in computational time by a factor of over 200. Factors influencing these improvements and guidelines for programming parallel implementations of the likelihood are discussed.
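
The likelihood referred to above involves the explicit convolution of two negative-binomial processes; a serial reference version of such a computation might look roughly like the sketch below (placeholder counts and parameters, not the Global Terrorism Database, and no GPU code), the point being that the inner convolution sum is exactly the kind of work that parallelises well.

```python
import numpy as np
from scipy.stats import nbinom

def convolved_nb_pmf(y, r1, p1, r2, p2):
    """P(Y1 + Y2 = y) for independent Y1 ~ NB(r1, p1) and Y2 ~ NB(r2, p2),
    evaluated by the explicit convolution sum (serial reference version)."""
    k = np.arange(y + 1)
    return np.sum(nbinom.pmf(k, r1, p1) * nbinom.pmf(y - k, r2, p2))

def log_likelihood(counts, r1, p1, r2, p2):
    return float(np.sum([np.log(convolved_nb_pmf(y, r1, p1, r2, p2)) for y in counts]))

# Placeholder weekly incident counts and parameters, for illustration only.
counts = [3, 0, 2, 5, 1, 4, 2]
print(log_likelihood(counts, r1=2.0, p1=0.4, r2=1.5, p2=0.6))
```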

Relevance: 20.00%

Abstract:

Insulated rail joints are critical for train safety as they control electrical signalling systems; unfortunately they exhibit excessive ratchetting of the railhead near the endpost insulators. This paper reports a three-dimensional global model of these joints under wheel–rail contact pressure loading and a sub-model examining the ratchetting failures of the railhead. The sub-model employs a non-linear isotropic–kinematic elastic–plastic material model and predicts stress/strain levels in the localised railhead zone adjacent to the endpost which is placed in the air gap between the two rail ends at the insulated rail joint. The equivalent plastic strain plot is utilised to capture the progressive railhead damage adequately. Associated field and laboratory testing results of damage to the railhead material suggest that the simulation results are reasonable.
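
As a rough one-dimensional illustration of the ratchetting behaviour referred to above (not the paper's three-dimensional sub-model), the explicitly integrated sketch below uses a simplified, kinematic-hardening-only version of such a material law (Armstrong-Frederick rule, no isotropic component) and cycles the stress non-symmetrically, so that plastic strain accumulates cycle by cycle; all material parameters and load levels are assumed, illustrative values.

```python
import numpy as np

# 1D elastic-plastic model with Armstrong-Frederick kinematic hardening,
# integrated explicitly with a coarse stress step over a non-symmetric cycle.
E, sigma_y = 210e3, 500.0        # MPa: Young's modulus, yield stress (illustrative)
C, gamma = 30e3, 60.0            # AF hardening modulus and recall term (illustrative)

def run_cycles(n_cycles=20, s_max=650.0, s_min=-450.0, d_sigma=1.0):
    sigma, X, eps_p = 0.0, 0.0, 0.0   # stress, backstress, plastic strain
    peaks = []
    targets = [s_max if i % 2 == 0 else s_min for i in range(2 * n_cycles)]
    for target in targets:
        step = d_sigma if target > sigma else -d_sigma
        while (sigma - target) * step < 0:
            sigma_new = sigma + step
            s = np.sign(sigma_new - X)
            if abs(sigma_new - X) > sigma_y and s * step > 0:
                # Plastic loading: consistency gives the plastic multiplier
                # for this stress increment; backstress follows the AF rule.
                dlam = s * step / (C - gamma * X * s)
                eps_p += dlam * s
                X += C * dlam * s - gamma * X * dlam
            sigma = sigma_new
        peaks.append(eps_p)
    return peaks

peaks = run_cycles()
# Plastic strain at successive tensile peaks grows cycle by cycle: ratchetting.
print([round(p, 4) for p in peaks[::2][:5]])
```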

Relevance: 20.00%

Abstract:

Insulated rail joints (IRJs) are a primary component of rail track safety and signalling systems. Rails are supported by two fishplates, which are fastened by bolts and nuts and, with the support of sleepers and track ballast, form an integrated assembly. IRJ failure can result from progressive defects, the propagation of which is influenced by residual stresses in the rail. Residual stresses change significantly during service due to the complex deformation and damage effects associated with wheel rolling, sliding and impact. IRJ failures can occur when metal flows over the insulated rail gap (typically 6–8 mm wide), breaking the electrical isolation between track sections and causing the track signalling system to malfunction. In this investigation, residual stress measurements were obtained from rail ends which had undergone controlled amounts of surface plastic deformation using a full-scale wheel-on-track simulation test rig. Results were compared with those obtained from similar investigations performed on rail ends associated with ex-service IRJs. Residual stresses were measured by neutron diffraction at the Australian Nuclear Science and Technology Organisation (ANSTO). Measurements with a constant gauge volume of 3 × 3 × 3 mm³ were carried out in the central vertical plane on 5 mm thick rail slices cut by electric discharge machining (EDM). Stress evolution at the rail ends was found to exhibit characteristics similar to that of the ex-service rails, with a compressive zone about 5 mm deep counterbalanced by a tension zone beneath, extending to a depth of around 15 mm. However, the form of the stress distribution in the test-rig-deformed samples differed from that of the ex-service rails, apparently due to the localisation of load under the particular test conditions. In the test-rig samples, in contrast to the clear stress evolution, there was no obvious evolution of the stress-free lattice spacing d0. Since d0 reflects the long-term accumulation of crystal lattice damage and microstructural change under service load, the loading history of the test-rig samples had evidently not reached the level of the ex-service rails. It is concluded that the wheel-on-rail simulation rig provides the potential capability for testing wheel-rail rolling contact conditions in rails, rail ends and insulated rail joints.
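
For reference, converting measured lattice spacings into residual stresses follows the standard neutron-diffraction relations: lattice strain from the shift relative to d0 in three orthogonal directions, then the triaxial form of Hooke's law. The sketch below uses made-up spacings and generic steel elastic constants, not the ANSTO measurements.

```python
import numpy as np

def residual_stress(d, d0, E=210e9, nu=0.30):
    """Residual stress components from lattice spacings measured in three
    orthogonal directions (e.g. longitudinal, transverse, normal).

    d  : three measured d-spacings for the same hkl reflection
    d0 : stress-free reference spacing
    E, nu : diffraction elastic constants (generic steel values, illustrative)
    """
    eps = (np.asarray(d) - d0) / d0                              # lattice strains
    trace = eps.sum()
    return E / (1 + nu) * (eps + nu / (1 - 2 * nu) * trace)      # stresses in Pa

# Made-up example spacings (angstroms) for a ferritic steel reflection.
d_measured = [1.17085, 1.17060, 1.17035]
d_reference = 1.17050
print(residual_stress(d_measured, d_reference) / 1e6, "MPa")
```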

Relevance: 20.00%

Abstract:

For clinical use, it is important in electrocardiogram (ECG) signal analysis to detect not only the centre of the P wave, the QRS complex and the T wave, but also time intervals such as the ST segment. Much research has focused entirely on QRS complex detection, via methods such as wavelet transforms, spline fitting and neural networks. However, drawbacks include the false classification of a severe noise spike as a QRS complex, possibly requiring manual editing, or the omission of information contained in other regions of the ECG signal. While some attempts have been made to develop algorithms that detect additional signal characteristics, such as P and T waves, the reported success rates vary from person to person and from beat to beat. To address this variability we propose the use of Markov chain Monte Carlo statistical modelling to extract the key features of an ECG signal, and we report on a feasibility study investigating the utility of the approach. The modelling approach is examined with reference to a realistic computer-generated ECG signal, where details such as wave morphology and noise levels are variable.
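
A small feasibility-style sketch in the same spirit is shown below: a synthetic beat is built from Gaussian P, QRS and T waves, noise is added, and a random-walk Metropolis sampler recovers one feature (the T-wave amplitude). All wave parameters, the noise level and the flat prior are illustrative assumptions rather than the study's model.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0.0, 1.0, 500)                    # one synthetic beat, arbitrary time units

def gauss(t, amp, mu, sd):
    return amp * np.exp(-0.5 * ((t - mu) / sd) ** 2)

# Synthetic beat: P wave, QRS complex and T wave modelled as Gaussian bumps
# (illustrative morphology, not a clinical template).
def beat(t_amp):
    return gauss(t, 0.15, 0.25, 0.02) + gauss(t, 1.0, 0.50, 0.01) + gauss(t, t_amp, 0.75, 0.04)

true_t_amp, noise_sd = 0.30, 0.05
signal = beat(true_t_amp) + rng.normal(0.0, noise_sd, t.size)

def log_post(t_amp):
    """Log-posterior for the T-wave amplitude: Gaussian likelihood, flat prior on [0, 1].
    Wave centres and widths are treated as known for brevity."""
    if not 0.0 <= t_amp <= 1.0:
        return -np.inf
    resid = signal - beat(t_amp)
    return -0.5 * np.sum(resid**2) / noise_sd**2

# Random-walk Metropolis sampler over the single unknown.
amp, lp, samples = 0.05, log_post(0.05), []
for _ in range(5000):
    prop = amp + rng.normal(0.0, 0.02)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:       # Metropolis accept/reject
        amp, lp = prop, lp_prop
    samples.append(amp)

post = np.array(samples[1000:])                   # discard burn-in
print(f"T-wave amplitude: {post.mean():.3f} +/- {post.std():.3f} (true {true_t_amp})")
```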

Relevance: 20.00%

Abstract:

Spatially explicit modelling of grassland classes is important for site-specific planning to improve grassland and environmental management over large areas. In this study, a climate-based grassland classification model, the Comprehensive and Sequential Classification System (CSCS), was integrated with spatially interpolated climate data to classify grassland in Gansu province, China. The study area is characterized by complex topographic features imposed by plateaus, high mountains, basins and deserts. To improve the quality of the interpolated climate data and of the spatial classification over this complex topography, three linear regression methods for interpolating climate variables were evaluated: an analytic method based on multiple regression and residues (AMMRR); a modification of AMMRR that adds the effects of slope and aspect to the interpolation analysis (M-AMMRR); and a method that replaces the IDW approach for residue interpolation in M-AMMRR with an ordinary kriging approach (I-AMMRR). The interpolation outcomes from the best method were then used in the CSCS model to classify the grassland in the study area. The interpolated climate variables were annual cumulative temperature and annual total precipitation. The results indicated that the AMMRR and M-AMMRR methods generated acceptable climate surfaces, but the best model fit and cross-validation results were achieved by the I-AMMRR method. Twenty-six grassland classes were identified for the study area. The four grassland vegetation classes that covered more than half of the total study area were "cool temperate-arid temperate zonal semi-desert", "cool temperate-humid forest steppe and deciduous broad-leaved forest", "temperate-extra-arid temperate zonal desert", and "frigid per-humid rain tundra and alpine meadow". The vegetation classification map generated in this study provides spatial information on the locations and extents of the different grassland classes. This information can be used to facilitate government agencies' decision-making in land-use planning and environmental management, and for vegetation and biodiversity conservation. It can also assist land managers in estimating safe carrying capacities, which will help to prevent overgrazing and land degradation.
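
The common idea behind these interpolation schemes (fit the climate variable against location and terrain covariates, then spatially interpolate the regression residuals and add them back to the trend) can be sketched as below; the covariates, the inverse-distance weighting and the synthetic station data are assumptions and do not reproduce AMMRR, M-AMMRR or I-AMMRR.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic climate stations: longitude, latitude, elevation (km) and annual precipitation (mm).
n = 120
lon, lat, elev = rng.uniform(93, 109, n), rng.uniform(32, 43, n), rng.uniform(0.5, 4.0, n)
precip = 800 - 120 * elev + 25 * (lat - 32) + rng.normal(0, 40, n)   # assumed relationship

# Step 1: multiple linear regression of the climate variable on the covariates.
X = np.column_stack([np.ones(n), lon, lat, elev])
coef, *_ = np.linalg.lstsq(X, precip, rcond=None)
residuals = precip - X @ coef

# Step 2: interpolate the regression residuals to a target point by inverse distance
# weighting, then add the interpolated residual back onto the regression trend.
def predict(lon0, lat0, elev0, power=2.0):
    d = np.hypot(lon - lon0, lat - lat0) + 1e-9
    w = 1.0 / d**power
    resid0 = np.sum(w * residuals) / np.sum(w)
    trend0 = np.array([1.0, lon0, lat0, elev0]) @ coef
    return trend0 + resid0

print(f"predicted annual precipitation: {predict(100.0, 36.0, 1.8):.0f} mm")
```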

Relevance: 20.00%

Abstract:

This chapter addresses data modelling as a means of promoting statistical literacy in the early grades. Consideration is first given to the importance of increasing young children’s exposure to statistical reasoning experiences and to how data modelling can be a rich means of doing so. Selected components of data modelling are then reviewed, followed by a report on some findings from the third year of a three-year longitudinal study spanning grades one through three.