315 results for Data-driven modelling
Abstract:
During the evolution of the music industry, developments in the media environment have required music firms to adapt in order to survive. Changes in broadcast radio programming during the 1950s, the Compact Cassette during the 1970s, and the deregulation of media ownership during the 1990s are all examples of changes that have heavily affected the music industry. This study explores similar contemporary dynamics, examines how decision makers in the music industry perceive and make sense of these developments, and reveals how they revise their business strategies based on their mental models of the media environment. A qualitative system dynamics model is developed to support the reasoning brought forward by the study. The model is empirically grounded, but is also based on previous music industry research and a theoretical platform constituted by concepts from evolutionary economics and the sociology of culture. The empirical data primarily consist of 36 personal interviews with decision makers in the American, British and Swedish music industry ecosystems. The study argues that the proposed model explains contemporary music industry dynamics more effectively than the music industry models presented by previous research. Supported by the model, the study shows how “new” media outlets make old music business models obsolete and challenge the industry’s traditional power structures. It is no longer possible to expose music at one outlet (usually broadcast radio) in the hope that this exposure will lead to sales of the same music at another (e.g. as a compact disc). The study shows that many music industry decision makers still have not embraced the new logic and have not yet challenged their traditional mental models of the media environment. Rather, they remain focused on preserving the pivotal role held by the CD and other physical distribution technologies. Further, the study shows that while many music firms remain attached to the old models, other firms, primarily music publishers, have accepted the transformation and have reluctantly recognised the realities of a virtualised environment.
Abstract:
In this study, the mixed convection heat transfer and fluid flow behaviors in a lid-driven square cavity filled with a high Prandtl number fluid (Pr = 5400, ν = 1.2×10⁻⁴ m²/s, where ν is the kinematic viscosity of the fluid) at low Reynolds number are studied using the thermal Lattice Boltzmann method (TLBM). The LBM is built on the D2Q9 lattice and the single-relaxation-time method known as the Lattice-BGK (Bhatnagar-Gross-Krook) model. The effects of varying the non-dimensional mixed convection parameter, the Richardson number (Ri), with and without a heat-generating source, on the thermal and flow behavior of the fluid inside the cavity are investigated. The results are presented as velocity and temperature profiles as well as stream function and temperature contours for Ri ranging from 0.1 to 5.0, together with the other controlling parameters considered in this study. It is found that the LBM has good potential to simulate mixed convection heat transfer and fluid flow problems. Finally, the simulation results are compared with previous numerical and experimental results and are found to be in good agreement.
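To make the method concrete, below is a minimal sketch of a D2Q9 lattice-BGK solver for an isothermal lid-driven cavity. The paper's TLBM carries a second distribution function for the temperature field, which is omitted here; the grid size, relaxation time and lid speed are illustrative assumptions, not the study's values.

```python
# Minimal isothermal D2Q9 lattice-BGK sketch of a lid-driven cavity.
# Full-way bounce-back on the walls, with the standard momentum
# correction on the moving lid. Parameters are assumed, not the paper's.
import numpy as np

N, tau, U, steps = 64, 0.6, 0.1, 4000   # lattice size, relaxation time, lid speed
# kinematic viscosity in lattice units: nu = (tau - 0.5)/3

c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],
              [1,1],[-1,1],[-1,-1],[1,-1]])        # D2Q9 velocities (cx, cy)
w = np.array([4/9] + [1/9]*4 + [1/36]*4)           # lattice weights
opp = np.array([0,3,4,1,2,7,8,5,6])                # opposite directions

solid = np.zeros((N, N), bool)                     # index order: [y, x]
solid[[0, -1], :] = True                           # bottom and top walls
solid[:, [0, -1]] = True                           # left and right walls
lid = np.zeros((N, N), bool)
lid[-1, 1:-1] = True                               # top row is the moving lid

def equilibrium(rho, ux, uy):
    """BGK equilibrium distribution for all nine directions."""
    cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy
    return w[:, None, None]*rho*(1 + 3*cu + 4.5*cu**2 - 1.5*(ux**2 + uy**2))

f = equilibrium(np.ones((N, N)), np.zeros((N, N)), np.zeros((N, N)))

for _ in range(steps):
    rho = f.sum(0)
    ux = (f*c[:, 0, None, None]).sum(0)/rho
    uy = (f*c[:, 1, None, None]).sum(0)/rho
    # single-relaxation-time (BGK) collision at fluid nodes
    feq = equilibrium(rho, ux, uy)
    f[:, ~solid] -= (f[:, ~solid] - feq[:, ~solid])/tau
    # full-way bounce-back at wall nodes, plus moving-lid correction
    f[:, solid] = f[opp][:, solid]
    for i in range(9):
        f[i, lid] += 6*w[i]*c[i, 0]*U              # wall density ~ 1 assumed
    # streaming: shift each population one node along its lattice velocity
    for i in range(9):
        f[i] = np.roll(f[i], (c[i, 1], c[i, 0]), axis=(0, 1))

print("max |u| in cavity:", np.hypot(ux, uy)[~solid].max())
```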
Abstract:
Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited, and such uncertainty is often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining the epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling statistical approach. The study results confirmed that WLSR, assuming probability-distributed data, provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly adopted in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, primarily influenced by surface characteristics and rainfall intensity.
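As an illustration of the statistical machinery, the following is a minimal sketch of Bayesian weighted least squares regression estimated by Gibbs sampling with conjugate updates. The toy wash-off data, the flat prior on the coefficients and the inverse-gamma prior on the error variance are assumptions made here for illustration, not the paper's model specification.

```python
# Bayesian weighted least squares via Gibbs sampling: a minimal sketch.
# Model: y = X b + e, e ~ N(0, sigma^2 W^{-1}) with known weights W.
import numpy as np

rng = np.random.default_rng(0)

# toy data: log wash-off vs. log rainfall intensity (hypothetical)
n = 50
x = rng.uniform(0, 3, n)
X = np.column_stack([np.ones(n), x])        # design matrix [1, x]
y = 0.5 + 1.2*x + rng.normal(0, 0.3, n)

wts = np.full(n, 1.0)   # observation weights; WLSR would set these from data quality
W = np.diag(wts)
XtWX, XtWy = X.T @ W @ X, X.T @ W @ y
a0, b0 = 1.0, 1.0       # inverse-gamma prior on sigma^2 (assumed, vague-ish)

draws, beta_s = 2000, []
sig2 = 1.0
for _ in range(draws):
    # beta | sigma^2, y ~ Normal (flat prior on beta assumed)
    mean = np.linalg.solve(XtWX, XtWy)
    cov = sig2 * np.linalg.inv(XtWX)
    beta = rng.multivariate_normal(mean, cov)
    # sigma^2 | beta, y ~ Inverse-Gamma
    resid = y - X @ beta
    sig2 = 1.0 / rng.gamma(a0 + n/2, 1.0/(b0 + 0.5*resid @ W @ resid))
    beta_s.append(beta)

beta_s = np.array(beta_s[500:])             # drop burn-in
print("posterior mean of coefficients:", beta_s.mean(0))
print("95% interval for slope:", np.percentile(beta_s[:, 1], [2.5, 97.5]))
```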
Abstract:
Electricity network investment and asset management require accurate estimation of future demand in energy consumption within specified service areas. For this purpose, simple models are typically developed to predict future trends in electricity consumption using various methods and assumptions. This paper presents a statistical model to predict electricity consumption in the residential sector at the Census Collection District (CCD) level in the state of New South Wales, Australia, based on spatial building and household characteristics. Residential household demographic and building data from the Australian Bureau of Statistics (ABS) and actual electricity consumption data from electricity companies are merged for 74% of the 12,000 CCDs in the state. Eighty percent of the merged dataset is randomly set aside to establish the model using regression analysis, and the remaining 20% is used to independently test the accuracy of model predictions against actual consumption. In 90% of the cases, the predicted consumption is shown to be within 5 kWh per dwelling per day of actual values, with an overall state-level deviation of -1.15%. Given a future scenario with a shift in climate zone and a growth in population, the model is used to identify the geographical or service areas that are most likely to see increased electricity consumption. Such geographical representation can be of great benefit when assessing alternatives to the centralised generation of energy; having such a model provides a quantifiable method for selecting the most appropriate system when a review or upgrade of the network infrastructure is required.
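A minimal sketch of the 80/20 fit-and-validate workflow described above follows, using scikit-learn on synthetic data. The predictor names and the data are hypothetical stand-ins; the study derives its features from ABS census data merged with utility consumption records.

```python
# 80/20 split, regression fit, and out-of-sample accuracy check: a sketch.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 8000                                     # roughly the merged CCD count

# hypothetical CCD-level predictors (assumed, for illustration only)
Xf = np.column_stack([
    rng.normal(2.6, 0.4, n),                 # mean household size
    rng.normal(1.0, 0.2, n),                 # detached-dwelling share proxy
    rng.normal(0.0, 1.0, n),                 # income index
])
kwh = 18 + 3.0*Xf[:, 0] + 2.0*Xf[:, 1] + 1.5*Xf[:, 2] + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(Xf, kwh, test_size=0.2, random_state=42)
model = LinearRegression().fit(X_tr, y_tr)   # establish model on 80%
pred = model.predict(X_te)                   # independently test on 20%

within5 = np.mean(np.abs(pred - y_te) < 5) * 100
bias = (pred.sum() - y_te.sum()) / y_te.sum() * 100
print(f"share of CCDs within 5 kWh/dwelling/day: {within5:.1f}%")
print(f"aggregate deviation: {bias:+.2f}%")
```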
Abstract:
The use of graphical processing unit (GPU) parallel processing is becoming a part of mainstream statistical practice. The reliance of Bayesian statistics on Markov Chain Monte Carlo (MCMC) methods makes the applicability of parallel processing not immediately obvious. It is illustrated that substantial gains in computational time can be achieved for MCMC and other methods of evaluation by computing the likelihood using GPU parallel processing. The examples use data from the Global Terrorism Database to model terrorist activity in Colombia from 2000 through 2010, with a likelihood based on the explicit convolution of two negative-binomial processes. Results show decreases in computational time by a factor of over 200. Factors influencing these improvements and guidelines for programming parallel implementations of the likelihood are discussed.
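The structure of such a likelihood is sketched below: the probability mass function of the sum of two independent negative-binomial counts is obtained by explicit convolution, vectorised over the support. NumPy is used here; it is this vectorised inner sum that maps naturally onto GPU parallelism (for instance via CuPy's NumPy-compatible arrays, with the pmf terms evaluated by device kernels). The parameters and toy counts are illustrative, not the Global Terrorism Database series.

```python
# Likelihood from the explicit convolution of two negative-binomial
# processes: a vectorised sketch of the computation described above.
import numpy as np
from scipy.stats import nbinom

def convolved_nb_loglik(y, r1, p1, r2, p2):
    """Log-likelihood of counts y under Y = Y1 + Y2,
    Y1 ~ NB(r1, p1), Y2 ~ NB(r2, p2), via explicit convolution."""
    y = np.asarray(y)
    kmax = y.max()
    k = np.arange(kmax + 1)                       # support of the inner sum
    pmf1 = nbinom.pmf(k, r1, p1)                  # P(Y1 = k)
    pmf2 = nbinom.pmf(k, r2, p2)                  # P(Y2 = k)
    conv = np.convolve(pmf1, pmf2)[: kmax + 1]    # P(Y1 + Y2 = n), n <= kmax
    return np.log(conv[y]).sum()

# toy count series (hypothetical, not the GTD data)
rng = np.random.default_rng(2)
y = rng.negative_binomial(3, 0.4, 500) + rng.negative_binomial(2, 0.6, 500)
print(convolved_nb_loglik(y, 3, 0.4, 2, 0.6))
```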
Abstract:
Spatially explicit modelling of grassland classes is important to site-specific planning for improving grassland and environmental management over large areas. In this study, a climate-based grassland classification model, the Comprehensive and Sequential Classification System (CSCS), was integrated with spatially interpolated climate data to classify grassland in Gansu province, China. The study area is characterized by complex topographic features imposed by plateaus, high mountains, basins and deserts. To improve the quality of the interpolated climate data and of the spatial classification over this complex topography, three regression-based methods for interpolating climate variables were evaluated: an analytic method based on multiple regression and residues (AMMRR); a modification of AMMRR that adds the effects of slope and aspect to the interpolation analysis (M-AMMRR); and a variant that replaces the IDW approach for residue interpolation in M-AMMRR with ordinary kriging (I-AMMRR). The outcomes from the best interpolation method were then used in the CSCS model to classify the grassland in the study area. The interpolated climate variables were the annual cumulative temperature and the annual total precipitation. The results indicated that the AMMRR and M-AMMRR methods generated acceptable climate surfaces, but the best model fit and cross-validation results were achieved by I-AMMRR. Twenty-six grassland classes were identified for the study area. The four grassland vegetation classes that together covered more than half of the total study area were "cool temperate-arid temperate zonal semi-desert", "cool temperate-humid forest steppe and deciduous broad-leaved forest", "temperate-extra-arid temperate zonal desert", and "frigid per-humid rain tundra and alpine meadow". The vegetation classification map generated in this study provides spatial information on the locations and extents of the different grassland classes. This information can be used to facilitate government agencies' decision-making in land-use planning and environmental management, and for vegetation and biodiversity conservation. It can also assist land managers in estimating safe carrying capacities, which will help to prevent overgrazing and land degradation.
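The shared pattern behind the AMMRR variants can be sketched as regression-plus-residue interpolation: regress the climate variable on location and terrain covariates, interpolate the regression residues across the grid, and sum the two surfaces. The sketch below uses IDW for the residue step, as in AMMRR/M-AMMRR; I-AMMRR would substitute ordinary kriging. The station data and covariates are synthetic placeholders.

```python
# Regression-plus-residue interpolation of a climate surface: a sketch.
import numpy as np

rng = np.random.default_rng(3)

# synthetic "stations": lon, lat, elevation -> annual precipitation
m = 120
lon, lat = rng.uniform(0, 10, m), rng.uniform(0, 10, m)
elev = rng.uniform(1000, 4000, m)
precip = 600 - 0.08*elev + 15*lat + rng.normal(0, 25, m)

# step 1: multiple linear regression on the covariates
A = np.column_stack([np.ones(m), lon, lat, elev])
coef, *_ = np.linalg.lstsq(A, precip, rcond=None)
resid = precip - A @ coef

# step 2: interpolate the regression residues (IDW here; kriging in I-AMMRR)
def idw(px, py, sx, sy, sval, power=2.0, eps=1e-9):
    d = np.hypot(px[:, None] - sx, py[:, None] - sy) + eps
    wgt = 1.0 / d**power
    return (wgt * sval).sum(1) / wgt.sum(1)

# step 3: predict trend + residue surface on a grid
gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
g_elev = 2500 + 100*np.sin(gx)*np.cos(gy)          # placeholder terrain
trend = coef[0] + coef[1]*gx.ravel() + coef[2]*gy.ravel() + coef[3]*g_elev.ravel()
surface = trend + idw(gx.ravel(), gy.ravel(), lon, lat, resid)
print("interpolated precipitation range:", surface.min(), surface.max())
```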
Abstract:
Management of the industrial nations' hazardous waste is a pressing and rapidly escalating global threat. Improved environmental information must be obtained and managed concerning the current status, temporal dynamics and potential future status of these critical sites. To test the application of spatial environmental techniques to the problem of hazardous waste sites, a Superfund (CERCLA) test site was chosen in an industrial/urban valley experiencing severe TCE, PCE and CTC groundwater contamination. A paradigm is presented for investigating the spatial/environmental tools available for mapping, monitoring and modelling the environment and its toxic contaminant plumes. This model incorporates a range of technical issues concerning the collection of data augmented by remote sensing tools, the formatting and storage of data using geographic information systems, and the analysis and modelling of the environment through advanced GIS analysis algorithms and geophysical models of hydrologic transport, including statistical surface generation. This spatially based approach is evaluated against current government/industry standards of operation. Advantages of the spatial approach and lessons learned are discussed.
Abstract:
Background: Many studies have found associations between climatic conditions and dengue transmission. However, there is a debate about the future impacts of climate change on dengue transmission. This paper reviews the epidemiological evidence on the relationship between climate and dengue, with a focus on quantitative methods for assessing the potential impacts of climate change on global dengue transmission. Methods: A literature search was conducted in October 2012 using the electronic databases PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search focused on peer-reviewed journal articles published in English from January 1991 through October 2012. Results: Sixteen studies met the inclusion criteria, and most showed that the transmission of dengue is highly sensitive to climatic conditions, especially temperature, rainfall and relative humidity. Studies on the potential impacts of climate change on dengue indicate increased climatic suitability for transmission and an expansion of the geographic regions at risk during this century. A variety of quantitative modelling approaches were used in the studies, and several key methodological issues and current knowledge gaps were identified through this review. Conclusions: It is important to assemble spatio-temporal patterns of dengue transmission that are compatible with long-term data on climate and other socio-ecological changes, as this would advance projections of the dengue risks associated with climate change. Keywords: Climate; Dengue; Models; Projection; Scenarios
Abstract:
Chronic leg ulcers are costly for health service providers to manage. Although evidence-based care leads to improved healing rates and reduced costs, a significant evidence-practice gap is known to exist, and lack of access to specialist skills in wound care is one suggested reason for this gap. The aim of this study was to model the change in total costs and health outcomes under two forms of health service delivery for patients with leg ulcers: routine health services for community-living patients, and care provided by specialist wound clinics. Mean weekly treatment and health service costs were estimated from participants' data (n=70) for the twelve months prior to their entry to a study specialist wound clinic, and prospectively for 24 weeks after entry. For the retrospective phase, mean weekly costs of care were AU$130.30 (SD AU$12.64); these fell to AU$53.32 (SD AU$6.47) in the prospective phase. Analysis at a population level suggests that if 10,000 individuals receive 12 weeks of specialist evidence-based care, the cost savings are likely to be AU$9,238,800. Significant savings could therefore be made by adopting evidence-based care such as that provided by the community and outpatient specialist wound clinics in this study.
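As a rough consistency check on the population-level figure (assuming the per-person saving scales linearly):

```
weekly saving per person:  AU$130.30 − AU$53.32 = AU$76.98
over 12 weeks:             AU$76.98 × 12        = AU$923.76
for 10,000 individuals:    AU$923.76 × 10,000   ≈ AU$9.24 million
```

The small difference from the reported AU$9,238,800 presumably reflects the use of unrounded weekly means in the study's own calculation.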
Abstract:
The motion response of marine structures in waves can be studied using finite-dimensional, linear time-invariant approximating models. These models, obtained through system identification applied to data computed by hydrodynamic codes, find application in offshore training simulators, hardware-in-the-loop simulators for positioning control testing, and initial designs of wave-energy conversion devices. Different proposals have appeared in the literature to address the identification problem in both the time and frequency domains, and recent work has highlighted the superiority of the frequency-domain methods. This paper summarises practical frequency-domain estimation algorithms that use constraints on model structure and parameters to refine the search for approximating parametric models. Practical issues associated with the identification are discussed, including the influence of radiation model accuracy on force-to-motion models, which are usually the ultimate modelling objective. The illustrative examples in the paper are obtained using a freely available MATLAB toolbox developed by the authors, which implements the estimation algorithms described.
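As an indication of what frequency-domain estimation involves, the sketch below fits a rational transfer function to frequency-response data using Levy's linearised least squares, a generic textbook method. It does not implement the constrained, parameterised algorithms of the authors' toolbox; the model orders and the toy response are assumptions.

```python
# Frequency-domain fit of B(s)/A(s) to frequency-response data
# (Levy's linearised least squares): a minimal sketch.
import numpy as np

def levy_fit(w, H, n, m):
    """Fit B(s)/A(s), deg B = m, deg A = n (monic), to response H at
    frequencies w, by linear least squares on B(jw) - H*A(jw) = H*(jw)^n."""
    s = 1j * w
    cols = [s**j for j in range(m + 1)]            # columns for b_0 .. b_m
    cols += [-H * s**i for i in range(n)]          # columns for a_0 .. a_{n-1}
    M = np.column_stack(cols)
    rhs = H * s**n
    # stack real and imaginary parts for a real-valued least squares
    Mr = np.vstack([M.real, M.imag])
    rr = np.concatenate([rhs.real, rhs.imag])
    theta, *_ = np.linalg.lstsq(Mr, rr, rcond=None)
    b = theta[: m + 1][::-1]                       # highest order first
    a = np.append(theta[m + 1:], 1.0)[::-1]
    return b, a

# toy target: H(s) = (2s + 5) / (s^2 + 0.8s + 4) on a frequency grid
w = np.linspace(0.1, 10, 200)
H = (2*1j*w + 5) / ((1j*w)**2 + 0.8*1j*w + 4)
b, a = levy_fit(w, H, n=2, m=1)
print("numerator:", b)       # ~ [2, 5]
print("denominator:", a)     # ~ [1, 0.8, 4]
```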
Abstract:
A method is proposed to describe force and compound muscle action potential (CMAP) trace data collected in an electromyography study for motor unit number estimation (MUNE). Experimental data were collected using incremental stimulation at multiple stimulus durations. However, the stimulus information, which is vital for alternative MUNE methods, is not comparable across multiple-duration data, and therefore previous methods of MUNE (Ridall et al., 2006, 2007) cannot be used with any reliability. Hypothesised firing combinations of motor units are modelled using a multiplicative factor and a Bayesian P-spline formulation. The model describes the process for force and CMAP in a meaningful way.
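The P-spline ingredient can be sketched as a B-spline basis with a difference penalty on its coefficients; the penalised least squares fit below corresponds to the posterior mode of a Bayesian P-spline with a Gaussian smoothness prior. The trace, knot layout and smoothing parameter are illustrative, not the study's.

```python
# P-spline smoothing of a CMAP-like trace: B-spline basis plus a
# second-order difference penalty (Eilers-Marx style). A sketch only.
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 200)                          # time within a trace
trace = np.sin(8*t)*np.exp(-3*t) + rng.normal(0, 0.05, t.size)  # toy signal

deg, n_inner = 3, 20
knots = np.concatenate([np.zeros(deg), np.linspace(0, 1, n_inner), np.ones(deg)])
nb = len(knots) - deg - 1
B = BSpline(knots, np.eye(nb), deg)(t)              # basis matrix (200 x nb)

D = np.diff(np.eye(nb), n=2, axis=0)                # 2nd-order difference penalty
lam = 1.0                                           # smoothing parameter (assumed)
alpha = np.linalg.solve(B.T @ B + lam*D.T @ D, B.T @ trace)
smooth = B @ alpha                                  # penalised (posterior-mode) fit
print("residual sd:", (trace - smooth).std())
```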
Abstract:
In this paper, we explore how BIM functionalities, together with novel management concepts and methods, have been utilized in thirteen hospital projects in the United States and the United Kingdom. The study is based on secondary data collection and analysis. Initial findings indicate that the utilization of BIM enables a holistic view of project delivery and helps to integrate project parties into a collaborative process. The initiative to implement BIM must come from the top down to enable the early involvement of all key stakeholders. It appears that resistance from people to adapting to a new way of working and thinking, rather than the immaturity of the technology, is what hinders the utilization of BIM.
Abstract:
Health care systems are highly dynamic, not just because of developments and innovations in diagnosis and treatment, but also by virtue of emerging management techniques supported by modern information and communication technology. A multitude of stakeholders, such as patients, nurses, general practitioners or social carers, can be integrated by modelling the complex interactions necessary for managing the provision and consumption of health care services. Furthermore, Service-oriented Architecture (SOA) supports these integration efforts by enabling the flexible and reusable composition of autonomous, loosely coupled and web-enabled software components. However, a gap remains between SOA and predominantly business-oriented perspectives (e.g. business process models). The alignment of the two views is crucial, not just for the guided development of SOA but also for the sustainable evolution of holistic enterprise architectures. In this paper, we combine the Semantic Object Model (SOM) and the Business Process Modelling Notation (BPMN) into a model-driven approach to service engineering. By addressing a business system in Home Telecare and deriving a business process model that can eventually be controlled and executed by machines, in particular by composed web services, the full potential of a process-centric SOA is exploited.
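To make the notion of a machine-executable process model concrete, a toy sketch follows in which a derived Home Telecare process is an ordered composition of loosely coupled service stand-ins run by a trivial engine. The service names and steps are hypothetical; a real deployment would bind each step to a web service through a BPMN or BPEL execution engine rather than to local functions.

```python
# A toy "process engine" composing loosely coupled service stand-ins,
# in the spirit of a process-centric SOA. All names are hypothetical.
from typing import Callable

def collect_vital_signs(ctx: dict) -> dict:
    ctx["vitals"] = {"pulse": 72, "bp": "120/80"}   # placeholder reading
    return ctx

def assess_readings(ctx: dict) -> dict:
    ctx["alert"] = ctx["vitals"]["pulse"] > 100     # trivial rule, assumed
    return ctx

def notify_carer(ctx: dict) -> dict:
    if ctx["alert"]:
        print("alerting carer:", ctx["vitals"])
    return ctx

# the derived process model: an ordered composition of service steps
home_telecare_process: list[Callable[[dict], dict]] = [
    collect_vital_signs, assess_readings, notify_carer,
]

def execute(process, ctx=None):
    """A trivial engine that runs a process model step by step."""
    ctx = ctx or {}
    for step in process:
        ctx = step(ctx)
    return ctx

print(execute(home_telecare_process))
```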
Abstract:
Passenger experience has become a major factor in the success of an airport. In this context, passenger flow simulation has been used in designing and managing airports. However, most passenger flow simulations fail to consider group dynamics when developing passenger flow models. In this paper, an agent-based model is presented to simulate passenger behaviour during the airport check-in and evacuation processes. The simulation results show that passenger behaviour can significantly influence the performance and utilisation of services in airport terminals. The model was created using the AnyLogic software, and its parameters were initialised using recent research data published in the literature.
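A stripped-down, group-aware check-in queue of the kind such agent-based models contain is sketched below. The paper's model was built in AnyLogic; here the arrival rate, group-size distribution and service times are invented for illustration only.

```python
# Group-aware check-in queueing: a minimal agent-based sketch.
import numpy as np

rng = np.random.default_rng(5)

SIM_MIN = 240              # simulate 4 hours of arrivals (assumed)
DESKS = 4                  # open check-in desks (assumed)
ARRIVALS_PER_MIN = 1.2     # mean group arrival rate (assumed)

# generate arriving groups: Poisson arrivals, group sizes 1-4
t, groups = 0.0, []
while t < SIM_MIN:
    t += rng.exponential(1 / ARRIVALS_PER_MIN)
    groups.append((t, int(rng.integers(1, 5))))     # (arrival time, group size)

desks = [0.0] * DESKS      # time at which each desk becomes free
waits = []
for arrive, size in groups:
    i = int(np.argmin(desks))                       # group joins the shortest queue
    start = max(arrive, desks[i])
    # groups check in together: service time grows with group size
    service = rng.gamma(2.0, 1.0) + 0.8 * size
    desks[i] = start + service
    waits.extend([start - arrive] * size)           # members wait together

print(f"mean wait: {np.mean(waits):.1f} min, "
      f"95th percentile: {np.percentile(waits, 95):.1f} min")
```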
Abstract:
A bioeconomic model was developed to evaluate the potential performance of brown tiger prawn stock enhancement in Exmouth Gulf, Australia. This paper presents the framework for the bioeconomic model and a risk assessment for all components of a stock enhancement operation, i.e. hatchery, grow-out, release, population dynamics, fishery, and monitoring, for a commercial-scale enhancement of about 100 metric tonnes, a 25% increase in the average annual catch in Exmouth Gulf. The model incorporates uncertainty in parameter estimates by assigning each parameter a distribution over a plausible range, based on experiments, published data, or similar studies. Monte Carlo simulation was then used to quantify the effects of these uncertainties on the model output and on the economic potential of a particular production target. The model incorporates density-dependent effects in the nursery grounds of brown tiger prawns. The results predict that a release of 21 million 1 g prawns would produce an estimated enhanced prawn catch of about 100 t. This scale of enhancement has a 66.5% chance of making a profit. The largest contributor to the overall uncertainty in the enhanced prawn catch was post-release mortality, followed by the density-dependent mortality caused by the released prawns. These two mortality rates are the most difficult to estimate in practice and are much under-researched in stock enhancement.
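The Monte Carlo layer of such a bioeconomic model can be sketched as follows: draw the uncertain rates from assumed distributions, push each draw through a catch and profit calculation, and read off the probability of profit. All distributions and economic figures below are invented placeholders, not the paper's estimates.

```python
# Monte Carlo risk sketch of a release-to-profit calculation.
import numpy as np

rng = np.random.default_rng(6)
n_sim = 100_000

released = 21e6                                   # 1 g prawns released
post_release_surv = rng.beta(8, 12, n_sim)        # post-release survival (~40%, assumed)
dens_dep_surv = rng.uniform(0.7, 0.95, n_sim)     # density-dependent survival (assumed)
catchability = rng.uniform(0.4, 0.6, n_sim)       # fraction entering the catch (assumed)
weight_kg = rng.normal(0.03, 0.004, n_sim)        # mean harvest weight per prawn (assumed)

catch_t = released * post_release_surv * dens_dep_surv * catchability \
          * weight_kg / 1000                      # tonnes of enhanced catch

price_per_t = rng.normal(12_000, 1_500, n_sim)    # AU$/tonne (assumed)
cost = rng.normal(900_000, 150_000, n_sim)        # enhancement cost (assumed)
profit = catch_t * price_per_t - cost

print(f"median enhanced catch: {np.median(catch_t):.0f} t")
print(f"P(profit > 0): {(profit > 0).mean():.1%}")
```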