760 results for empirical data


Relevance:

60.00%

Publisher:

Abstract:

During the evolution of the music industry, developments in the media environment have required music firms to adapt in order to survive. Changes in broadcast radio programming during the 1950s, the Compact Cassette during the 1970s, and the deregulation of media ownership during the 1990s are all examples of changes which have heavily affected the music industry. This study explores similar contemporary dynamics, examines how decision makers in the music industry perceive and make sense of these developments, and reveals how they revise their business strategies based on their mental models of the media environment. A qualitative system dynamics model is developed in order to support the reasoning brought forward by the study. The model is empirically grounded, but is also based on previous music industry research and a theoretical platform constituted by concepts from evolutionary economics and the sociology of culture. The empirical data primarily consist of 36 personal interviews with decision makers in the American, British and Swedish music industrial ecosystems. The study argues that the proposed model explains contemporary music industry dynamics more effectively than the music industry models presented by previous research. Supported by the model, the study is able to show how “new” media outlets make old music business models obsolete and challenge the industry’s traditional power structures. It is no longer possible to expose music at one outlet (usually broadcast radio) in the hope that it will lead to sales of the same music at another (e.g. a compact disc). The study shows that many music industry decision makers still have not embraced the new logic, and have not yet challenged their traditional mental models of the media environment. Rather, they remain focused on preserving the pivotal role held by the CD and other physical distribution technologies. Further, the study shows that while many music firms remain attached to the old models, other firms, primarily music publishers, have accepted the transformation and have reluctantly recognised the realities of a virtualised environment.

Relevance:

60.00%

Publisher:

Abstract:

The policies and regulations governing the practice of state asset management have emerged as an urgent question among many countries worldwide, for there is heightened awareness of the complex and crucial role that state assets play in public service provision. Indonesia is an example of such a country, introducing a ‘big-bang’ reform of state asset management laws, policies, regulations, and technical guidelines. Indonesia demonstrated its enthusiasm for reforming state asset management policies and practices through the establishment of the Directorate General of State Assets in 2006. The Directorate General of State Assets has stressed the new direction it is taking state asset management laws and policies through the introduction of Republic of Indonesia Law Number 38 Year 2008, an amended regulation overruling Republic of Indonesia Law Number 6 Year 2006 on Central/Regional Government State Asset Management. Law Number 38/2008 aims to further exemplify good governance principles and puts forward a ‘highest and best use of assets’ principle in state asset management. The purpose of this study is to explore and analyze the specific influences contributing to state asset management practices, answering the question of why innovative state asset management policy implementation is stagnant. The methodology of this study is a qualitative case study approach, utilizing an empirical data sample of four Indonesian regional governments. Through a thematic analytical approach, this study provides an in-depth analysis of each factor influencing state asset management reform. The analysis suggests the potential of an ‘excuse rhetoric’, whereby the influencing factors identified are a smoke-screen, or are myths that public policy makers and implementers believe in, as a means to explain the stagnant implementation of innovative state asset management practice. Thus this study offers state asset management policy makers deeper insights into the intricate web that influences innovative state asset management policies, to be taken into consideration in future policy writing.

Relevance:

60.00%

Publisher:

Abstract:

The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but have also been exploring the way in which the event itself, and the place of the event, has the potential to create its own stories. The 2014 challenge is part of a series of data collection opportunities focussed on the game jam itself and the meaning making that the participants engage in about the event. We continued the data collection commenced in 2012: "Game jams are the creative festivals of the game development community and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft-frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the Twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism, with an on-site press room one year and an ‘embedded’ journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means to collect more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer-generated visualisations of the event, not for the purpose of formal analysis but in the service of further storytelling."
[excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 - October 1, 2013, Melbourne, VIC, Australia

Relevance:

60.00%

Publisher:

Abstract:

Public sector organisations (PSOs) operate in information-intensive environments, often within operational contexts where efficiency is a goal. Moreover, the rapid adoption of IT is expected to facilitate good governance within public sector organisations, but it often clashes with the bureaucratic culture of these organisations. Accordingly, models such as IT Governance (ITG) and government reform (in particular, New Public Management (NPM)) were introduced in PSOs in an effort to address the inefficiencies of bureaucracy and underperformance. This work explores the potential effect of changes in political direction and policy on the stability of IT governance in Australian public sector organisations. The aim of this paper is to examine the implications of a change of government, and the resulting political environment, for the effectiveness of the audit function of ITG. The empirical data discussed here indicate that a number of aspects of audit functionality were negatively affected by the change in political direction and the resultant policy changes. The results indicate a perceived decline in capacity and capability which, in turn, disrupts the stability of IT governance systems in public sector organisations.

Relevance:

60.00%

Publisher:

Abstract:

This paper investigates the outsourcing of income tax return preparation by Australian accounting firms. It identifies the extent to which firms are currently outsourcing accounting services, or considering doing so, with a focus on personal and business income tax return preparation. The motivations for and barriers to outsourcing by Australian accounting firms are also considered in this paper. Privacy, the security of client data, and the competence of the outsourcing provider's staff have been identified as risks associated with outsourcing. An expectation relating to the confidentiality of client data is also examined in this paper. Statistical analysis of data collected from a random sample of Australian accounting firms, using a survey questionnaire, provided the empirical data for the paper. The results indicate that the majority of Australian accounting firms are either currently outsourcing accounting services or considering doing so, and that firms are outsourcing taxation preparation both onshore and offshore. The results also indicate that firms expect the volume of outsourced work to increase in the future. In contrast to the literature identifying labour arbitrage as the primary driver for organisations choosing to outsource, this study found that the main factors considered by accounting firms in the decision to outsource were to expedite the delivery of services to clients and to enable the firm to focus on core competencies. Data from this study also support the literature indicating that not all tax practitioners are adhering to codes of conduct in relation to client confidentiality.
Research identifying the extent to which accounting services are outsourced is limited; therefore, significant contributions to the academic literature and the accounting profession are provided by this study.

Relevance:

60.00%

Publisher:

Abstract:

This chapter addresses the radical paucity of empirical data about the career destinations of journalism, media and communications graduates from degree programs. We report findings from a study of ten years of graduates from Queensland University of Technology’s courses in journalism, media, and communication studies, using a ‘Creative Trident’ lens to analyse individual-level survey data. The study findings engage with discussions of creative labour precarity, as well as with assertions of creative graduate oversupply suggested by national graduate outcome statistics. We describe the graduates’ employment outcomes, characterise their early career movements into and out of embedded and specialist employment, and compare the capability requirements and degree of course relevance reported by graduates employed in the different Trident segments. Given that the graduates in this study generally enjoyed very positive employment outcomes, but that there were systematic differences in reported course relevance by segment of employment and role, we also consider how university programs can best engage with the task of educating students for a surprisingly diverse range of media and communication-related occupational outcomes within and outside the creative industries.

Relevance:

60.00%

Publisher:

Abstract:

Enterprise Systems (ES) purport to bring innovation to organizations. Yet no past studies, from either the innovation or the ES discipline, have merged their knowledge to understand how ES could facilitate lifecycle-wide innovation. This study therefore forms a conceptual bridge between the two disciplines. In this research, we seek to understand how ES could facilitate innovation across the lifecycle phases. We associate classifications of innovation, such as radical vs. incremental and administrative vs. technical innovation, with the three phases of the ES lifecycle. We introduce Continuous Restrained Innovation (CRI) as a new type of innovation specific to ES, considering the restraints of technology, business processes and organization. Our empirical data collection at the implementation phase, using data from both the client and the implementation partner, shows preliminary evidence of CRI. In addition, we find that both parties consider the implementation of ES a radical innovation, yet are less interested in seeking further innovations through the system.

Relevance:

60.00%

Publisher:

Abstract:

Ulrich Beck's argument about risk society emphasises, among other things, the pervasiveness of risk. As a feature of the human condition in the contemporary, globalised world that distinguishes the present from the past, risk is widespread across society and affects all social strata. While Beck has gestured towards the irregular distribution of contemporary risks, he has nonetheless suggested that traditional structural entities (class and wealth) no longer provide the key interpretive frameworks for the calculation of susceptibility. In short, the tentacles of risk are long and almost no one is out of reach. Yet, while the risk society thesis has generated a large theoretical literature, there is very little in the way of research that marries theorising to original data collection. This paper represents an attempt to address this gap by using empirical data to investigate whether risk is more textured than Beck's account suggests. Focusing on health as a domain of risk, the paper uses data from a national sample survey of the Australian electorate to investigate the extent to which social divisions structure perceptions of risk within the general population. The findings suggest that various aspects of social stratification, such as income, occupation and education, do indeed play a role in shaping perceptions of risk.

Relevance:

60.00%

Publisher:

Abstract:

A key concept in many Information Retrieval (IR) tasks, e.g. document indexing, query language modelling, and aspect and diversity retrieval, is the relevance measurement of topics, i.e. to what extent an information object (e.g. a document or a query) is about the topics. This paper investigates the interference in the relevance measurement of one topic caused by another topic. For example, consider that two user groups are required to judge whether a topic q is relevant to a document d, and q is presented together with another topic (referred to as a companion topic). If different companion topics are used for the different groups, interestingly, different relevance probabilities of q given d can be reached. In this paper, we present empirical results showing that the relevance of a topic to a document is greatly affected by the companion topic’s relevance to the same document, and that the extent of the impact differs with respect to different companion topics. We further analyse the phenomenon from classical and quantum-like interference perspectives, and connect the phenomenon to nonreality and contextuality in quantum mechanics. We demonstrate that a quantum-like model fits the empirical data and could potentially be used to predict relevance when interference exists.
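The cross-term behaviour described above can be sketched numerically. This is only an illustration of the generic quantum-like interference formula (probabilities derived from summed amplitudes rather than summed probabilities), with made-up amplitude values; it is not the paper's fitted model:

```python
import numpy as np

# Hypothetical probability amplitudes for judging topic q relevant to
# document d via two "paths" opened up by a companion topic.
# The numeric values are purely illustrative.
a = 0.6 + 0.2j   # amplitude associated with q alone
b = 0.3 - 0.4j   # amplitude associated with the companion topic's influence

# Classical mixture: probabilities add, no interference.
p_classical = abs(a) ** 2 + abs(b) ** 2

# Quantum-like superposition: amplitudes add first, producing a
# cross (interference) term equal to 2 * Re(a * conj(b)).
p_quantum = abs(a + b) ** 2
interference = p_quantum - p_classical

print(f"classical:     {p_classical:.3f}")
print(f"quantum-like:  {p_quantum:.3f}")
print(f"interference:  {interference:.3f}")
```

When the interference term is non-zero, the observed relevance probability deviates from the classical sum, which is the kind of companion-topic effect the abstract describes.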

Relevance:

60.00%

Publisher:

Abstract:

The macroscopic fundamental diagram (MFD) traffic modelling method has been demonstrated for large urban road and freeway networks, but hysteresis and scatter have been found in both types of network. This paper investigates how incident variables affect the shape and scatter of the MFD, using both simulated data and real data collected from the M3 Pacific Motorway in Brisbane, Australia. Three key components of incidents are investigated using the simulated data: incident location, incident duration and traffic demand. The results based on simulated data indicate that the diagram's shape is a property not only of the network itself but also of the incident variables. Diagrams for three types of real incidents (crash, hazard and vehicle breakdown) are explored separately. The results based on the empirical data are consistent with the simulated results. The hysteresis phenomenon occurs both upstream and downstream of the incident location, but with opposite hysteresis loops. The gradient of the upstream diagram is greater than that of the downstream diagram at the incident site when traffic demand is at off-peak levels.
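As a rough illustration of how MFD points are commonly constructed from link-level data, the sketch below length-weights link flows and densities into network averages, one (flow, density) point per time interval. The function name, the length-weighting scheme and the synthetic numbers are assumptions for illustration, not the paper's method:

```python
import numpy as np

def mfd_points(flows, densities, lengths):
    """Network-average flow and density per time interval, weighted by
    link length -- a common way to aggregate detector data into MFD points.
    flows: (T, N) veh/h per link; densities: (T, N) veh/km per link;
    lengths: (N,) km per link.
    """
    w = lengths / lengths.sum()   # length-based weights, summing to 1
    q = flows @ w                 # weighted average flow per interval
    k = densities @ w             # weighted average density per interval
    return q, k

# Synthetic example: 3 links observed over 4 time intervals.
rng = np.random.default_rng(0)
flows = rng.uniform(200, 1800, size=(4, 3))     # veh/h
dens = rng.uniform(5, 60, size=(4, 3))          # veh/km
lengths = np.array([0.5, 1.0, 1.5])             # km

q, k = mfd_points(flows, dens, lengths)
print(q.shape, k.shape)  # one (flow, density) MFD point per interval
```

Plotting q against k for many intervals traces out the MFD; hysteresis appears as loops when loading and recovery periods follow different paths.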

Relevance:

60.00%

Publisher:

Abstract:

Anxiety disorders are increasingly acknowledged as a global health issue; however, an accurate picture of prevalence across populations is lacking. Empirical data are incomplete and inconsistent, so alternative means of estimating prevalence are required to inform estimates for the new Global Burden of Disease Study 2010. We used a Bayesian meta-regression approach which included empirical epidemiological data, expert prior information, study covariates and population characteristics. Reported are global and regional point prevalence estimates for anxiety disorders in 2010. Point prevalence of anxiety disorders differed by up to three-fold across world regions, ranging between 2.1% (1.8-2.5%) in East Asia and 6.1% (5.1-7.4%) in North Africa/Middle East. Anxiety was more common in Latin America, in high-income regions, and in regions with a history of recent conflict. There was considerable uncertainty around the estimates, particularly for regions where no data were available. Future research is required to examine whether variations in the regional distribution of anxiety disorders are substantive differences or an artefact of cultural or methodological differences. This is a particular imperative where anxiety is consistently reported to be less common, and where it appears to be elevated but uncertainty prevents the reporting of conclusive estimates.

Relevance:

60.00%

Publisher:

Abstract:

Background Although the detrimental impact of major depressive disorder (MDD) at the individual level has been described, its global epidemiology remains unclear given limitations in the data. Here we present the modelled epidemiological profile of MDD, dealing with heterogeneity in the data, enforcing internal consistency between epidemiological parameters and making estimates for world regions with no empirical data. These estimates were used to quantify the burden of MDD for the Global Burden of Disease Study 2010 (GBD 2010). Method Analyses drew on data from our existing literature review of the epidemiology of MDD. DisMod-MR, the latest version of the generic disease modelling system redesigned as a Bayesian meta-regression tool, derived prevalence by age, year and sex for 21 regions. Prior epidemiological knowledge and study- and country-level covariates adjusted sub-optimal raw data. Results There were over 298 million cases of MDD globally at any point in time in 2010, with the highest proportion of cases occurring between the ages of 25 and 34 years. Global point prevalence was very similar across time (4.4% (95% uncertainty: 4.2-4.7%) in 1990, and 4.4% (4.1-4.7%) in both 2005 and 2010), but higher in females (5.5% (5.0-6.0%)) compared to males (3.2% (3.0-3.6%)) in 2010. Regions in conflict had higher prevalence than those with no conflict. The annual incidence of an episode of MDD followed a similar age and regional pattern to prevalence but was about one and a half times higher, consistent with an average duration of 37.7 weeks. Conclusion We were able to integrate available data, including those from high-quality surveys and sub-optimal studies, into a model adjusting for known methodological sources of heterogeneity. We were also able to estimate the epidemiology of MDD in regions with no available data. This informed GBD 2010 and the public health field, providing a clearer understanding of the global distribution of MDD.
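The abstract's claim that an annual incidence about one and a half times the point prevalence is consistent with a 37.7-week average episode duration can be checked with the steady-state approximation (point prevalence ≈ incidence × mean duration in years). A small sketch using only figures from the abstract:

```python
# Internal-consistency check of the reported figures, under the
# steady-state approximation: point prevalence ~ incidence * mean duration.
WEEKS_PER_YEAR = 52.18

prevalence = 0.044          # global point prevalence of MDD (4.4%)
duration_weeks = 37.7       # reported average episode duration
duration_years = duration_weeks / WEEKS_PER_YEAR

implied_incidence = prevalence / duration_years
ratio = implied_incidence / prevalence

print(f"implied annual incidence: {implied_incidence:.3%}")
print(f"incidence / prevalence:   {ratio:.2f}")
```

The ratio works out to roughly 1.4, matching the "about one and a half times higher" wording in the abstract.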

Relevance:

60.00%

Publisher:

Abstract:

Background Up-to-date evidence on levels and trends for age-sex-specific all-cause and cause-specific mortality is essential for the formation of global, regional, and national health policies. In the Global Burden of Disease Study 2013 (GBD 2013) we estimated yearly deaths for 188 countries between 1990 and 2013. We used the results to assess whether there is epidemiological convergence across countries. Methods We estimated age-sex-specific all-cause mortality using the GBD 2010 methods, with some refinements to improve accuracy, applied to an updated database of vital registration, survey, and census data. We generally estimated cause of death as in GBD 2010. Key improvements included the addition of more recent vital registration data for 72 countries, an updated verbal autopsy literature review, two new and detailed data systems for China, and more detail for Mexico, UK, Turkey, and Russia. We improved statistical models for garbage code redistribution. We used six different modelling strategies across the 240 causes; cause of death ensemble modelling (CODEm) was the dominant strategy for causes with sufficient information. Trends for Alzheimer's disease and other dementias were informed by meta-regression of prevalence studies. For pathogen-specific causes of diarrhoea and lower respiratory infections we used a counterfactual approach. We computed two measures of convergence (inequality) across countries: the average relative difference across all pairs of countries (Gini coefficient) and the average absolute difference across countries. To summarise broad findings, we used multiple decrement life-tables to decompose probabilities of death from birth to exact age 15 years, from exact age 15 years to exact age 50 years, and from exact age 50 years to exact age 75 years, and life expectancy at birth into major causes. For all quantities reported, we computed 95% uncertainty intervals (UIs).
We constrained cause-specific fractions within each age-sex-country-year group to sum to all-cause mortality, based on draws from the uncertainty distributions. Findings Global life expectancy for both sexes increased from 65·3 years (UI 65·0–65·6) in 1990 to 71·5 years (UI 71·0–71·9) in 2013, while the number of deaths increased from 47·5 million (UI 46·8–48·2) to 54·9 million (UI 53·6–56·3) over the same interval. Global progress masked variation by age and sex: for children, average absolute differences between countries decreased but relative differences increased. For women aged 25–39 years and older than 75 years, and for men aged 20–49 years and 65 years and older, both absolute and relative differences increased. Decomposition of global and regional life expectancy showed the prominent role of reductions in age-standardised death rates for cardiovascular diseases and cancers in high-income regions, and reductions in child deaths from diarrhoea, lower respiratory infections, and neonatal causes in low-income regions. HIV/AIDS reduced life expectancy in southern sub-Saharan Africa. For most communicable causes of death both numbers of deaths and age-standardised death rates fell, whereas for most non-communicable causes demographic shifts have increased numbers of deaths but decreased age-standardised death rates. Global deaths from injury increased by 10·7%, from 4·3 million deaths in 1990 to 4·8 million in 2013, but age-standardised rates declined over the same period by 21%. For some causes with more than 100 000 deaths per year in 2013, age-standardised death rates increased between 1990 and 2013, including HIV/AIDS, pancreatic cancer, atrial fibrillation and flutter, drug use disorders, diabetes, chronic kidney disease, and sickle-cell anaemias. Diarrhoeal diseases, lower respiratory infections, neonatal causes, and malaria are still in the top five causes of death in children younger than 5 years.
The most important pathogens are rotavirus for diarrhoea and pneumococcus for lower respiratory infections. Country-specific probabilities of death over the three phases of life varied substantially between and within regions. Interpretation For most countries, the general pattern of reductions in age-sex-specific mortality has been associated with a progressive shift towards a larger share of the remaining deaths being caused by non-communicable diseases and injuries. Assessing epidemiological convergence across countries depends on whether an absolute or relative measure of inequality is used. Nevertheless, age-standardised death rates for seven substantial causes are increasing, suggesting the potential for reversals in some countries. Important gaps exist in the empirical data for cause of death estimates for some countries; for example, no national data for India are available for the past decade.
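One of the two convergence measures named in the Methods, the average relative difference across all pairs of countries, is the Gini coefficient. A minimal sketch of that computation, with invented illustrative rates rather than GBD data:

```python
import numpy as np

def gini(x):
    """Gini coefficient computed as half the mean absolute difference
    over all pairs of values, divided by the mean -- i.e. the 'average
    relative difference across all pairs of countries'."""
    x = np.asarray(x, dtype=float)
    mad = np.abs(x[:, None] - x[None, :]).mean()  # mean |xi - xj| over pairs
    return mad / (2 * x.mean())

# Illustrative death rates (per 1000 population) for five countries.
rates = [3.1, 4.7, 8.2, 12.5, 6.0]
print(round(gini(rates), 3))
```

A Gini of 0 would mean identical rates everywhere (full convergence); values nearer 1 indicate large relative inequality across countries.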

Relevance:

60.00%

Publisher:

Abstract:

The hippocampus is an anatomically distinct region of the medial temporal lobe that plays a critical role in the formation of declarative memories. Here we show that a computer simulation of simple compartmental cells organized with basic hippocampal connectivity is capable of producing stimulus-intensity-sensitive wide-band fluctuations of spectral power similar to that seen in real EEG. While previous computational models have been designed to assess the viability of the putative mechanisms of memory storage and retrieval, they have generally been too abstract to allow comparison with empirical data. Furthermore, while the anatomical connectivity and organization of the hippocampus is well defined, many questions regarding the mechanisms that mediate large-scale synaptic integration remain unanswered. For this reason we focus less on the specifics of changing synaptic weights and more on the population dynamics. Spectral power in four distinct frequency bands was derived from simulated field potentials of the computational model and found to depend on the intensity of a random input. The majority of power occurred in the lowest frequency band (3-6 Hz) and was greatest in the lowest-intensity stimulus condition (1% of the maximal stimulus). In contrast, higher frequency bands ranging from 7-45 Hz showed an increase in power directly related to the increase in stimulus intensity. This trend continues up to a stimulus level of 15% to 20% of the maximal input, above which power falls dramatically. These results suggest that the relative power of intrinsic network oscillations is dependent upon the level of activation and that above threshold levels all frequencies are damped, perhaps due to over-activation of inhibitory interneurons.
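Band-limited spectral power of the kind reported above can be computed from a simulated field potential with a plain periodogram. This sketch uses a synthetic two-component signal and invented parameters; it is not the study's analysis pipeline:

```python
import numpy as np

def band_power(signal, fs, bands):
    """Total periodogram power within each named frequency band --
    a minimal stand-in for the band-power analysis described above."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in bands.items()}

fs = 200.0                        # sampling rate in Hz (illustrative)
t = np.arange(0, 5, 1 / fs)       # 5 s of "field potential"
# Synthetic signal: strong 4 Hz (theta-band) plus weaker 20 Hz activity.
sig = 2.0 * np.sin(2 * np.pi * 4 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)

bands = {"3-6 Hz": (3, 6), "7-45 Hz": (7, 45)}
power = band_power(sig, fs, bands)
print(power)  # most power should fall in the 3-6 Hz band
```

Running the same analysis on simulated field potentials at different input intensities would reproduce the kind of band-by-band comparison the abstract reports.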

Relevance:

60.00%

Publisher:

Abstract:

A hippocampal-CA3 memory model was constructed with PGENESIS, a recently developed version of GENESIS that allows for distributed processing of a neural network simulation. A number of neural models of the human memory system have identified the CA3 region of the hippocampus as storing the declarative memory trace. However, computational models designed to assess the viability of the putative mechanisms of storage and retrieval have generally been too abstract to allow comparison with empirical data. Recent experimental evidence has shown that selective knock-out of NMDA receptors in the CA1 of mice leads to reduced stability of firing specificity in place cells. Here a similar reduction in the stability of input specificity is demonstrated in a biologically plausible neural network model of the CA3 region, under conditions of Hebbian synaptic plasticity versus an absence of plasticity. The CA3 region is also commonly associated with seizure activity. Further simulations of the same model tested the response to continuously repeating versus randomized non-repeating input patterns. Each paradigm delivered input of equal intensity and duration. Non-repeating input patterns elicited a greater pyramidal cell spike count. This suggests that repetitive versus non-repeating neocortical input has a quantitatively different effect on the hippocampus. This may be relevant to the production of independent epileptogenic zones and the process of encoding new memories.
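A toy illustration of the Hebbian mechanism invoked above: repeatedly presenting one input pattern strengthens only its synapses, so the cell's response becomes stable and selective for that pattern. This is a deliberately abstract rate-based sketch with invented parameters, not the compartmental PGENESIS model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_inputs = 20
w = rng.uniform(0.0, 0.1, n_inputs)                 # initial synaptic weights
pattern = (np.arange(n_inputs) < 5).astype(float)   # the trained input pattern

eta = 0.05                                          # learning rate
for _ in range(100):
    pre = pattern                                   # presynaptic activity
    post = w @ pre                                  # simple linear response
    w += eta * post * pre                           # Hebb rule: dw ~ pre * post
    w = np.clip(w, 0.0, 1.0)                        # bound weights (no runaway)

resp_pattern = w @ pattern                          # response to trained input
resp_other = w @ (1 - pattern)                      # response to untrained input
print(resp_pattern, resp_other)  # response is selective for the trained pattern
```

With plasticity switched off (eta = 0) the response would stay at its weak initial level, mirroring the stability difference between the plastic and non-plastic conditions described above.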