723 results for Business Intelligence, ETL, Data Warehouse, Metadati, Reporting


Relevance: 30.00%

Publisher:

Abstract:

Big data analysis in the healthcare sector is still in its early stages compared with other business sectors, for numerous reasons. Key challenges include accommodating the volume, velocity and variety of healthcare data, and identifying platforms that can examine data from multiple sources, such as clinical records, genomic data, financial systems, and administrative systems. The Electronic Health Record (EHR) is a key information resource for big data analysis and is also composed of varied co-created values. Successful integration and crossing of different subfields of healthcare data, such as biomedical informatics and health informatics, could lead to huge improvements for the end users of the healthcare system, i.e. the patients.

Relevance: 30.00%

Publisher:

Abstract:

The concept of big data has already outperformed traditional data management efforts in almost all industries. In other instances it has delivered promising results by extracting value from the large-scale integration and analysis of heterogeneous data sources, for example genomic and proteomic information. Big data analytics, which describes data sets and analytical techniques in software applications that are extremely large and complex, has become increasingly important because of its significant advantages, including better business decisions, cost reduction and the delivery of new products and services [1]. In a similar context, the health community has experienced not only more complex and larger data content, but also information systems that contain a large number of data sources with interrelated and interconnected data attributes. This has resulted in challenging and highly dynamic environments, leading to the creation of big data with its innumerable complexities, for instance the sharing of information under the security requirements expected by stakeholders. Compared with other sectors, big data analysis in the health sector is still in its early stages. Key challenges include accommodating the volume, velocity and variety of healthcare data in the current deluge of exponential growth. Given the complexity of big data, and while data storage and accessibility are technically manageable, applying Information Accountability measures to healthcare big data may be a practical way to support information security, privacy and traceability. Transparency is one important measure that can demonstrate integrity, a vital factor in healthcare services. Clarity about performance expectations is another Information Accountability measure, necessary to avoid data ambiguity, controversy about interpretation and, finally, liability [2]. According to current studies, Electronic Health Records (EHRs) are key information resources for big data analysis and are also composed of varied co-created values [3]. Common healthcare information originates from, and is used by, different actors and groups, which facilitates understanding of its relationship to other data sources. Consequently, healthcare services often operate as an integrated service bundle. Although a critical requirement in healthcare services and analytics, a comprehensive set of guidelines for adopting the EHR to fulfil big data analysis requirements is difficult to find. As a remedy, this research focuses on a systematic approach containing comprehensive guidelines for providing the accurate data needed to apply and evaluate big data analysis until the necessary decision-making requirements are fulfilled, thereby improving the quality of healthcare services. Hence, we believe that this approach would subsequently improve quality of life.

Relevance: 30.00%

Publisher:

Abstract:

Digital technology offers enormous benefits (economic, quality of design and efficiency in use) if adopted to implement integrated ways of representing the physical world in a digital form. When applied across the full extent of the built and natural world, it is referred to as the Digital Built Environment (DBE) and encompasses a wide range of approaches and technology initiatives, all aimed at the same end goal: the development of a virtual world that sufficiently mirrors the real world to form the basis for the smart cities of the present and future, enable efficient infrastructure design and programmed maintenance, and create a new foundation for economic growth and social well-being through evidence-based analysis. The creation of a National Data Policy for the DBE will facilitate the creation of additional high-technology industries in Australia; provide governments, industries and citizens with greater knowledge of the environments they occupy and plan; and offer citizen-driven innovations for the future. Australia has slipped behind other nations in the adoption and execution of Building Information Modelling (BIM), and the principal concern is that the gap is widening. Data-driven innovation added $67 billion to the Australian economy in 2013 [1]. Strong open data policy equates to $16 billion in new value [2]. Australian Government initiatives such as the Digital Earth-inspired "National Map" offer a platform and pathway to embrace the concept of a "BIM Globe", while also leveraging unprecedented growth in open source / open data collaboration. Australia must address the challenges by learning from international experiences, most notably the UK and NZ, and mandate the use of BIM across Government, extending the Framework for Spatial Data Foundation to include the Built Environment as a theme and engaging collaboration through a "BIM Globe" metaphor. This proposed DBE strategy will modernise Australian urban planning and the construction industry. It will change the way we develop our cities by fundamentally altering the dynamics and behaviours of the supply chains and unlocking new and more efficient ways of collaborating at all stages of the project life-cycle. There are currently two major modelling approaches that contribute to the challenge of delivering the DBE. Though these collectively encompass many (often competing) approaches or proprietary software systems, all can be categorised as either a spatial modelling approach, where the focus is generally on representing the elements that make up the world within their geographic context, or a construction modelling approach, where the focus is on models that support the life-cycle management of the built environment. These two approaches have tended to evolve independently, addressing two broad industry sectors: one concerned with understanding and managing global and regional aspects of the world that we inhabit, including disciplines concerned with climate, earth sciences, land ownership, urban and regional planning and infrastructure management; the other concerned with the planning, design, construction and operation of built facilities, including architectural and engineering design, product manufacturing, construction, facility management and related disciplines (a process/technology commonly known as Building Information Modelling, BIM).
The spatial industries have a strong voice in the development of public policy in Australia, while the construction sector, which in 2014 accounted for around 8.5% of Australia's GDP [3], has no single voice and, because of its diversity, is struggling to adapt to and take advantage of the opportunity presented by these digital technologies. The experience in the UK over the past few years has demonstrated that government leadership is very effective in stimulating industry adoption of digital technologies: on the one hand, mandating the use of BIM on public procurement projects while, at the same time, providing comparatively modest funding to address the common issues that confront the industry in adopting that way of working across the supply chain. The reported result has been savings of £840m in construction costs in 2013/14, according to UK Cabinet Office figures [4]. There is worldwide recognition of the value of bringing these two modelling technologies together. Australia has the expertise to exercise leadership in this work, but it requires a commitment by government to recognise the importance of BIM as a companion methodology to the spatial technologies so that these two disciplinary domains can cooperate in the development of data policies and information exchange standards to smooth out common workflows. buildingSMART Australasia, SIBA and their academic partners have initiated this dialogue in Australia and wish to work collaboratively, with government support and leadership, to explore the opportunities open to us as we develop an Australasian Digital Built Environment. As part of that programme, we must develop and implement a strategy to accelerate the adoption of BIM processes across the Australian construction sector while, at the same time, developing an integrated approach in concert with the spatial sector that will position Australia at the forefront of international best practice in this area. Australia and New Zealand cannot afford to be on the back foot as we face the challenges of rapid urbanisation and change in the global environment. Although we can identify some exemplary initiatives in this area, particularly in New Zealand in response to the need for more resilient urban development in the face of earthquake threats, there is still much that needs to be done. We are well situated in the Asian region to take a lead in this challenge, but we are at imminent risk of losing the initiative if we do not take action now. Strategic collaboration between governments, industry and academia will create new jobs and wealth, with the potential, for example, to save around 20% on the delivery costs of new built assets, based on recent UK estimates.

Relevance: 30.00%

Publisher:

Abstract:

In recent years accounting education has seen numerous changes to the way financial accounting is taught. These changes reflect the demands of an ever-changing business world, opportunities created by new technology and instructional technologies, and an increased understanding of how students learn. The foundation of Financial Accounting is based on a number of unique principles and innovations in accounting education. The objective of Financial Accounting is to provide students with an understanding of those concepts that are fundamental to the preparation and use of accounting information. Most students will forget procedural details within a short period of time. On the other hand, concepts, if well taught, should be remembered for a lifetime. Concepts are especially important in a world where the details are constantly changing. Students learn best when they are actively engaged. The overriding pedagogical objective of Financial Accounting is to provide students with continual opportunities for active learning. One of the best tools for active learning is strategically placed questions. Discussions are framed by questions, often beginning with rhetorical questions and ending with review questions, and our analytical devices, called decision-making toolkits, use key questions to demonstrate the purpose of each.

Relevance: 30.00%

Publisher:

Abstract:

The rapid increase in migration into host countries and the growth of immigrant-owned business enterprises have revitalized research on ethnic business. Does micro (individual)-level social capital or meso (group)-level location within the ethnic enclave lead to immigrant business growth? Or are both needed? We analyze quantitative data collected from 110 Chinese restaurants in Australia, a major host country. At the micro level we find that coethnic (same ethnic group) networks are critical to the growth of an immigrant entrepreneur's business, particularly in the early years. But non-coethnic (different ethnic group) social capital only has a positive impact on business growth for immigrant businesses outside the ethnic enclave. Our findings are relevant not only to host-country policymakers, but also to future immigrant business owners and ethnic community leaders trying to better understand how to promote healthy communities and sustainable economic growth.
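The moderation finding reported above (non-coethnic social capital helping growth only outside the enclave) is the kind of effect normally tested with an interaction term. The following is a minimal sketch in Python/statsmodels; the variable names (growth, coethnic_ties, noncoethnic_ties, outside_enclave) and the toy data are illustrative assumptions, not the study's actual measures or model.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical toy data: business growth, counts of co-ethnic and
    # non-co-ethnic ties, and a 0/1 flag for location outside the enclave.
    df = pd.DataFrame({
        "growth":           [0.10, 0.25, 0.05, 0.30, 0.12, 0.40, 0.08, 0.35],
        "coethnic_ties":    [5, 8, 3, 9, 4, 10, 2, 7],
        "noncoethnic_ties": [1, 2, 1, 6, 2, 7, 1, 5],
        "outside_enclave":  [0, 0, 0, 1, 0, 1, 0, 1],
    })

    # The interaction term lets the effect of non-co-ethnic ties differ by
    # enclave location; a "positive only outside the enclave" pattern shows
    # up as a main effect near zero plus a positive interaction coefficient.
    model = smf.ols(
        "growth ~ coethnic_ties + noncoethnic_ties * outside_enclave",
        data=df,
    ).fit()
    print(model.params)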

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVE: The present study evaluates the prehospital care of paediatric burn patients in Queensland (QLD). As first aid (FA) treatment has been shown to affect burn progression and outcome, the FA treatment and the risk of associated hypothermia in paediatric patients were specifically examined in the context of paramedic management of burn patients. METHODS: Data were retrospectively collected from electronic ambulance response forms (eARFs) for paediatric burn patients (0-5 years) who were attended by the Queensland Ambulance Service (QAS) from 2008 to 2010. Data were collected from 117 eARFs of incidents occurring within the Brisbane, Townsville and Cairns regions. RESULTS: Initial FA measures were recorded in 77.8% of cases, with cool running water FA administered in 56.4% of cases. The duration of FA was recorded in 29.9% of reports. The duration of FA was significantly shorter for patients in Northern QLD (median = 10 min, n = 10) compared with Brisbane (median = 15 min, n = 18), P = 0.005. Patient temperatures were recorded significantly more often in Brisbane than in other regions (P = 0.041); however, in total, only 24.8% of all patients had documented temperature readings. Of these, six (5%) were recorded as having temperatures ≤ 36.0°C. Burnaid™ was the most commonly used dressing and was applied to 55.6% of all patients; however, it was applied with a variety of different outer dressings. Brisbane paramedics applied Burnaid significantly less often (44.3%) than paramedics from Northern QLD (72.7%) and Far Northern QLD (60.9%), P = 0.025. CONCLUSIONS: Despite FA and patient temperatures being important prognostic factors for burn patients, paramedic documentation of these was often incomplete, and there was no consistent use of burns dressings.

Relevance: 30.00%

Publisher:

Abstract:

This paper explores consumer behavioural patterns on a magazine website. Using a unique dataset of real-life clickstream data from 295 magazine website visitors, individual sessions are grouped according to the different sections visited on the website. Interesting behavioural patterns emerge: most importantly, 86% of all sessions visit only the blogs. This means that these visitors are not exposed to any editorial content at all, and also choose to avoid commercial content. Sessions visiting editorial content, commercial content or social media links are very few in number (each 1% or less of all sessions), thus giving only very limited support to the magazine business model. We note that consumer behaviour on the magazine website appears to be goal-oriented and instrumental rather than exploratory and ritualized. This paper contributes to the current knowledge of media management by shedding light on consumer behaviour on media websites and by highlighting the challenges of current media business models. From a more practical perspective, our data question the general assumption that online platforms support the print business.
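Grouping sessions by the set of site sections they touch, as described above, reduces to a simple aggregation over the clickstream. The following is a minimal sketch in Python/pandas under assumed column names (session_id, section) and toy data; it illustrates the kind of computation involved, not the study's actual pipeline.

    import pandas as pd

    # Hypothetical clickstream: one row per page view, with the session it
    # belongs to and the site section viewed. Column names are assumptions.
    clicks = pd.DataFrame({
        "session_id": [1, 1, 2, 3, 3, 3, 4],
        "section":    ["blog", "blog", "blog", "editorial", "blog", "commercial", "blog"],
    })

    # Collect the set of sections visited within each session.
    sections_per_session = clicks.groupby("session_id")["section"].agg(set)

    # Share of sessions that saw only blog content and therefore bypassed
    # both editorial and commercial pages.
    blog_only_share = sections_per_session.apply(lambda s: s == {"blog"}).mean()
    print(f"blog-only sessions: {blog_only_share:.0%}")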

Relevance: 30.00%

Publisher:

Abstract:

Background: Schizophrenia is associated with lower pre-morbid intelligence (IQ) in addition to (pre-morbid) cognitive decline. Both schizophrenia and IQ are highly heritable traits. Therefore, we hypothesized that genetic variants associated with schizophrenia, including copy number variants (CNVs) and a polygenic schizophrenia (risk) score (PSS), may influence intelligence. Method: IQ was estimated with the Wechsler Adult Intelligence Scale (WAIS). CNVs were determined from single nucleotide polymorphism (SNP) data using the QuantiSNP and PennCNV algorithms. For the PSS, odds ratios for genome-wide SNP data were calculated in a sample collected by the Psychiatric Genome-Wide Association Study (GWAS) Consortium (8690 schizophrenia patients and 11,831 controls). These were used to calculate individual PSSs in our independent sample of 350 schizophrenia patients and 322 healthy controls. Results: Although significantly more genes were disrupted by deletions in schizophrenia patients compared to controls (p = 0.009), there was no effect of CNV measures on IQ. The PSS was associated with disease status (R² = 0.055, p = 2.1 × 10⁻⁷) and with IQ in the entire sample (R² = 0.018, p = 0.0008), but the effect on IQ disappeared after correction for disease status. Conclusions: Our data suggest that rare and common schizophrenia-associated variants do not explain the variation in IQ in healthy subjects or in schizophrenia patients. Thus, reductions in IQ in schizophrenia patients may be secondary to other processes related to schizophrenia risk.
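The polygenic schizophrenia score described above is, in essence, a weighted allele count: each SNP's risk-allele dosage in the target individual is weighted by the log odds ratio estimated in the discovery GWAS, and the weighted dosages are summed. The following is a minimal sketch with simulated inputs; the array names, sizes, SNP-selection step and averaging convention are assumptions rather than the paper's exact procedure.

    import numpy as np

    rng = np.random.default_rng(0)
    n_snps, n_people = 1000, 5

    # Hypothetical discovery-sample weights: per-SNP log odds ratios from a GWAS.
    log_or = rng.normal(0.0, 0.05, size=n_snps)

    # Hypothetical target-sample genotypes: risk-allele dosages in {0, 1, 2}.
    dosages = rng.integers(0, 3, size=(n_people, n_snps))

    # Polygenic score: dosage-weighted sum of log odds ratios, averaged over
    # the number of SNPs included in the score.
    pss = dosages @ log_or / n_snps
    print(pss)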

Relevance: 30.00%

Publisher:

Abstract:

This article describes a maximum likelihood method for estimating the parameters of the standard square-root stochastic volatility model and a variant of the model that includes jumps in equity prices. The model is fitted to data on the S&P 500 Index and the prices of vanilla options written on the index, for the period 1990 to 2011. The method is able to estimate both the parameters of the physical measure (associated with the index) and the parameters of the risk-neutral measure (associated with the options), including the volatility and jump risk premia. The estimation is implemented using a particle filter whose efficacy is demonstrated under simulation. The computational load of this estimation method, which previously has been prohibitive, is managed by the effective use of parallel computing using graphics processing units (GPUs). The empirical results indicate that the parameters of the models are reliably estimated and consistent with values reported in previous work. In particular, both the volatility risk premium and the jump risk premium are found to be significant.
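For reference, the square-root stochastic volatility model with jumps in equity prices is conventionally written as below. This is a standard Heston/Bates-type formulation under the physical measure, given here as a sketch; the paper's exact parameterisation (for example, jump compensation in the drift or the jump-size distribution) may differ.

    \frac{dS_t}{S_t} = \mu \, dt + \sqrt{v_t} \, dW_t^{(1)} + \bigl(e^{J} - 1\bigr) \, dN_t
    dv_t = \kappa (\theta - v_t) \, dt + \sigma_v \sqrt{v_t} \, dW_t^{(2)}, \qquad \operatorname{corr}\bigl(dW_t^{(1)}, dW_t^{(2)}\bigr) = \rho

Here N_t is a Poisson process with intensity \lambda and J is the random jump size; the risk-neutral dynamics take the same form with the drift and jump parameters adjusted by the volatility and jump risk premia. In a particle-filter implementation of the estimation, the latent variance v_t is the filtered state and the observed index returns (and option prices) enter through the measurement density.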

Relevance: 30.00%

Publisher:

Abstract:

Big Data and predictive analytics have received significant attention from the media and academic literature throughout the past few years, and it is likely that these emerging technologies will materially impact the mining sector. This short communication argues, however, that these technological forces will probably unfold differently in the mining industry than they have in many other sectors because of significant differences in the marginal cost of data capture and storage. To this end, we offer a brief overview of what Big Data and predictive analytics are, and explain how they are bringing about changes in a broad range of sectors. We discuss the “N=all” approach to data collection being promoted by many consultants and technology vendors in the marketplace but, by considering the economic and technical realities of data acquisition and storage, we then explain why an “n ≪ all” data collection strategy probably makes more sense for the mining sector. Finally, towards shaping the industry’s policies with regard to technology-related investments in this area, we conclude by putting forward a conceptual model for leveraging Big Data tools and analytical techniques that is a more appropriate fit for the mining sector.

Relevance: 30.00%

Publisher:

Abstract:

Over recent years, the focus in road safety has shifted towards a greater understanding of serious road crash injuries in addition to fatalities. Police-reported crash data are often the primary source of crash information; however, the definition of serious injury within these data is not consistent across jurisdictions and may not be accurately operationalised. This study examined the linkage of police-reported road crash data with hospital data to explore the potential for linked data to enhance the quantification of serious injury. Data from the Queensland Road Crash Database (QRCD), the Queensland Hospital Admitted Patients Data Collection (QHAPDC), the Emergency Department Information System (EDIS), and the Queensland Injury Surveillance Unit (QISU) for the year 2009 were linked. Nine different estimates of serious road crash injury were produced. Results showed that there was a large amount of variation in the estimated number and profile of serious road crash injuries depending on the definition or measure used. The results also showed that as the definition of serious injury becomes more precise, vulnerable road users become more prominent. These results have major implications for how serious injuries are identified for reporting purposes. Depending on the definitions used, the calculation of cost and the understanding of the impact of serious injuries would vary greatly. This study has shown how data linkage can be used to investigate issues of data quality. It has also demonstrated the potential improvements to the understanding of the road safety problem, particularly serious injury, that can be gained by conducting data linkage.
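At its simplest, the linkage described above joins police-reported crash casualties to hospital records for the same person and compares the police severity classification with hospital evidence of injury. The following is a minimal sketch in Python/pandas; the field names, the deterministic person_id key and the toy values are assumptions (in practice, linkage of QRCD, QHAPDC, EDIS and QISU data typically relies on probabilistic matching across several identifiers).

    import pandas as pd

    # Hypothetical police-reported crash casualties.
    crashes = pd.DataFrame({
        "person_id": [101, 102, 103, 104],
        "police_severity": ["serious", "minor", "serious", "minor"],
    })

    # Hypothetical hospital admissions with an injury diagnosis code.
    admissions = pd.DataFrame({
        "person_id": [101, 103, 105],
        "icd_injury_code": ["S72.0", "S06.5", "S52.5"],
    })

    # Link the two sources and cross-tabulate the police severity flag
    # against whether the casualty appears in the hospital data.
    linked = crashes.merge(admissions, on="person_id", how="left")
    admitted = linked["icd_injury_code"].notna()
    print(pd.crosstab(linked["police_severity"], admitted))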

Relevance: 30.00%

Publisher:

Abstract:

The method of generalized estimating equations (GEEs) provides consistent estimates of the regression parameters in a marginal regression model for longitudinal data, even when the working correlation model is misspecified (Liang and Zeger, 1986). However, the efficiency of a GEE estimate can be seriously affected by the choice of the working correlation model. This study addresses this problem by proposing a hybrid method that combines multiple GEEs based on different working correlation models, using the empirical likelihood method (Qin and Lawless, 1994). Analyses show that this hybrid method is more efficient than a GEE using a misspecified working correlation model. Furthermore, if one of the working correlation structures correctly models the within-subject correlations, then this hybrid method provides the most efficient parameter estimates. In simulations, the hybrid method's finite-sample performance is superior to a GEE under any of the commonly used working correlation models and is almost fully efficient in all scenarios studied. The hybrid method is illustrated using data from a longitudinal study of the respiratory infection rates in 275 Indonesian children.
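For context, the GEE estimator of the marginal regression parameter \beta solves the standard estimating equation below (Liang and Zeger, 1986); the hybrid method described above can be read as combining several such equations, one per candidate working correlation structure, through empirical likelihood weights.

    \sum_{i=1}^{n} D_i^{\top} V_i^{-1} \bigl\{ Y_i - \mu_i(\beta) \bigr\} = 0, \qquad V_i = A_i^{1/2} R_i(\alpha) A_i^{1/2}

Here D_i = \partial \mu_i / \partial \beta^{\top}, A_i is the diagonal matrix of marginal variances and R_i(\alpha) is the working correlation matrix. Consistency of the resulting estimate does not require R_i(\alpha) to be correctly specified, but efficiency does, which is the gap the empirical-likelihood combination targets.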

Relevance: 30.00%

Publisher:

Abstract:

Background: Contemporary Finnish, spoken and written, reveals loanwords or foreignisms in the form of hybrids: a mixture of Finnish and foreign syllables (alumiinivalua). Sometimes loanwords are inserted into the Finnish sentence in their raw form, just as they are found in the source language (pulp, after sales palvelu). Again, sometimes loanwords are calques, which appear Finnish but are spelled and pronounced in an altogether foreign manner (Protomanageri, Promenadi kampuksella).

Research Questions: What role does Finnish business translation play in the migration of foreignisms into Finnish if we consider translation "as a construct of solutions determined by the ideological constraints and conflicts characterizing the target culture" (Robyns 1992: 212)? What attitudes do the Finns display toward the presence of foreignisms in their language? What socio-economic or ideological conditions (Bassnett 1994: 321) are responsible for these attitudes? Are these conditions dynamic? What tools can be used to measure such attitudes? This dissertation set out to answer these and similar questions. Attitudes are imperialist (where otherness is both denied and transformed), defensive (where otherness is acknowledged, transformed, and vilified), transdiscursive (a neutral attitude to both otherness and transformation), or finally defective (where alien migration is acknowledged and "stimulated") (Robyns 1994: 60).

Methodology: The research method follows Rose's schema (1984: 8): (a) take an existing theory, (b) develop from it a proposition specific enough to be tested, (c) devise a scheme that tests this proposition, (d) carry through the scheme in practice, (e) draw up results and discuss conclusions in relation to the original theory. In other words, the method attempts an explanation of a Finnish social phenomenon based on systematic analyses of translated evidence (Lewins 1992: 4), whereby what really matters is the logical sequence that connects the empirical data to the initial research questions raised above and, ultimately, to its conclusion (Yin 1984: 29).

Results: This research found that the Finnish translators of the Nokia annual reports used a foreignism whenever possible, such as komponentin instead of rakenneosa, or investoida instead of sijoittaa, often without any apparent justification (Pryce 2003: 203-12) beyond the translator's personal preference. In the old documents (minutes of meetings of the Board of Directors of Osakeyhtio H. Saastamoinen, Ltd. dated 5 July 1912-1917, a NOPSA booklet (1932), an Enzo-Gutzeit-Tornator Oy document (1938), the Imatra Steel Oy Annual Report 1964, and the Nokia Oy Annual Report 1946), foreignisms under Haugen's (1950: 210-31) Classification #1 occurred an average of 0.6 times, while in the new documents (the translated Nokia 1998 Annual Reports) they occurred an average of 6.5 times. That big difference suggests transdiscursive and defective attitudes in Finnish society toward the other. In the 1850s, Finnish attitudes toward alien persons and cultures were hardened, intolerant and prohibitive because language politics were both nascent and emerging, and Finns adopted a defensive stance (Paloposki 2002: 102 ff.) to protect their cultural and national treasures such as language and folklore.

Innovation: The innovation here is that no prior doctoral-level research has measured Finnish attitudes toward foreignisms using a business translation approach. This is the first time that Haugen's classification has been modified and applied in target-language analysis. It is hoped that this method will be replicated in similar research in the future.

Applications: For practical applications, researchers with an interest in languages, language development, language influences, language ideologies, and the power structures that affect national language policies will find this thesis useful, especially the model for collecting, grouping, and analyzing foreignisms that has been demonstrated here. It is intended to document for posterity the current attitudes of Finns toward the other as revealed in business translations from 1912-1964 and in 1998. This way, future language researchers will be able to explore a timeline of Finnish language development and attitudes toward the other. Communication firms may also find this research interesting. In future, could the model we adopted be used to analyze literary or religious texts, for example?

Future Trends: Though business documents show transdiscursive attitudes, other segments of Finnish society may show defensive or imperialist attitudes. When the ideology of industrialization changes in the future, will Finnish attitudes toward the other change as well? Will it then be possible to use the same kind of analytical tools to measure Finnish attitudes? More broadly, will linguistic change continue in the same direction of transdiscursive attitudes, or will the change slow down or even reverse into xenophobic attitudes? Is this model culture-specific, or can it be used in the context of other cultures?

Conclusion: There is anger against foreignisms in Finland, as newspaper publications and television broadcasts show, but research shows that a majority of Finns consider foreignisms, and the languages from which they come, as sources of enrichment for Finnish culture (Laitinen 2000; Eurobarometer series 41 of July 1994, 44 of Spring 1996, 50 of Autumn 1998). Ideologies of industrialization and globalization in Finland have facilitated transdiscursive tendencies. When Finland's political ideology was intolerant toward foreign influences in the 1850s, because Finland was in the process of consolidating her nascent country and language, attitudes toward the importation of loanwords were also intolerant. Presently, as industrialization and globalization have become the dominant ideologies, we see a shift in attitudes toward transdiscursive tendencies. Ideology is usually unseen and too often ignored by translation researchers. However, ideology reveals itself as the most powerful factor affecting language attitudes in a target culture.

Key words: Finnish, Business Translation, Ideology, Foreignisms, Imperialist Attitudes, Defensive Attitudes, Transdiscursive Attitudes, Defective Attitudes, the Other, Old Documents, New Documents.

Relevance: 30.00%

Publisher:

Abstract:

Earlier studies have shown that the speed of information transmission developed radically during the 19th century. The fast development was mainly due to the change from sailing ships and horse-driven coaches to steamers and railways, as well as the telegraph. Speed of information transmission has normally been measured by calculating the duration between writing and receiving a letter, or between an important event and the time when the news was published elsewhere. As overseas mail was generally carried by ships, the history of communications and maritime history are closely related. This study also brings a postal historical aspect to the academic discussion. Additionally, there is another new aspect included. In business enterprises, information flows generally consisted of multiple transactions. Although fast one-way information was often crucial, e.g. news of a changing market situation, at least equally important was that there was a possibility to react rapidly. To examine the development of business information transmission, the duration of mail transport has been measured by a systematic and commensurable method, using consecutive information circles per year as the principal tool for measurement. The study covers a period of six decades, several of the world's most important trade routes and different mail-carrying systems operated by merchant ships, sailing packets and several nations' steamship services. The main sources have been the sailing data of mail-carrying ships and correspondence of several merchant houses in England. As the world's main trade routes had their specific historical backgrounds with different businesses, interests and needs, the systems for information transmission did not develop similarly or simultaneously. It was a process lasting several decades, initiated by the idea of organizing sailings in a regular line system. The evolution proceeded generally as follows: originally there was a more or less irregular system, then a regular system and finally a more frequent regular system of mail services. The trend was from sail to steam, but both these means of communication improved following the same scheme. Faster sailings alone did not radically improve the number of consecutive information circles per year, if the communication was not frequent enough. Neither did improved frequency advance the information circulation if the trip was very long or if the sailings were overlapping instead of complementing each other. The speed of information transmission could be improved by speeding up the voyage itself (technological improvements, minimizing the waiting time at ports of call, etc.) but especially by organizing sailings so that the recipients had the possibility to reply to arriving mails without unnecessary delay. It took two to three decades before the mail-carrying shipping companies were able to organize their sailings in an optimal way. Strategic shortcuts over isthmuses (e.g. Panama, Suez) together with the cooperation between steamships and railways enabled the most effective improvements in global communications before the introduction of the telegraph.

Relevance: 30.00%

Publisher:

Abstract:

The pharaoh cuttle Sepia pharaonis Ehrenberg, 1831 (Mollusca: Cephalopoda: Sepiida) is a broadly distributed species of substantial fisheries importance found from east Africa to southern Japan. Little is known about S. pharaonis phylogeography, but evidence from morphology and reproductive biology suggests that Sepia pharaonis is actually a complex of at least three species. To evaluate this possibility, we collected tissue samples from Sepia pharaonis from throughout its range. Phylogenetic analyses of partial mitochondrial 16S sequences from these samples reveal five distinct clades: a Gulf of Aden/Red Sea clade, a northern Australia clade, a Persian Gulf/Arabian Sea clade, a western Pacific clade (Gulf of Thailand and Taiwan) and an India/Andaman Sea clade. Phylogenetic analyses including several Sepia species show that S. pharaonis sensu lato may not be monophyletic. We suggest that "S. pharaonis" may consist of up to five species, but additional data will be required to fully clarify relationships within the S. pharaonis complex.