900 results for Google Analytics


Relevance: 10.00%

Publisher:

Abstract:

Materials, methods and systems are provided for the purification, filtration and/or separation of certain molecules, such as biomolecules of certain sizes. Certain embodiments relate to supports containing at least one polymethacrylate polymer engineered to have certain pore diameters and other properties, and which can be functionally adapted for certain purifications, filtrations and/or separations. Biomolecules are selected from a group consisting of: polynucleotide molecules; oligonucleotide molecules, including antisense oligonucleotide molecules such as antisense RNA and other oligonucleotide molecules that are inhibitory of gene function, such as small interfering RNA (siRNA); polypeptides, including proteinaceous infective agents such as prions, for example the infectious agent for CJD; and infectious agents such as viruses and phage.

Relevance: 10.00%

Publisher:

Abstract:

Due to their unobtrusive nature, vision-based approaches to tracking sports players have been preferred over wearable sensors, as they do not require the players to be instrumented for each match. However, due to heavy occlusion between players, variation in resolution and pose, and fluctuating illumination conditions, tracking players continuously is still an unsolved vision problem. For tasks like clustering and retrieval, noisy data (i.e. missing and false player detections) is problematic as it generates discontinuities in the input data stream. One method of circumventing this issue is to use an occupancy map, where the field is discretised into a series of zones and a count of player detections in each zone is obtained. A series of frames can then be concatenated to represent a set-play or example of team behaviour. A problem with this approach, though, is that compressibility is low (i.e. the variability in the feature space is very high). In this paper, we propose a bilinear spatiotemporal basis model with a role representation, operating in a low-dimensional space, to clean up the noisy detections. To evaluate our approach, we used a fully instrumented field-hockey pitch with 8 fixed high-definition (HD) cameras, ran a state-of-the-art real-time player detector on approximately 200,000 frames of data, and compared the results to manually labeled data.
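
The occupancy-map representation described above is straightforward to compute. The following minimal Python sketch discretises a pitch into zones, counts per-frame player detections, and concatenates several frames into a single set-play descriptor; the pitch dimensions, zone grid and toy detections are illustrative assumptions, not values from the paper.

    import numpy as np

    def occupancy_map(detections, field_length=91.4, field_width=55.0,
                      n_zones_x=10, n_zones_y=6):
        # Count player detections per zone for one frame.
        # `detections` is an iterable of (x, y) positions in metres;
        # pitch size and zone grid here are illustrative assumptions.
        grid = np.zeros((n_zones_y, n_zones_x), dtype=int)
        for x, y in detections:
            ix = min(int(x / field_length * n_zones_x), n_zones_x - 1)
            iy = min(int(y / field_width * n_zones_y), n_zones_y - 1)
            grid[iy, ix] += 1
        return grid

    # Concatenate per-frame maps into a single set-play feature vector.
    frames = [[(10.2, 20.1), (45.0, 30.5)], [(11.0, 21.0), (46.1, 29.8)]]
    descriptor = np.concatenate([occupancy_map(f).ravel() for f in frames])
    print(descriptor.shape)  # (120,) = 2 frames x 10 x 6 zones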

Relevance: 10.00%

Publisher:

Abstract:

Background Definitive cisplatin-based chemoradiation is increasingly delivered as the treatment of choice for patients with head and neck cancer. Sensorineural hearing loss is a significant long term side effect of cisplatin-based chemoradiation and is associated with potential major quality of life issues for patients. Purpose The purpose of this manuscript was to review the mechanism behind sensorineural hearing loss in patients treated with cisplatin-based chemoradiation, including its incidence, the contributions of radiotherapy and cisplatin to sensorineural hearing loss, and the impact of the toxicity on patient quality of life. Methods Database searches were conducted through PubMed (National Centre for Biotechnology Information) and OvidSP Medline via the Queensland University of Technology Library website. General article searches were conducted through the online search engine Google Scholar. Articles were excluded if the full text was unavailable, if they were not in English, or if they were published prior to 1990. Keywords included hearing loss, ototoxicity, cancer, quality of life, cisplatin and radiotherapy. Results/Discussion The total number of journal articles accessed was 290. After applying the exclusion criteria, 129 articles were deemed appropriate for review. Findings indicated that sensorineural hearing loss is a significant, long term complication for patients treated with cisplatin-based chemoradiation. Current literature recognises the ototoxic effects of cisplatin and cranial irradiation as separate entities; however, the impact of combined modality therapy on sensorineural hearing loss is seldom reported. Multiple risk factors for hearing loss are described; however, there are contradictory opinions on incidence and severity and on the exact radiation dose threshold responsible for inducing hearing loss in patients receiving combined modality therapy. Sensorineural hearing loss creates a further set of complexities for patients with head and neck cancer, and these patients face significant quality of life impairment. Conclusion The literature review identified that sensorineural hearing loss is a major quality of life issue for patients treated with cisplatin-based chemoradiation for head and neck cancer. Further investigation evaluating the contribution of cisplatin-based chemoradiation to sensorineural hearing loss and the subsequent effect on patient quality of life is warranted.

Relevance: 10.00%

Publisher:

Abstract:

The concept of media influence has a long history in media and communication studies, and has also had significant influence on public policy. This article revisits questions of media influence through three short case studies. First, it critically analyses the strongly partisan position of News Corporation's newspapers against the Labor government during the 2013 Australian Federal election, to consider whether the potential for media influence equated to the effective use of media power. Second, it discusses the assumption in broadcasting legislation, in both the United Kingdom and Australia, that terrestrial broadcasting should be subject to more content regulation than subscription services, and notes the new challenges arising from digital television and over-the-top video streaming services. Finally, it discusses the rise of multi-platform global content aggregators such as Google, Apple and Microsoft, and how this rise necessitates changes in ways of thinking about concentration of media ownership and the regulations that may follow from it.

Relevance: 10.00%

Publisher:

Abstract:

The upstream oil and gas industry has been contending with massive data sets and monolithic files for many years, but “Big Data” is a relatively new concept that has the potential to significantly re-shape the industry. Despite the impressive amount of value being realized by Big Data technologies in other parts of the marketplace, much of the data collected within the oil and gas sector tends to be discarded, ignored, or analyzed in a very cursory way. This viewpoint examines existing data management practices in the upstream oil and gas industry and compares them to practices and philosophies that have emerged in organizations that are leading the way in Big Data. The comparison shows that, in companies widely considered to be leaders in Big Data analytics, data is regarded as a valuable asset in itself. This is usually not true within the oil and gas industry, where data is frequently regarded as descriptive information about a physical asset rather than as something valuable in and of itself. The paper then discusses how the industry could potentially extract more value from its data, and concludes with a series of policy-related questions to this end.

Relevance: 10.00%

Publisher:

Abstract:

Objective To synthesise recent research on the use of machine learning approaches to mining textual injury surveillance data. Design Systematic review. Data sources The electronic databases searched included PubMed, Cinahl, Medline, Google Scholar, and Proquest. The bibliography of all relevant articles was examined and associated articles were identified using a snowballing technique. Selection criteria For inclusion, articles were required to meet the following criteria: (a) used a health-related database, (b) focused on injury-related cases, and (c) used machine learning approaches to analyse textual data. Methods The papers identified through the search were screened, resulting in 16 papers selected for review. Articles were reviewed to describe the databases and methodology used, the strengths and limitations of different techniques, and the quality assurance approaches used. Due to heterogeneity between studies, meta-analysis was not performed. Results Occupational injuries were the focus of half of the machine learning studies, and the most common methods described were Bayesian probability or Bayesian network-based methods used either to predict injury categories or to extract common injury scenarios. Models were evaluated through comparison with gold standard data, content expert evaluation, or statistical measures of quality. Machine learning was found to provide high precision and accuracy when predicting a small number of categories, and was valuable for visualisation of injury patterns and prediction of future outcomes. However, difficulties related to generalizability, source data quality, complexity of models, and integration of content and technical knowledge were discussed. Conclusions The use of narrative text for injury surveillance has grown in popularity, complexity and quality over recent years. With advances in data mining techniques, increased capacity for analysis of large databases, involvement of computer scientists in the injury prevention field, and more comprehensive use and description of quality assurance methods in text mining approaches, it is likely that we will see continued growth and advancement in knowledge of text mining in the injury field.
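
As a concrete illustration of the kind of method these studies describe, the sketch below trains a small Naive Bayes classifier (one of the Bayesian probability approaches mentioned) on free-text injury narratives and predicts an injury category. The narratives, categories and library choice (scikit-learn) are illustrative assumptions, not taken from any of the reviewed papers.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Toy injury narratives and categories; real surveillance datasets are far larger.
    narratives = [
        "fell from ladder while painting ceiling",
        "cut finger on box cutter at work",
        "slipped on wet floor in kitchen",
        "laceration to hand from knife while cooking",
    ]
    categories = ["fall", "cut", "fall", "cut"]

    # Bag-of-words features feeding a multinomial Naive Bayes model.
    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(narratives, categories)

    print(model.predict(["worker fell off scaffolding"]))  # expected: ['fall']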

Relevance: 10.00%

Publisher:

Abstract:

The ways in which technology mediates daily activities are shifting rapidly. Global trends point toward the uptake of ambient and interactive media to create radical new ways of working, interacting and socialising. Tech giants such as Google and Apple are banking on the success of this emerging market by investing in new, future-focused consumer products such as Google Glass and the Apple Watch. The potential implications of ubiquitous technological interactions via tangible and ambient media have never been more real or more accessible.

Relevance: 10.00%

Publisher:

Abstract:

Many organizations realize that increasing amounts of data (“Big Data”) need to be dealt with intelligently in order to compete with other organizations in terms of efficiency, speed and services. The goal is not to collect as much data as possible, but to turn event data into valuable insights that can be used to improve business processes. However, data-oriented analysis approaches fail to relate event data to process models. At the same time, large organizations are generating piles of process models that are disconnected from the real processes and information systems. In this chapter we propose to manage large collections of process models and event data in an integrated manner. Observed and modeled behavior need to be continuously compared and aligned. This results in a “liquid” business process model collection, i.e. a collection of process models that is in sync with the actual organizational behavior. The collection should self-adapt to evolving organizational behavior and incorporate relevant execution data (e.g. process performance and resource utilization) extracted from the logs, thereby allowing insightful reports to be produced from factual organizational data.
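
One way to picture the continuous comparison of observed and modeled behaviour described here is a simple conformance check: represent the model as the set of allowed directly-follows relations between activities and flag any event-log trace that uses a relation the model does not permit. The sketch below is a minimal, assumed illustration of that idea, not the chapter's actual alignment technique.

    # Model as a set of allowed directly-follows pairs (hypothetical activities).
    model_edges = {("register", "check"), ("check", "approve"), ("check", "reject")}

    # Observed behaviour: traces extracted from an event log.
    event_log = [
        ["register", "check", "approve"],
        ["register", "check", "reject"],
        ["register", "approve"],            # deviates: skips the check step
    ]

    def deviations(trace, edges):
        # Return the directly-follows pairs in a trace that the model does not allow.
        pairs = list(zip(trace, trace[1:]))
        return [p for p in pairs if p not in edges]

    for trace in event_log:
        found = deviations(trace, model_edges)
        print(trace, "->", "conforms" if not found else f"deviates: {found}")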

Relevance: 10.00%

Publisher:

Abstract:

Traditional text classification technology based on machine learning and data mining techniques has made significant progress. However, drawing an exact decision boundary between relevant and irrelevant objects in binary classification remains difficult because of the uncertainty produced by traditional algorithms. The proposed model CTTC (Centroid Training for Text Classification) aims to build an uncertainty boundary that absorbs as many indeterminate objects as possible, so as to raise the certainty of the relevant and irrelevant groups through a centroid clustering and training process. The clustering starts from the two training subsets labelled as relevant or irrelevant respectively, creating two principal centroid vectors by which all the training samples are further separated into three groups: POS, NEG and BND, with all the indeterminate objects absorbed into the uncertain decision boundary BND. Two pairs of centroid vectors are then trained and optimized through a subsequent iterative multi-learning process, and together they are used to predict the polarities of incoming objects. For the assessment of the proposed model, F1 and Accuracy have been chosen as the key evaluation measures. We stress the F1 measure because it displays the overall performance improvement of the final classifier better than Accuracy. A large number of experiments have been completed using the proposed model on the Reuters Corpus Volume 1 (RCV1), an important standard dataset in the field. The experimental results show that the proposed model significantly improves binary text classification performance in both F1 and Accuracy compared with three other influential baseline models.
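
To make the three-way split concrete, the sketch below computes a relevant and an irrelevant centroid from toy term-frequency vectors and assigns a new document to POS, NEG or BND depending on whether its cosine similarities to the two centroids differ by more than a margin. The vectors, margin value and single-pass assignment are illustrative assumptions; CTTC's iterative multi-learning step is not reproduced here.

    import numpy as np

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def three_way_label(x, c_pos, c_neg, margin=0.05):
        # POS/NEG when one centroid is clearly closer; BND when the two
        # similarities are within `margin` of each other (the uncertainty boundary).
        s_pos, s_neg = cosine(x, c_pos), cosine(x, c_neg)
        if s_pos - s_neg > margin:
            return "POS"
        if s_neg - s_pos > margin:
            return "NEG"
        return "BND"

    # Toy term-frequency vectors for relevant / irrelevant training documents.
    relevant = np.array([[3, 0, 1], [2, 1, 0]], dtype=float)
    irrelevant = np.array([[0, 2, 3], [1, 3, 2]], dtype=float)
    c_pos, c_neg = relevant.mean(axis=0), irrelevant.mean(axis=0)

    print(three_way_label(np.array([2.0, 1.0, 1.0]), c_pos, c_neg))  # POS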

Relevance: 10.00%

Publisher:

Abstract:

The present invention relates generally to methods for diagnosing and treating infectious diseases and other conditions related thereto. More particularly, the present invention relates to methods for determining the presence of organisms of the Chlamydiaceae family in a subject, including species of Chlamydia, and to methods for determining the stage of an infection caused by such organisms. The present invention also relates to kits for use with the diagnostic methods. The methods and kits of the present invention are particularly useful in relation to human and non-human, i.e. veterinary subjects. The present invention further relates to methods for identifying proteins or nucleic acid sequences associated with chlamydial infection in a subject. Such proteins or nucleic acid sequences are not only useful in relation to the diagnostic methods of the invention but are also useful in the development of methods and agents for preventing and/or treating chlamydial infection in a subject, such as but not limited to, immunotherapeutic methods and agents.

Relevance: 10.00%

Publisher:

Abstract:

Clinical Data Warehousing: A Business Analytic approach for managing health data

Relevance: 10.00%

Publisher:

Abstract:

Although the external influence of scholars has usually been approximated by publication and citation count, the array of scholarly activities is far more extensive. Today, new technologies, in particular Internet search engines, allow more accurate measurement of scholars' influence on societal discourse. Hence, in this article, we analyse the relation between the internal and external influence of 723 top economists using the number of pages indexed by Google and Bing as a measure of external influence. We not only identify a small association between these scholars’ internal and external influence but also a correlation between internal influence, as captured by receipt of such major academic awards as the Nobel Prize and John Bates Clark Medal, and the external prominence of the top 100 researchers (JEL Code: A11, A13, Z18).
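
A simple way to quantify the kind of association reported here between internal influence (e.g. citation counts) and external influence (pages indexed by Google or Bing) is a rank correlation. The sketch below uses invented numbers for five scholars, not the study's data, purely to show the computation.

    from scipy.stats import spearmanr

    # Hypothetical scholars: citation counts vs. web pages indexed for each.
    citations = [12000, 8500, 6400, 3000, 1500]     # internal influence proxy
    web_pages = [90000, 30000, 45000, 12000, 8000]  # external influence proxy

    rho, p_value = spearmanr(citations, web_pages)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")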

Relevance: 10.00%

Publisher:

Abstract:

Despite much scholarly fascination with the question of whether great minds appear in cycles, together with some empirical evidence that historical cycles exist, prior studies mostly disregard the “great minds” hypothesis as it relates to scientists. Rather, researchers assume a linear relation based on the argument that science is allied with the development of technology. To probe this issue further, this study uses a ranking of over 5600 scientists based on the number of appearances in Google Books over a period of 200 years (1800–2000). The results point to several peak periods, particularly for scientists born in the 1850–1859, 1897–1906, or 1900–1909 periods, suggesting overall cycles of around 8 years and a positive trend in distinction that lasts around 100 years. Nevertheless, a non-parametric test of whether randomness can be rejected indicates that non-randomness is less apparent, although once we analyse the greatest minds overall, rejection is more likely.
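
The abstract does not name its non-parametric test, but one common choice for asking whether a sequence of counts departs from randomness is a runs test. The sketch below, with invented per-decade counts, is only an illustration of that general idea, not the study's procedure.

    import math

    def runs_test(series):
        # Wald-Wolfowitz runs test on values coded as above/below the median.
        median = sorted(series)[len(series) // 2]
        signs = [1 if x > median else 0 for x in series if x != median]
        n1, n2 = sum(signs), len(signs) - sum(signs)
        runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
        mu = 2 * n1 * n2 / (n1 + n2) + 1
        var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)) / ((n1 + n2) ** 2 * (n1 + n2 - 1))
        z = (runs - mu) / math.sqrt(var)
        return runs, z  # |z| > 1.96 suggests rejecting randomness at the 5% level

    births_per_decade = [5, 9, 14, 8, 20, 25, 11, 30, 18, 22]  # invented counts
    print(runs_test(births_per_decade))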

Relevance: 10.00%

Publisher:

Abstract:

Research on attrition has focused on the economic significance of low graduation rates in terms of costs to students (fees that do not culminate in a credential) and impact on future income. For a student who fails a unit and repeats it multiple times, the financial impact is significant and lasting (Bexley, Daroesman, Arkoudis & James 2013). There are obvious advantages to the timely completion of a degree, both for the student and for the institution. Advantages to students include fee minimisation, enhanced engagement opportunities, an effectual pathway to employment, and benefits to sense of worth, morale and cohort identity. Work undertaken by the QUT Analytics Project in 2013 and 2014 explored student engagement patterns, drawing on a variety of data sources and, specifically, LMS use amongst students in 804 undergraduate units in one semester. Units with high failure rates were given further attention, and it was found that students who were repeating a unit were less likely to pass it than students attempting it for the first time. In this repeating cohort, academic and behavioural variables were consistently more significant in the modelling than any demographic variables, indicating that a student's performance at university is far more affected by what they do once they arrive than by where they come from. The aim of this poster session is to examine the findings and commonalities of a number of case studies that articulated the engagement activities of repeating students (including collated data from Individual Unit Reports, academic and peer advising programs, and engagement with virtual learning resources). Understanding the profile of the repeating student cohort is therefore as important as considering the characteristics of successful students, so that the institution might be better placed to target repeating students and make proactive interventions as early as possible.
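
The modelling summarised above compared behavioural and demographic predictors of unit success. A minimal sketch of that kind of comparison, using invented data rather than QUT Analytics Project data, might fit a logistic regression on LMS engagement counts alongside a demographic feature and inspect the coefficients.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Invented cohort: behavioural features (LMS logins, assessments submitted)
    # and one demographic feature (age); pass/fail driven mainly by behaviour.
    rng = np.random.default_rng(0)
    n = 200
    logins = rng.poisson(30, n)
    submissions = rng.integers(0, 6, n)
    age = rng.integers(18, 40, n)
    passed = (0.05 * logins + 0.8 * submissions + rng.normal(0, 1, n)) > 3

    X = np.column_stack([logins, submissions, age])
    model = LogisticRegression(max_iter=1000).fit(X, passed)

    # Behavioural coefficients should dominate the demographic one here.
    print(dict(zip(["logins", "submissions", "age"], model.coef_[0].round(2))))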

Relevance: 10.00%

Publisher:

Abstract:

Volcanic eruption centres of the mostly 4.5 Ma–5000 BP Newer Volcanics Province in the Hamilton area of southeastern Australia were examined in detail using a multifaceted approach, including ground truthing and analysis of ArcGIS Total Magnetic Intensity and seamless geology data, NASA Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) digital elevation models, and Google Earth satellite image interpretation. Sixteen eruption centres were recognised in the Hamilton area, including three previously unrecorded volcanoes, one of which, the Cas Maar, constitutes the northernmost maar-cone volcanic complex in the Western Plains subprovince. Seven previously allocated eruption centres were called into question based on field and laboratory observations. Three phases of volcanic activity have been suggested by other authors and are interpreted to correlate with ages of >4 Ma, ca 2 Ma and <0.5 Ma, which may be further subdivided based on preservation of outcrop. Geochemical compositions of the dominantly basaltic products become increasingly alkaline and enriched in incompatible elements from Phase 1 to Phase 2, with Phase 3 eruptions both covering the entire geochemical range and extending into increasingly enriched compositions. This research highlights the importance of a multifaceted approach to landform mapping and demonstrates that additional volcanic centres may yet be discovered in the Newer Volcanics Province.