435 results for analytics


Relevance: 10.00%

Abstract:

Background The past decade has seen a rapid change in the climate system, with an increased risk of extreme weather events. On and following the 3rd of January 2013, Tasmania experienced three catastrophic bushfires, which led to the evacuation of several communities, the loss of many properties, and a financial cost of approximately AUD 80 million. Objective To explore the impacts of the 2012/2013 Tasmanian bushfires on community pharmacies. Method Qualitative research methods were employed, using semi-structured telephone interviews with a purposive sample of seven Tasmanian pharmacists. The interviews were recorded and transcribed, and two different methods were used to analyse the text. The first method utilised Leximancer® text analytics software to provide a bird's-eye view of the conceptual structure of the text. The second method involved manual, open and axial coding, conducted independently by the two researchers for inter-rater reliability, to identify key themes in the discourse. Results Two main themes were identified - 'people' and 'supply' - from which six key concepts were derived. The six concepts were 'patients', 'pharmacists', 'local doctor', 'pharmacy operations', 'disaster management planning', and 'emergency supply regulation'. Conclusion This study identified challenges faced by community pharmacists during the Tasmanian bushfires. Interviewees highlighted the need for both the Tasmanian State Government and the Australian Federal Government to recognise the important primary care role that community pharmacists play during natural disasters and therefore to involve pharmacists in disaster management planning. They called for greater support and guidance for community pharmacists from regulatory and other government bodies during these events. Their comments highlighted the need for a review of Tasmania's emergency supply regulation, which allows pharmacists to provide a three-day supply of a patient's medication without a doctor's prescription in an emergency situation.

Relevance: 10.00%

Abstract:

In this paper, we present the results of an exploratory study that examined the problem of automating content analysis of student online discussion transcripts. We looked at the problem of coding discussion transcripts for the levels of cognitive presence, one of the three main constructs in the Community of Inquiry (CoI) model of distance education. Using Coh-Metrix and LIWC features, together with a set of custom features developed to capture discussion context, we developed a random forest classification system that achieved 70.3% classification accuracy and 0.63 Cohen's kappa, which is significantly higher than the values reported in previous studies. Besides the improvement in classification accuracy, the developed system is also less sensitive to overfitting, as it uses only 205 classification features, around 100 times fewer than similar systems based on bag-of-words features. We also provide an overview of the classification features most indicative of the different phases of cognitive presence, which gives additional insight into the nature of the cognitive presence learning cycle. Overall, our results show the great potential of the proposed approach, with the added benefit of providing further characterization of the cognitive presence coding scheme.
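
A minimal sketch of this kind of classification setup: a random forest over a ~205-dimensional feature matrix, evaluated with accuracy and Cohen's kappa. The synthetic data, dataset size, and label coding below are placeholders for illustration, not values or code from the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import cross_val_predict

# Placeholder data: in the study, X would hold ~205 Coh-Metrix/LIWC and contextual
# features per discussion message, and y the coded cognitive presence phase.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 205))
y = rng.integers(0, 5, size=1000)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
y_pred = cross_val_predict(clf, X, y, cv=10)   # 10-fold cross-validated predictions

print("accuracy:", accuracy_score(y, y_pred))
print("Cohen's kappa:", cohen_kappa_score(y, y_pred))
```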

Relevance: 10.00%

Abstract:

Background Project archives are becoming increasingly large and complex. On construction projects in particular, the increasing amount of information and the increasing complexity of its structure make searching and exploring information in the project archive challenging and time-consuming. Methods This research investigates a query-driven approach that represents new forms of contextual information to help users understand the set of documents resulting from queries of construction project archives. Specifically, this research extends query-driven interface research by representing three types of contextual information: (1) the temporal context is represented in the form of a timeline to show when each document was created; (2) the search-relevance context shows exactly which of the entered keywords matched each document; and (3) the usage context shows which project participants have accessed or modified a file. Results We implemented and tested these ideas within a prototype query-driven interface we call VisArchive. VisArchive employs a combination of multi-scale and multi-dimensional timelines, color-coded stacked bar charts, additional supporting visual cues and filters to support searching and exploring historical project archives. The timeline-based interface integrates three interactive timelines as focus + context visualizations. Conclusions The feasibility of these visual design principles is tested in two case studies: searching the construction project archive of an educational building project and tracking software defects in the Mozilla Thunderbird project. These case studies demonstrate the applicability, usefulness and generality of the design principles implemented.
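
As a rough illustration of the three kinds of query context described above (temporal, search-relevance, and usage), the sketch below shows one possible data structure and search routine; the field names and archive layout are assumptions made for this example, not VisArchive's implementation.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class QueryHit:
    path: str
    created: datetime         # temporal context: position on the timeline
    matched_keywords: set     # search-relevance context: which query terms hit
    accessed_by: dict         # usage context: participant -> access/edit count

def search(archive, keywords):
    """Return query hits annotated with the contextual information the interface renders."""
    hits = []
    for doc in archive:       # each doc assumed to carry "path", "created", "text", "usage"
        matched = {k for k in keywords if k.lower() in doc["text"].lower()}
        if matched:
            hits.append(QueryHit(doc["path"], doc["created"], matched, doc["usage"]))
    return sorted(hits, key=lambda h: h.created)   # timeline (temporal) order
```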

Relevance: 10.00%

Abstract:

This paper reports on the outcomes from a preliminary evaluation of technologies and processes intended to support the Assurance of Learning initiative in the business faculty of an Australian university. The study investigated how existing institutional information systems and operational processes could be used to support direct measures of student learning and the attainment of intended learning goals. The levels at which learning outcomes had been attained were extracted from the University Learning Management System (LMS), based on rubric data for three assessments in two units. Spreadsheets were used to link rubric criteria to the learning goals associated with the assessments, as identified in a previous curriculum mapping exercise, and to aggregate the outcomes. Recommendations arising from this preliminary study are made to inform a more comprehensive pilot based on this approach and to manage the quality of student learning experiences in the context of existing processes and reporting structures.
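
The linking-and-aggregation step described above can be sketched as follows; the column names, attainment threshold, and roll-up rule are assumptions chosen for illustration, not the university's actual spreadsheet design.

```python
import pandas as pd

rubric = pd.DataFrame({            # one row per student x rubric criterion (illustrative)
    "student_id": [1, 1, 2, 2],
    "criterion":  ["C1", "C2", "C1", "C2"],
    "score":      [0.80, 0.60, 0.90, 0.75],
})
mapping = pd.DataFrame({           # links from the earlier curriculum mapping exercise (illustrative)
    "criterion":     ["C1", "C2"],
    "learning_goal": ["LG1", "LG2"],
})

scores = rubric.merge(mapping, on="criterion")
attained = (scores.groupby(["learning_goal", "student_id"])["score"]
                  .mean()                          # a student's average per goal
                  .ge(0.7)                         # hypothetical attainment threshold
                  .groupby(level="learning_goal")
                  .mean())                         # proportion of students attaining each goal
print(attained)
```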

Relevance: 10.00%

Abstract:

Big Data and Learning Analytics' promise to revolutionise educational institutions, endeavours, and actions through more and better data is now compelling. Multiple, and continually updating, data sets produce a new sense of 'personalised learning'. A crucial attribute of the datafication, and subsequent profiling, of learner behaviour and engagement is the continual modification of the learning environment to induce greater levels of investment on the part of each learner. The assumption is that more and better data, gathered faster and fed into ever-updating algorithms, provide more complete tools to understand, and therefore improve, learning experiences through adaptive personalisation. The argument in this paper is that Learning Personalisation names a new logistics of investment as the common 'sense' of the school, in which disciplinary education is 'both disappearing and giving way to frightful continual training, to continual monitoring'.

Relevance: 10.00%

Abstract:

High-quality platelet analytics requires specialized knowledge and skills. In this work, platelet analytics was applied to analyze platelet activation and aggregation responses in a prospective controlled study of patients with the Finnish type of amyloidosis. The 20 patients with AGel amyloidosis displayed a delayed and more profound platelet shape change than healthy siblings and healthy volunteers, which may be related to altered fragmentation of mutated gelsolin during platelet activation. Such alterations in platelet shape change have not previously been reported in association with platelet disorders. In the rare Bernard-Soulier syndrome with the Asn45Ser mutation of glycoprotein (GP) IX, the diagnostic defect in the expression of the GPIb-IX-V complex was characterized in seven Finnish patients, an exceptionally large patient series by international standards. When measuring thrombopoietin in serial samples of amniotic fluid and cord blood of 15 pregnant women with confirmed or suspected fetal alloimmune thrombocytopenia, the lower limit of detection could be extended. The results confirmed that thrombopoietin is already present in amniotic fluid. The application of various non-invasive means for diagnosing thrombocytopenia (TP) revealed that techniques for estimating the proportion of young, i.e. large, platelets, such as direct measurement of reticulated platelets and the mean platelet size, would be useful for evaluating platelet kinetics in a given patient. Because thrombopoietin and the increase of young platelets in the circulation follow different kinetics, these measurements may have the most predictive value when obtained from simultaneous samples. Platelet autoantibodies were present not only in isolated autoimmune TP but also in patients without TP, in whom the disappearance of platelets might be compensated by increased production. The autoantibodies may also persist after TP has been cured. Simultaneous demonstration of increased young platelets (or increased mean platelet volume) in peripheral blood and the presence of platelet-associated IgG specificities to the major glycoproteins (GPIb-IX and GPIIb-IIIa) may be considered diagnostic for autoimmune TP. Measurement of a soluble marker, as a sign of thrombin activation and progressive deterioration of platelet components, was applied to analyze the alterations of platelet products under several stress factors (storage, transportation and lack of continuous shaking under controlled conditions). GPV measured as a soluble factor in the platelet storage medium showed good correlation with an array of other measurements commonly applied in the characterization of stored platelets. The benefits of measuring a soluble analyte in a quantitative assay were evident.

Relevance: 10.00%

Abstract:

The study focuses on the emergence of tuberculosis as a public health problem and the development of the various methods to counteract it in Finland before the introduction of efficient methods of treatment in the 1940s and 50s. It covers the period from 1882, when the tuberculosis bacterium was identified, to the 1930s, when the early formation of tuberculosis work became established in Finland. During this time, important changes occurred in medicine, public health thinking and methods of personal health care that have been referred to as the bacteriological revolution. The study places tuberculosis prevention in this context and shows how the tuberculosis problem affected the government of health on all three of these dimensions. The study is based on Foucauldian analytics of government, which is supplemented with perspectives from contemporary science and technology studies. In addition, it utilises a broad array of work in medical history. The central research materials consist of medical journals, official programs and documents on tuberculosis policy, and health education texts. The general conclusions of the study are twofold. Firstly, the ensemble of tuberculosis work was formed from historically diverse and often conflicting elements. The identification of the pathogen was only the first step in the establishment of tuberculosis as a major public health problem. Also important were the attention of the science of hygiene and the statistical reasoning that dominated public health thinking in the late 19th century. Furthermore, the adoption of the bacteriological tuberculosis doctrine in medicine, public health work and health education was profoundly influenced by previous understanding of the nature of the illness, of medical work, of the prevention of contagious diseases, and of personal health care. The two central institutions of tuberculosis work, the sanatorium and the dispensary, also have heterogeneous origins and multifarious functions. Secondly, bacteriology, represented in this study by tuberculosis, remodelled medical knowledge and practices, the targets and methods of public health policy, and the doctrine of personal health care. Tuberculosis provided a strong argument for specific causes (if not cures) as well as laboratory methods in medicine. Tuberculosis prevention contributed substantially to the development whereby a comprehensive responsibility for the health of the population and public health work was added to the agenda of the state. Health advice on tuberculosis and other contagious diseases used dangerous bacteria to motivate personal health care and redefined it as protecting oneself from the attacks of external pathogens and strengthening oneself against their effects. Thus, tuberculosis work is one important root of the contemporary public concern for the health of the population and the imperative of personal health care.

Relevance: 10.00%

Abstract:

The thesis investigates the local dimension of EU cohesion policy through an alternative approach centred on the analysis of discourse and structures of power. The concrete case under analysis is the Interreg IV programme "Alpenrhein-Bodensee-Hochrhein", which is conducted in the border region between Germany, Switzerland, Austria and the Principality of Liechtenstein. The main research questions are stated as follows: What governmental rationalities can be found at work in the field of EU cross-border cooperation programmes? How are directive action and cooperation envisioned? How coherent are the different rationalities found at work? The theoretical framework is based on a Foucaultian understanding of power and discourse and utilizes the notion of governmentalities as a way to de-stabilize the understanding of directive action and to highlight the dispersed and heterogeneous nature of governmental activity. The approach is situated within the general field of research on the European Union connected to basic conceptualisations such as the nature of power, the role of discourse and modes of subjectification. An approach termed "analytics of government", based on the work of researchers like Mitchell Dean, is introduced as the basic framework for the analysis. Four dimensions (visibilities, subjectivities, techniques/practices, problematisations) are presented as a set of tools with which governmental regimes of practices can be analysed. The empirical part of the thesis starts with a discussion of the general framework of the European Union's cohesion policy and places the Interreg IV Alpenrhein-Bodensee-Hochrhein programme in this general context. The main analysis is based on eleven interviews conducted with individuals participating in the programme at different levels. The selection of interview partners aimed at maximising heterogeneity by including individuals from all parts of the programme region who hold different functions within the programme. The analysis reveals interesting aspects pertaining to the implementation and routine aspects of work within initiatives conducted under the heading of EU cohesion policy. The central aspects of an Interreg IV Alpenrhein-Bodensee-Hochrhein governmentality are sketched out. These include a positive perception of the work atmosphere, an administrative/professional characterisation of the selves and a de-politicization of the programme. Characteristic are the interview partners' experience of tensions and their use of discursive strategies to resolve them. Negative perceptions play an important role in the specific governmental rationality. The thesis contributes to a better understanding of the local dimension of European Union cohesion policy and questions established ways of thinking about governmental activity. It provides an insight into the working of power mechanisms in the constitution of fields of discourse and points out matters of practical importance as well as subsequent research questions.

Relevance: 10.00%

Abstract:

One of the major tasks in swarm intelligence is to design decentralized but homogeneous strategies for controlling the behaviour of swarms of agents. It has been shown in the literature that the point of convergence and motion of a swarm of autonomous mobile agents can be controlled by using cyclic pursuit laws. In cyclic pursuit, there exists a predefined cyclic connection between agents and each agent pursues the next agent in the cycle. In this paper we generalize this idea to a case where an agent pursues a point which is the weighted average of the positions of the remaining agents. This point corresponds to a particular pursuit sequence. Using this concept of centroidal cyclic pursuit, the behaviour of the agents is analyzed and it is shown that, by suitably selecting the agents' gains, the rendezvous point of the agents can be controlled, directed linear motion of the agents can be achieved, and the trajectories of the agents can be changed by switching between pursuit sequences while keeping some of the behaviours of the agents invariant. Simulation experiments are given to support the analytical proofs.
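
A minimal simulation sketch of the centroidal cyclic pursuit idea, in which each agent moves toward a weighted average of the other agents' positions; the gains, weights, and time step below are illustrative choices, not the paper's specific control laws. With equal weights and positive gains the agents converge to their common centroid.

```python
import numpy as np

n, dim, dt, steps = 5, 2, 0.01, 5000
rng = np.random.default_rng(0)
pos = rng.uniform(-10, 10, size=(n, dim))    # initial agent positions
gains = np.ones(n)                           # per-agent gains; tuning them moves the rendezvous point

# Pursuit-sequence weights: w[i, j] is agent i's weight on agent j, zero on itself.
w = np.ones((n, n)) - np.eye(n)
w /= w.sum(axis=1, keepdims=True)

for _ in range(steps):
    targets = w @ pos                        # weighted centroid pursued by each agent
    pos += dt * gains[:, None] * (targets - pos)

print("approximate rendezvous point:", pos.mean(axis=0))
```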

Relevance: 10.00%

Abstract:

This paper describes a semi-automatic tool for annotation of multi-script text from natural scene images. To our knowledge, this is the first tool that deals with multi-script text or text of arbitrary orientation. The procedure involves manual seed selection followed by a region growing process to segment each word present in the image. The threshold for region growing can be varied by the user so as to ensure pixel-accurate character segmentation. The text present in the image is tagged word by word. A virtual keyboard interface has also been designed for entering the ground truth in ten Indic scripts, besides English. The keyboard interface can easily be generated for any script, thereby expanding the scope of the toolkit. Optionally, each segmented word can further be labeled into its constituent characters/symbols. Polygonal masks are used to split or merge the segmented words into valid characters/symbols. The ground truth is represented by a pixel-level segmented image and a '.txt' file that contains information about the number of words in the image, word bounding boxes, script and ground truth Unicode. The toolkit, developed using MATLAB, can be used to generate ground truth and annotation for any generic document image. Thus, it is useful for researchers in the document image processing community for evaluating the performance of document analysis and recognition techniques. The Multi-script Annotation Toolkit (MAST) is available for free download.
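
A rough Python sketch of the seed-plus-region-growing step described above (the toolkit itself is implemented in MATLAB); the 4-connectivity and grey-level threshold rule are assumptions made for illustration, and the threshold plays the role of the user-adjustable parameter.

```python
from collections import deque
import numpy as np

def grow_word_mask(gray, seed, threshold=20):
    """Grow a word mask from a user-selected seed pixel in a greyscale image.

    Neighbouring pixels are added while their grey value stays within `threshold`
    of the seed value; the user can re-run with a different threshold until the
    segmentation is pixel-accurate.
    """
    h, w = gray.shape
    seed_val = int(gray[seed])
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):  # 4-connected
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if abs(int(gray[nr, nc]) - seed_val) <= threshold:
                    mask[nr, nc] = True
                    queue.append((nr, nc))
    return mask
```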

Relevance: 10.00%

Abstract:

In this paper, we present a machine learning approach for subject-independent human action recognition using a depth camera, emphasizing the importance of depth in the recognition of actions. The proposed approach uses the flow information of all three dimensions to classify an action. In our approach, we obtain the 2-D optical flow and use it along with the depth image to obtain the depth flow (Z motion vectors). The obtained flow captures the dynamics of the actions in space-time. Feature vectors are obtained by averaging the 3-D motion over a grid laid over the silhouette in a hierarchical fashion. These hierarchical fine-to-coarse windows capture the motion dynamics of the object at various scales. The extracted features are used to train a Meta-cognitive Radial Basis Function Network (McRBFN) that uses a Projection Based Learning (PBL) algorithm, henceforth referred to as PBL-McRBFN. PBL-McRBFN begins with zero hidden neurons and builds the network based on the best human learning strategy, namely, self-regulated learning in a meta-cognitive environment. When a sample is used for learning, PBL-McRBFN uses the sample overlapping conditions and a projection based learning algorithm to estimate the parameters of the network. The performance of PBL-McRBFN is compared to that of Support Vector Machine (SVM) and Extreme Learning Machine (ELM) classifiers, with every person and action represented in the training and testing datasets. The performance study shows that PBL-McRBFN outperforms these classifiers in recognizing actions in 3-D. Further, a subject-independent study is conducted using a leave-one-subject-out strategy and its generalization performance is tested. It is observed from the subject-independent study that McRBFN is capable of generalizing actions accurately. The performance of the proposed approach is benchmarked on the Video Analytics Lab (VAL) dataset and the Berkeley Multimodal Human Action Database (MHAD).
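
A minimal sketch, assuming OpenCV, of how 2-D optical flow and a per-pixel depth difference can be combined into 3-D motion vectors and averaged over a grid, in the spirit of the feature extraction described above; the flow parameters and grid layout are illustrative, not the paper's exact pipeline.

```python
import cv2
import numpy as np

def depth_flow(gray_prev, gray_curr, depth_prev, depth_curr):
    """3-D motion field from consecutive grayscale (uint8) and depth frames of equal size."""
    # Dense 2-D optical flow (X, Y motion) between consecutive grayscale frames.
    flow_xy = cv2.calcOpticalFlowFarneback(gray_prev, gray_curr, None,
                                           0.5, 3, 15, 3, 5, 1.2, 0)
    # Z motion approximated by the per-pixel change in depth.
    flow_z = depth_curr.astype(np.float32) - depth_prev.astype(np.float32)
    return np.dstack([flow_xy, flow_z])          # shape (H, W, 3)

def grid_average(flow3d, rows, cols):
    """Average the 3-D motion over a rows x cols grid: one level of a fine-to-coarse hierarchy."""
    h, w, _ = flow3d.shape
    cells = []
    for r in np.array_split(np.arange(h), rows):
        for c in np.array_split(np.arange(w), cols):
            cells.append(flow3d[np.ix_(r, c)].reshape(-1, 3).mean(axis=0))
    return np.concatenate(cells)                 # feature vector for this grid resolution
```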

Relevance: 10.00%

Abstract:

In this paper we present a combination of technologies to provide an Energy-on-Demand (EoD) service that enables low-cost innovation suitable for microgrid networks. The system is designed around the low-cost and simple Rural Energy Device (RED) Box, which, in combination with a Short Message Service (SMS) communication methodology, serves as an elementary proxy for the smart meters typically used in urban settings. Further, customers' behavior and familiarity with such devices, drawn from their mobile phone experience, have been incorporated into the design philosophy. Customers are incentivized to interact with the system, thus providing valuable behavioral and usage data to the Utility Service Provider (USP). Data collected over time can be used by the USP for analytics, envisioned to run on remote computing services, i.e. cloud computing. Cloud computing allows computational resources to be shared at the virtual level across several networks. The customer-system interaction is facilitated by a third-party Telecom Service Provider (TSP). The cost of the RED Box is envisaged to be under USD 10 at production scale.
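
As a loose illustration only, the sketch below parses a hypothetical SMS reading from a RED Box into a structured record of the kind the USP's cloud analytics could ingest; the message format, field names, and class are entirely invented for this example and are not taken from the paper.

```python
import re
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class MeterReading:
    device_id: str
    kwh: float
    received: datetime

# Hypothetical message format, e.g. "RED AB12 KWH 3.5" sent by the RED Box.
SMS_PATTERN = re.compile(r"RED (?P<device_id>\w+) KWH (?P<kwh>\d+(?:\.\d+)?)")

def parse_sms(body: str, received: datetime) -> Optional[MeterReading]:
    """Turn an incoming SMS into a structured reading, or None if it is malformed."""
    m = SMS_PATTERN.match(body.strip())
    if m is None:
        return None
    return MeterReading(m["device_id"], float(m["kwh"]), received)

print(parse_sms("RED AB12 KWH 3.5", datetime.now()))
```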

Relevance: 10.00%

Abstract:

Early diagnosis of disease is important because therapeutic intervention is most successful before the disease has spread within the subject. The blood test could be the best health screening method: blood contains thousands of bio-molecules that arise as by-products from the diseased part of the organism, and it is a non-invasive approach. The major limitation of this approach is that the analytes need to be detected at very low concentrations. Raman spectroscopy has proven to be one of the cutting-edge techniques applied in the fields of histology, cytology and clinical chemistry. The primary obstacle for Raman spectroscopy is its low signal intensity. One of the promising approaches to overcome this is surface-enhanced Raman spectroscopy (SERS), which has opened novel opportunities for chemical and biomedical analytics. Albumin is one of the most abundant proteins in blood and is produced by the liver. The state of albumin in serum indicates the health of the liver and kidneys. Serum albumin helps to transport many small molecules, such as fatty acids, bilirubin, calcium and drugs, through the blood. In this study, SERS is used for the quantification of serum albumin and to understand its binding mechanism.

Relevance: 10.00%

Abstract:

In big data image/video analytics, we encounter the problem of learning an over-complete dictionary for sparse representation from a large training dataset, which cannot be processed at once because of storage and computational constraints. To tackle the problem of dictionary learning in such scenarios, we propose an algorithm that exploits the inherent clustered structure of the training data and makes use of a divide-and-conquer approach. The fundamental idea behind the algorithm is to partition the training dataset into smaller clusters and learn local dictionaries for each cluster. Subsequently, the local dictionaries are merged to form a global dictionary. Merging is done by solving another dictionary learning problem on the atoms of the locally trained dictionaries. This algorithm is referred to as the split-and-merge algorithm. We show that the proposed algorithm is efficient in its usage of memory and computational complexity, and performs on par with the standard learning strategy, which operates on the entire dataset at once. As an application, we consider the problem of image denoising. We present a comparative analysis of our algorithm with the standard learning techniques that use the entire database at once, in terms of training and denoising performance. We observe that the split-and-merge algorithm results in a remarkable reduction of training time without significantly affecting the denoising performance.
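
A minimal sketch of the split-and-merge idea under stated assumptions: scikit-learn's MiniBatchDictionaryLearning stands in for the paper's dictionary learning algorithm, k-means provides the clustering, and the cluster and atom counts are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import MiniBatchDictionaryLearning

def split_and_merge(X, n_clusters=4, local_atoms=64, global_atoms=128):
    """X: (n_patches, patch_dim) training patches; returns a (global_atoms, patch_dim) dictionary."""
    # Split: partition the training data into clusters (each assumed large enough to train on).
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)

    # Learn a local dictionary for each cluster.
    local_dicts = []
    for k in range(n_clusters):
        dl = MiniBatchDictionaryLearning(n_components=local_atoms, random_state=0)
        dl.fit(X[labels == k])
        local_dicts.append(dl.components_)

    # Merge: solve another dictionary learning problem on the pooled local atoms.
    atoms = np.vstack(local_dicts)
    merger = MiniBatchDictionaryLearning(n_components=global_atoms, random_state=0)
    merger.fit(atoms)
    return merger.components_
```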

Relevance: 10.00%

Abstract:

Weebly is a freely available tool for creating Web pages without having to know HTML. It is easy to use, with its drag-and-drop editor, and offers the ability to add documents, Web links, videos, slideshows, audio, forms, polls, etc. It is hosted by Weebly and has no limits on storage space. Many templates are available for Web page design. One can publish and update almost immediately. Combined with the freely available Google Analytics, for example, it makes it possible to gather usage statistics. The site can be password-protected if need be. Weebly for Education is a special version for teachers and schools.