940 results for Data anonymization and sanitization


Relevance: 100.00%

Abstract:

In recent years, rapid advances in information technology have led to various data collection systems which are enriching the sources of empirical data for use in transport systems. Currently, traffic data are collected through various sensors including loop detectors, probe vehicles, cell-phones, Bluetooth, video cameras, remote sensing and public transport smart cards. It has been argued that combining the complementary information from multiple sources will generally result in better accuracy, increased robustness and reduced ambiguity. Despite the fact that there have been substantial advances in data assimilation techniques to reconstruct and predict the traffic state from multiple data sources, such methods are generally data-driven and do not fully utilize the power of traffic models. Furthermore, the existing methods are still limited to freeway networks and are not yet applicable in the urban context due to the enhanced complexity of the flow behavior. The main traffic phenomena on urban links are generally caused by the boundary conditions at intersections, un-signalized or signalized, at which the switching of the traffic lights and the turning maneuvers of the road users lead to shock-wave phenomena that propagate upstream of the intersections. This paper develops a new model-based methodology to build up a real-time traffic prediction model for arterial corridors using data from multiple sources, particularly from loop detectors and partial observations from Bluetooth and GPS devices.
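The upstream-propagating shock waves at signalized intersections described above can be illustrated with a minimal cell transmission model (CTM) sketch. The cell layout, fundamental-diagram parameters and signal handling below are illustrative assumptions, not the paper's actual model.

```python
# Minimal cell transmission model (CTM) sketch of a shock wave forming at a
# signalised stop line. All parameters are illustrative assumptions.

def ctm_step(density, v_free, w_back, q_max, rho_jam, signal_green):
    """Advance per-cell densities one time step (unit cell length/time step)."""
    n = len(density)
    demand = [min(v_free * rho, q_max) for rho in density]              # sending flow
    supply = [min(w_back * (rho_jam - rho), q_max) for rho in density]  # receiving flow
    flows = [0.0] * (n + 1)   # flows[i] enters cell i; flows[0] = 0 (closed inlet)
    for i in range(1, n):
        flows[i] = min(demand[i - 1], supply[i])
    flows[n] = demand[-1] if signal_green else 0.0  # red light blocks the outlet
    return [density[i] + flows[i] - flows[i + 1] for i in range(n)]

def simulate(steps, green):
    rho = [0.3] * 5  # uniform initial density on a 5-cell arterial link
    for _ in range(steps):
        rho = ctm_step(rho, v_free=1.0, w_back=0.5, q_max=0.5,
                       rho_jam=1.0, signal_green=green)
    return rho
```

Under a red signal the last cell's density grows step by step and congestion propagates upstream, which is exactly the boundary-condition-driven shock-wave behaviour a real-time arterial prediction model must capture.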

Relevance: 100.00%

Abstract:

While trends are cyclical, Indigenous perspectives offer continuity to life’s pathways. One current trend is the increasing culinary interest in Indigenous Australian foods, not just in restaurants but also in home kitchens. This is a recent trend despite Indigenous foods being nutritious and wholesome, and having sustained Indigenous peoples for thousands of years. Home Economics can support, foster and affirm Indigenous foods both within this current mainstream trend and, in the future, in life-sustaining ways. In order to do so, Home Economics needs to ensure it is prepared and skilled, with the appropriate knowledge of and regard for Indigenous ingredients, foods and foodways. This paper will focus on Torres Strait Islander foods from the Torres Strait and from mainland Australia. It will showcase Torres Strait foods in the past, present and future. Some of what is presented here is part of a research case study, which involves a literature review, data collection and photography. In documenting the history of Torres Strait Island food and foodways, the traditions and customs will be kept alive for future generations, beyond any trends or fashions.

Relevance: 100.00%

Abstract:

The Trans-Pacific Partnership is a sweeping trade agreement, spanning the Pacific Rim, and covering an array of topics, including intellectual property. There has been much analysis of the recently leaked intellectual property chapter of the Trans-Pacific Partnership by WikiLeaks. Julian Assange, WikiLeaks’ Editor-in-Chief, observed “The selective secrecy surrounding the TPP negotiations, which has let in a few cashed-up megacorps but excluded everyone else, reveals a telling fear of public scrutiny. By publishing this text we allow the public to engage in issues that will have such a fundamental impact on their lives.” Critical attention has focused upon the lack of transparency surrounding the agreement, copyright law and the digital economy; patent law, pharmaceutical drugs, and data protection; and the criminal procedures and penalties for trade secrets. The topic of trade mark law and related rights, such as internet domain names and geographical indications, deserves greater analysis.

Relevance: 100.00%

Abstract:

Papua New Guinea (PNG) is facing what must seem like an insurmountable challenge to deliver quality healthcare services for women living in both rural and urban areas. Global governing bodies and donor agencies, including the WHO and the UN, have indicated that PNG does not have an appropriate health information system. Although there are some systems in place, to date little research has been conducted on improving or resolving the data integrity and integration issues of the existing health information systems and on automating the capture of women's and newborns' information in PNG. This current research study concentrates on the adoption of eHealth as an innovative tool to strengthen the health information systems in PNG to meet WHO standards. The research targets maternal and child health, focussing on childbirth records as an exemplar...

Relevance: 100.00%

Abstract:

Big data analysis in the healthcare sector is still in its early stages compared with other business sectors, for numerous reasons. Key challenges include accommodating the volume, velocity and variety of healthcare data, and identifying platforms that can examine data from multiple sources, such as clinical records, genomic data, financial systems and administrative systems. The Electronic Health Record (EHR) is a key information resource for big data analysis and is also composed of varied co-created values. Successful integration and crossing of different subfields of healthcare data, such as biomedical informatics and health informatics, could lead to huge improvements for the end users of the healthcare system, i.e. the patients.

Relevance: 100.00%

Abstract:

Distributed systems are widely used for solving large-scale and data-intensive computing problems, including all-to-all comparison (ATAC) problems. However, when used for ATAC problems, existing computational frameworks such as Hadoop focus on load balancing for allocating comparison tasks, without careful consideration of data distribution and storage usage. While Hadoop-based solutions provide users with simplicity of implementation, their inherent MapReduce computing pattern does not match the ATAC pattern. This leads to load imbalances and poor data locality when Hadoop's data distribution strategy is used for ATAC problems. Here we present a data distribution strategy which considers data locality, load balancing and storage savings for ATAC computing problems in homogeneous distributed systems. A simulated annealing algorithm is developed for data distribution and task scheduling. Experimental results show a significant performance improvement for our approach over Hadoop-based solutions.
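The abstract does not give the paper's actual objective function, but a simulated annealing scheme of this general kind can be sketched as follows. The cost weights, the single-item-move neighbourhood and the cooling schedule are assumptions for illustration only.

```python
import math
import random

def cost(assign, k, alpha=1.0, beta=0.1):
    """Cost of an item-to-node assignment for all-to-all comparison (ATAC):
    load imbalance across nodes plus remote data fetches. The weights are
    illustrative, not the paper's actual objective."""
    n = len(assign)
    load = [0] * k
    remote = 0
    for i in range(n):
        for j in range(i + 1, n):
            node = assign[i]          # run pair (i, j) where item i lives
            load[node] += 1
            if assign[j] != node:     # item j must be fetched remotely
                remote += 1
    return alpha * (max(load) - min(load)) + beta * remote

def anneal(n_items, k_nodes, steps=2000, t0=5.0, cooling=0.995, seed=0):
    """Simulated annealing over item placements: move one item per step and
    accept worse placements with a temperature-dependent probability."""
    rng = random.Random(seed)
    assign = [rng.randrange(k_nodes) for _ in range(n_items)]
    cur = cost(assign, k_nodes)
    best, best_cost = list(assign), cur
    t = t0
    for _ in range(steps):
        i, new_node = rng.randrange(n_items), rng.randrange(k_nodes)
        old = assign[i]
        assign[i] = new_node
        c = cost(assign, k_nodes)
        if c <= cur or rng.random() < math.exp((cur - c) / t):
            cur = c
            if c < best_cost:
                best, best_cost = list(assign), c
        else:
            assign[i] = old           # reject the move
        t *= cooling
    return best, best_cost
```

Because both load balance and data locality appear in one cost function, the annealer trades them off jointly, which is the point of departure from Hadoop's load-balancing-only distribution.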

Relevance: 100.00%

Abstract:

The concept of big data has already outperformed traditional data management efforts in almost all industries. In other instances it has succeeded in obtaining promising results that provide value from the large-scale integration and analysis of heterogeneous data sources, for example genomic and proteomic information. Big data analytics has become increasingly important for describing the data sets and analytical techniques in software applications that are so large and complex, owing to its significant advantages including better business decisions, cost reduction and the delivery of new products and services [1]. In a similar context, the health community has experienced not only more complex and larger data content, but also information systems that contain a large number of data sources with interrelated and interconnected data attributes. These have resulted in challenging and highly dynamic environments, leading to the creation of big data with its innumerable complexities, for instance the sharing of information subject to the security requirements expected by stakeholders. When comparing big data analysis with other sectors, the health sector is still in its early stages. Key challenges include accommodating the volume, velocity and variety of healthcare data in the current deluge of exponential growth. Given the complexity of big data, it is understood that while data storage and accessibility are technically manageable, the implementation of Information Accountability measures for healthcare big data might be a practical solution in support of information security, privacy and traceability measures. Transparency is one important measure that can demonstrate integrity, which is a vital factor in the healthcare service. Clarity about performance expectations is another Information Accountability measure, necessary to avoid data ambiguity and controversy about interpretation and, finally, liability [2].
According to current studies [3], Electronic Health Records (EHRs) are key information resources for big data analysis and are also composed of varied co-created values. Common healthcare information originates from and is used by different actors and groups, which facilitates understanding of its relationship to other data sources. Consequently, healthcare services often serve as an integrated service bundle. Although a critical requirement in healthcare services and analytics, it is difficult to find a comprehensive set of guidelines for adopting EHRs to fulfil big data analysis requirements. As a remedy, this research work therefore focuses on a systematic approach containing comprehensive guidelines, with the accurate data that must be provided, to apply and evaluate big data analysis until the necessary decision-making requirements are fulfilled to improve the quality of healthcare services. Hence, we believe that this approach would subsequently improve quality of life.

Relevance: 100.00%

Abstract:

Background Recovery strategies are often used with the intention of preventing or minimising muscle soreness after exercise. Whole-body cryotherapy, which involves a single or repeated exposure(s) to extremely cold dry air (below -100 °C) in a specialised chamber or cabin for two to four minutes per exposure, is currently being advocated as an effective intervention to reduce muscle soreness after exercise. Objectives To assess the effects (benefits and harms) of whole-body cryotherapy (extreme cold air exposure) for preventing and treating muscle soreness after exercise in adults. Search methods We searched the Cochrane Bone, Joint and Muscle Trauma Group Specialised Register, the Cochrane Central Register of Controlled Trials, MEDLINE, EMBASE, CINAHL, the British Nursing Index and the Physiotherapy Evidence Database. We also searched the reference lists of articles, trial registers and conference proceedings, handsearched journals and contacted experts. The searches were run in August 2015. Selection criteria We aimed to include randomised and quasi-randomised trials that compared the use of whole-body cryotherapy (WBC) versus a passive or control intervention (rest, no treatment or placebo treatment) or active interventions including cold or contrast water immersion, active recovery and infrared therapy for preventing or treating muscle soreness after exercise in adults. We also aimed to include randomised trials that compared different durations or dosages of WBC. Our prespecified primary outcomes were muscle soreness, subjective recovery (e.g. tiredness, well-being) and adverse effects. Data collection and analysis Two review authors independently screened search results, selected studies, assessed risk of bias and extracted and cross-checked data. Where appropriate, we pooled results of comparable trials. The random-effects model was used for pooling where there was substantial heterogeneity. We assessed the quality of the evidence using GRADE.
Main results Four laboratory-based randomised controlled trials were included. These reported results for 64 physically active predominantly young adults (mean age 23 years). All but four participants were male. Two trials were parallel group trials (44 participants) and two were cross-over trials (20 participants). The trials were heterogeneous, including the type, temperature, duration and frequency of WBC, and the type of preceding exercise. None of the trials reported active surveillance of predefined adverse events. All four trials had design features that carried a high risk of bias, potentially limiting the reliability of their findings. The evidence for all outcomes was classified as ’very low’ quality based on the GRADE criteria. Two comparisons were tested: WBC versus control (rest or no WBC), tested in four studies; and WBC versus far-infrared therapy, tested in one study. No studies compared WBC with other active interventions, such as cold water immersion, or different types and applications of WBC. All four trials compared WBC with rest or no WBC. There was very low quality evidence for lower self-reported muscle soreness (pain at rest) scores after WBC at 1 hour (standardised mean difference (SMD) -0.77, 95% confidence interval (CI) -1.42 to -0.12; 20 participants, 2 cross-over trials); 24 hours (SMD -0.57, 95% CI -1.48 to 0.33) and 48 hours (SMD -0.58, 95% CI -1.37 to 0.21), both with 38 participants, 2 cross-over studies, 1 parallel group study; and 72 hours (SMD -0.65, 95% CI -2.54 to 1.24; 29 participants, 1 cross-over study, 1 parallel group study). Of note is that the 95% CIs also included either no between-group differences or a benefit in favour of the control group. One small cross-over trial (9 participants) found no difference in tiredness but better well-being after WBC at 24 hours post exercise. There was no report of adverse events.
One small cross-over trial involving nine well-trained runners provided very low quality evidence of lower levels of muscle soreness after WBC, when compared with infrared therapy, at 1 hour follow-up, but not at 24 or 48 hours. The same trial found no difference in well-being but less tiredness after WBC at 24 hours post exercise. There was no report of adverse events. Authors’ conclusions There is insufficient evidence to determine whether whole-body cryotherapy (WBC) reduces self-reported muscle soreness, or improves subjective recovery, after exercise compared with passive rest or no WBC in physically active young adult males. There is no evidence on the use of this intervention in females or elite athletes. The lack of evidence on adverse events is important given that the exposure to extreme temperature presents a potential hazard. Further high-quality, well-reported research in this area is required and must provide detailed reporting of adverse events.
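The pooled SMDs with random-effects weighting reported above can be sketched with a DerSimonian-Laird calculation like the following. The effect sizes and variances used below are illustrative values, not the review's data.

```python
import math

def pool_random_effects(smds, variances):
    """DerSimonian-Laird random-effects pooling of standardised mean
    differences; returns the pooled SMD and its 95% confidence interval."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * d for wi, d in zip(w, smds)) / sum(w)     # fixed-effect mean
    q = sum(wi * (d - fixed) ** 2 for wi, d in zip(w, smds))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(smds) - 1)) / c)                 # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * d for wi, d in zip(w_star, smds)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)
```

The pooled estimate always lies between the trial estimates, and its interval widens as the between-study variance tau² grows, which is why the heterogeneous trials above yield wide CIs that cross zero.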

Relevance: 100.00%

Abstract:

Big Datasets are endemic, but they are often notoriously difficult to analyse because of their size, heterogeneity, history and quality. The purpose of this paper is to open a discourse on the use of modern experimental design methods to analyse Big Data in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has wide generality and advantageous inferential and computational properties. In particular, the principled experimental design approach is shown to provide a flexible framework for analysis that, for certain classes of objectives and utility functions, delivers near equivalent answers compared with analyses of the full dataset under a controlled error rate. It can also provide a formalised method for iterative parameter estimation, model checking, identification of data gaps and evaluation of data quality. Finally, it has the potential to add value to other Big Data sampling algorithms, in particular divide-and-conquer strategies, by determining efficient sub-samples.
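One way to see how a designed sub-sample can stand in for a full dataset is a greedy D-optimal selection for a simple linear model. The greedy loop below is an illustrative simplification, not the experimental design algorithms the paper develops.

```python
def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def info_matrix(xs):
    """Information matrix X'X for the simple linear model y = b0 + b1 * x."""
    return [[len(xs), sum(xs)], [sum(xs), sum(x * x for x in xs)]]

def d_optimal_subsample(xs, k):
    """Greedily pick k points that maximise det(X'X), i.e. a D-optimal-style
    sub-sample. Real exchange algorithms refine this further."""
    chosen, remaining = [], list(xs)
    while len(chosen) < k:
        best_x = max(remaining, key=lambda x: det2(info_matrix(chosen + [x])))
        chosen.append(best_x)
        remaining.remove(best_x)
    return chosen
```

For a simple regression the selection concentrates on the extremes of the covariate range, which is where observations carry the most information about the slope; this is the sense in which a principled sub-sample can deliver near-equivalent answers to the full data.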

Relevance: 100.00%

Abstract:

The world has experienced a large increase in the amount of available data, which demands better and more specialised tools for data storage, retrieval and information privacy. Electronic Health Record (EHR) systems have recently emerged to fulfil this need in health systems. They play an important role in medicine by granting access to information that can be used in medical diagnosis. Traditional systems focus on the storage and retrieval of this information, usually leaving issues related to privacy in the background. Doctors and patients may have different objectives when using an EHR system: patients try to restrict sensitive information in their medical records to avoid its misuse, while doctors want to see as much information as possible to ensure a correct diagnosis. One solution to this dilemma is the Accountable e-Health model, an access protocol model based on the Information Accountability Protocol. In this model, patients are warned when doctors access their restricted data, while authenticated doctors retain non-restrictive access. In this work we use FluxMED, an EHR system, and augment it with aspects of the Information Accountability Protocol to address these issues. The implementation of the Information Accountability Framework (IAF) in FluxMED provides ways for both patients and physicians to have their privacy and access needs met. Issues related to storage and data security are handled by FluxMED, which contains mechanisms to ensure security and data integrity. The effort required to develop a platform for the management of medical information is mitigated by FluxMED's workflow-based architecture: the system is flexible enough to allow the type and amount of information to be altered without the need to change its source code.
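The accountability behaviour described, allowing authenticated doctors non-restrictive access while warning the patient whenever restricted data are read, can be sketched as follows. The class and method names are hypothetical, not FluxMED's actual API.

```python
class AccountableEHR:
    """Sketch of information accountability in an EHR: authenticated doctors
    may read restricted fields, but every such access notifies the patient.
    Class and method names are hypothetical, not FluxMED's API."""

    def __init__(self):
        self.records = {}        # patient -> {field: value}
        self.restricted = {}     # patient -> set of restricted field names
        self.notifications = []  # (patient, doctor, field) audit trail

    def add_record(self, patient, field, value, restricted=False):
        self.records.setdefault(patient, {})[field] = value
        if restricted:
            self.restricted.setdefault(patient, set()).add(field)

    def read(self, doctor, patient, field):
        value = self.records[patient][field]
        if field in self.restricted.get(patient, set()):
            # Non-restrictive access: the read succeeds, but the patient is warned.
            self.notifications.append((patient, doctor, field))
        return value
```

The key design point is that accountability replaces prevention: the restricted read is never blocked, so diagnosis is not impeded, but it always leaves a patient-visible trace.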

Relevance: 100.00%

Abstract:

BACKGROUND Bronchiectasis is a major contributor to chronic respiratory morbidity and mortality worldwide. Wheeze and other asthma-like symptoms and bronchial hyperreactivity may occur in people with bronchiectasis. Physicians often use asthma treatments in patients with bronchiectasis. OBJECTIVES To assess the effects of inhaled long-acting beta2-agonists (LABA) combined with inhaled corticosteroids (ICS) in children and adults with bronchiectasis during (1) acute exacerbations and (2) stable state. SEARCH METHODS The Cochrane Airways Group searched the Cochrane Airways Group Specialised Register of Trials, which includes records identified from the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE and other databases. The Cochrane Airways Group performed the latest searches in October 2013. SELECTION CRITERIA All randomised controlled trials (RCTs) of combined ICS and LABA compared with a control (placebo, no treatment, ICS as monotherapy) in children and adults with bronchiectasis not related to cystic fibrosis (CF). DATA COLLECTION AND ANALYSIS Two review authors extracted data independently using standard methodological procedures as expected by The Cochrane Collaboration. MAIN RESULTS We found no RCTs comparing ICS and LABA combination with either placebo or usual care. We included one RCT that compared combined ICS and LABA with high-dose ICS in 40 adults with non-CF bronchiectasis without co-existent asthma. All participants received three months of high-dose budesonide dipropionate treatment (1600 micrograms). After three months, participants were randomly assigned to receive either high-dose budesonide dipropionate (1600 micrograms per day) or a combination of budesonide with formoterol (640 micrograms of budesonide and 18 micrograms of formoterol) for three months. The study was not blinded. We assessed it to be an RCT with overall high risk of bias.
Data analysed in this review showed that those who received combined ICS-LABA (in stable state) had a significantly better transition dyspnoea index (mean difference (MD) 1.29, 95% confidence interval (CI) 0.40 to 2.18) and cough-free days (MD 12.30, 95% CI 2.38 to 22.2) compared with those receiving ICS after three months of treatment. No significant difference was noted between groups in quality of life (MD -4.57, 95% CI -12.38 to 3.24), number of hospitalisations (odds ratio (OR) 0.26, 95% CI 0.02 to 2.79) or lung function (forced expiratory volume in one second (FEV1) and forced vital capacity (FVC)). Investigators reported 37 adverse events in the ICS group versus 12 events in the ICS-LABA group but did not mention the number of individuals experiencing adverse events. Hence differences between groups were not included in the analyses. We assessed the overall evidence to be low quality. AUTHORS' CONCLUSIONS In adults with bronchiectasis without co-existent asthma, during stable state, a small single trial with a high risk of bias suggests that combined ICS-LABA may improve dyspnoea and increase cough-free days in comparison with high-dose ICS. No data are provided for or against the use of combined ICS-LABA in adults with bronchiectasis during an acute exacerbation, or in children with bronchiectasis in a stable or acute state. The absence of high quality evidence means that decisions to use or discontinue combined ICS-LABA in people with bronchiectasis may need to take account of the presence or absence of co-existing airway hyper-responsiveness and consideration of adverse events associated with combined ICS-LABA.

Relevance: 100.00%

Abstract:

Flos Chrysanthemum is a generic name for a particular group of edible plants, which also have medicinal properties. There are, in fact, twenty to thirty different cultivars, which are commonly used in beverages and for medicinal purposes. In this work, four Flos Chrysanthemum cultivars, Hangju, Taiju, Gongju and Boju, were collected, and chromatographic fingerprints were used to distinguish and assess these cultivars for quality control purposes. Chromatographic fingerprints contain chemical information but often also exhibit baseline drifts and peak shifts, which complicate data processing; adaptive iteratively reweighted penalized least squares and correlation optimized warping were therefore applied to correct the baselines and align the fingerprint peaks. The adjusted data were submitted to unsupervised and supervised pattern recognition methods. Principal component analysis was used to qualitatively differentiate the Flos Chrysanthemum cultivars. Partial least squares, continuum power regression and K-nearest neighbours were used to predict the unknown samples. Finally, the elliptic joint confidence region method was used to evaluate the prediction ability of these models. The partial least squares and continuum power regression methods were shown to best represent the experimental results.
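The unsupervised differentiation step can be sketched with an SVD-based principal component projection plus a minimal nearest-neighbour classifier. This is a sketch under assumed, mean-centred fingerprint matrices; the study's actual models (PLS, continuum power regression) are not reproduced here.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project mean-centred fingerprints onto their leading principal
    components (computed via SVD) for unsupervised differentiation."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:n_components].T

def knn_predict(train, labels, sample):
    """1-nearest-neighbour class prediction (a minimal stand-in for KNN)."""
    d = np.linalg.norm(train - sample, axis=1)
    return labels[int(np.argmin(d))]
```

On aligned fingerprints, samples of the same cultivar cluster together in the score plot, and an unknown sample is assigned the label of its closest neighbour.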

Relevance: 100.00%

Abstract:

Increasingly large-scale applications are generating an unprecedented amount of data. However, the growing gap between computation and I/O capacity on High End Computing (HEC) machines creates a severe bottleneck for data analysis. Instead of moving data from its source to the output storage, in-situ analytics processes output data while simulations are running. However, in-situ data analysis incurs much more computing resource contention with simulations, and such contention severely damages simulation performance. Since different data processing strategies have different impacts on performance and cost, there is a consequent need for flexibility in the location of data analytics. In this paper, we explore and analyze several potential data-analytics placement strategies along the I/O path. To find the best strategy for reducing data movement in a given situation, we propose a flexible data analytics (FlexAnalytics) framework. Based on this framework, a FlexAnalytics prototype system is developed for analytics placement. The FlexAnalytics system enhances the scalability and flexibility of the current I/O stack on HEC platforms and is useful for data pre-processing, runtime data analysis and visualization, as well as for large-scale data transfer. Two use cases, scientific data compression and remote visualization, have been applied in the study to verify the performance of FlexAnalytics. Experimental results demonstrate that the FlexAnalytics framework increases data transfer bandwidth and improves application end-to-end transfer performance.
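The placement trade-off can be illustrated with a toy cost model. The two placement options, the parameters and the cost formulas below are assumptions for illustration, not FlexAnalytics' actual model.

```python
def placement_cost(data_gb, reduction, bw_gbps, contention_s, place):
    """Rough end-to-end cost (seconds) of one analytics placement choice.
    'in_situ'   : analyse on the compute nodes (pays a contention penalty,
                  then ships only the reduced output);
    'in_transit': ship the raw data to staging nodes and analyse there.
    All parameters and formulas are illustrative assumptions."""
    if place == "in_situ":
        return contention_s + (data_gb * reduction) / bw_gbps
    if place == "in_transit":
        return data_gb / bw_gbps
    raise ValueError(place)

def best_placement(data_gb, reduction, bw_gbps, contention_s):
    """Pick the cheaper placement for the given workload."""
    return min(("in_situ", "in_transit"),
               key=lambda p: placement_cost(data_gb, reduction, bw_gbps,
                                            contention_s, p))
```

Even this toy model shows why no single placement wins: large outputs with strong reduction favour analysing in situ, while small outputs or heavy compute contention favour moving the data first.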

Relevance: 100.00%

Abstract:

Background Multilevel and spatial models are being increasingly used to obtain substantive information on area-level inequalities in cancer survival. Multilevel models assume independent geographical areas, whereas spatial models explicitly incorporate geographical correlation, often via a conditional autoregressive prior. However, the relative merits of these methods for large population-based studies have not been explored. Using a case-study approach, we report on the implications of using multilevel and spatial survival models to study geographical inequalities in all-cause survival. Methods Multilevel discrete-time and Bayesian spatial survival models were used to study geographical inequalities in all-cause survival for a population-based colorectal cancer cohort of 22,727 cases aged 20–84 years diagnosed during 1997–2007 from Queensland, Australia. Results Both approaches were viable on this large dataset, and produced similar estimates of the fixed effects. After adding area-level covariates, the between-area variability in survival using multilevel discrete-time models was no longer significant. Spatial inequalities in survival were also markedly reduced after adjusting for aggregated area-level covariates. Only the multilevel approach, however, provided an estimate of the contribution of geographical variation to the total variation in survival between individual patients. Conclusions With little difference observed between the two approaches in the estimation of fixed effects, multilevel models should be favored if there is a clear hierarchical data structure and measuring the independent impact of individual- and area-level effects on survival differences is of primary interest. Bayesian spatial analyses may be preferred if spatial correlation between areas is important and if the priority is to assess small-area variations in survival and map spatial patterns.
Both approaches can be readily fitted to geographically enabled survival data from international settings.
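The multilevel discrete-time approach relies on first expanding each case into person-period records before fitting a logistic hazard model with area effects. A minimal sketch of that expansion step, with hypothetical field names, is:

```python
def person_period(cases):
    """Expand (case_id, area, follow_up_years, died) tuples into the
    person-period rows a discrete-time survival model is fitted to:
    one row per subject per year at risk, with a binary event indicator
    that is 1 only in the year of death. Field names are hypothetical."""
    rows = []
    for case_id, area, years, died in cases:
        for t in range(1, years + 1):
            rows.append({"id": case_id, "area": area, "year": t,
                         "event": 1 if (died and t == years) else 0})
    return rows
```

A logistic regression with area-level random effects fitted to these rows estimates the discrete-time hazard; the Bayesian spatial alternative instead places a conditional autoregressive prior over the area terms.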

Relevance: 100.00%

Abstract:

Network topology and routing are two important factors in determining the communication costs of big data applications at large scale. For a given cluster, cloud or grid system, the network topology is fixed, and static or dynamic routing protocols are preinstalled to direct the network traffic; users cannot change them once the system is deployed. Hence, it is hard for application developers to identify the optimal network topology and routing algorithm for their applications with distinct communication patterns. In this study, we design a CCG virtual system (CCGVS), which first uses container-based virtualization to allow users to create a farm of lightweight virtual machines on a single host. It then uses software-defined networking (SDN) to control the network traffic among these virtual machines. Users can change the network topology and control the network traffic programmatically, thereby enabling application developers to evaluate their applications on the same system with different network topologies and routing algorithms. Preliminary experimental results with both synthetic big data programs and the NPB benchmarks show that CCGVS can represent application performance variations caused by network topology and routing algorithms.
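The effect of topology on communication cost can be sketched by routing a traffic matrix over different topologies using BFS hop counts as a stand-in for shortest-path routing. The ring and star topologies below are illustrative examples, not CCGVS's configuration.

```python
from collections import deque

def hops(adj, src):
    """BFS hop counts from src in an adjacency-list graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def comm_cost(adj, traffic):
    """Total communication cost: sum of volume * hop count over (src, dst, vol)."""
    return sum(vol * hops(adj, s)[d] for s, d, vol in traffic)

def ring(n):
    """Each node links to its two neighbours on a cycle."""
    return {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}

def star(n):
    """Node 0 acts as the hub; every other node links only to it."""
    return {0: list(range(1, n)), **{i: [0] for i in range(1, n)}}
```

For example, the traffic matrix [(1, 4, 10), (2, 5, 5)] costs 45 weighted hops on an 8-node ring but only 30 on an 8-node star, the kind of application-specific difference CCGVS lets developers measure before committing to a deployment.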