902 results for data privacy


Relevance: 20.00%

Abstract:

Established Monte Carlo user codes BEAMnrc and DOSXYZnrc permit the accurate and straightforward simulation of radiotherapy experiments and treatments delivered from multiple beam angles. However, when an electronic portal imaging detector (EPID) is included in these simulations, treatment delivery from non-zero beam angles becomes problematic. This study introduces CTCombine, a purpose-built code for rotating selected CT data volumes, converting CT numbers to mass densities, combining the results with model EPIDs and writing output in a form which can easily be read and used by the dose calculation code DOSXYZnrc...
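As a rough illustration of the kind of preprocessing described above (not CTCombine itself), the sketch below rotates a CT volume to a chosen beam angle and maps CT numbers to mass densities with an assumed piecewise-linear calibration; the breakpoints and data shapes are illustrative only.

```python
# Minimal sketch (not CTCombine): rotate a CT volume for a non-zero beam angle
# and convert CT numbers (HU) to mass densities with an assumed calibration.
import numpy as np
from scipy.ndimage import rotate

def hu_to_density(hu):
    """Map Hounsfield units to mass density (g/cm^3) by linear interpolation."""
    hu_points = np.array([-1000.0, 0.0, 1000.0, 3000.0])   # assumed calibration points
    rho_points = np.array([0.001, 1.0, 1.6, 2.8])           # assumed densities
    return np.interp(hu, hu_points, rho_points)

def rotate_ct_volume(ct_hu, gantry_angle_deg):
    """Rotate the volume in the axial plane so the beam enters along a fixed axis."""
    return rotate(ct_hu, angle=gantry_angle_deg, axes=(1, 2),
                  reshape=False, order=1, mode="nearest")

ct_hu = np.random.randint(-1000, 2000, size=(60, 128, 128)).astype(float)
density = hu_to_density(rotate_ct_volume(ct_hu, gantry_angle_deg=45.0))
print(density.shape, float(density.min()), float(density.max()))
```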

Relevance: 20.00%

Abstract:

Big data is certainly the buzz term in executive networking circles at the moment. Heralded by management consultancies and research organisations alike as the next big thing in business efficiency, it is shooting up the Gartner hype cycle to the giddy heights of the peak of inflated expectations before it tumbles down into the trough of disillusionment.

Relevance: 20.00%

Abstract:

Purpose Accelerometers are recognized as a valid and objective tool to assess free-living physical activity. Despite the widespread use of accelerometers, there is no standardized way to process and summarize their data, which limits our ability to compare results across studies. This paper a) reviews decision rules researchers have used in the past, b) compares the impact of using different decision rules on a common data set, and c) identifies issues to consider for accelerometer data reduction. Methods The methods sections of studies published in 2003 and 2004 were reviewed to determine what decision rules previous researchers used to identify the wearing period, the minimal wear requirement for a valid day, spurious data, the number of days used to calculate the outcome variables, and bouts of moderate-to-vigorous physical activity (MVPA). For this study, four data reduction algorithms that employ different decision rules were used to analyze the same data set. Results The review showed that, among studies that reported their decision rules, much variability was observed. Overall, the analyses suggested that using different algorithms affected several important outcome variables. The most stringent algorithm yielded significantly lower wearing time, the lowest activity counts per minute and counts per day, and fewer minutes of MVPA per day. An exploratory sensitivity analysis revealed that the most stringent inclusion criterion had an impact on sample size and wearing time, which in turn affected many outcome variables. Conclusions These findings suggest that the decision rules employed to process accelerometer data have a significant impact on important outcome variables. Until guidelines are developed, it will remain difficult to compare findings across studies.
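To make the kinds of decision rules discussed above concrete, here is a minimal sketch using illustrative thresholds (60 consecutive minutes of zero counts as non-wear, 10 h of wear for a valid day, and a counts-per-minute cut-point for MVPA); these values are assumptions, not the specific rules reviewed in the paper.

```python
# Illustrative accelerometer decision rules; all thresholds are assumptions.
import numpy as np

NONWEAR_RUN_MIN = 60     # assumed: >= 60 min of consecutive zeros = non-wear
VALID_DAY_HOURS = 10     # assumed: >= 10 h of wear = valid day
MVPA_CUTPOINT = 1952     # assumed counts/min threshold for MVPA

def summarise_day(counts_per_min):
    counts = np.asarray(counts_per_min)
    wearing = np.ones(len(counts), dtype=bool)
    run = 0
    for i, c in enumerate(counts):
        run = run + 1 if c == 0 else 0
        if run >= NONWEAR_RUN_MIN:            # mark the whole zero run as non-wear
            wearing[i - run + 1:i + 1] = False
    wear_minutes = int(wearing.sum())
    return {
        "valid_day": wear_minutes >= VALID_DAY_HOURS * 60,
        "wear_minutes": wear_minutes,
        "mvpa_minutes": int((wearing & (counts >= MVPA_CUTPOINT)).sum()),
    }

day = np.random.randint(0, 3000, size=1440)   # one day of minute-level counts
print(summarise_day(day))
```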

Relevance: 20.00%

Abstract:

Accurate and detailed measurement of an individual's physical activity is a key requirement for helping researchers understand the relationship between physical activity and health. Accelerometers have become the method of choice for measuring physical activity due to their small size, low cost, convenience, and ability to provide objective information about physical activity. However, interpreting accelerometer data once it has been collected can be challenging. In this work, we applied machine learning algorithms to the task of physical activity recognition from triaxial accelerometer data. We employed a simple but effective approach of dividing the accelerometer data into short non-overlapping windows, converting each window into a feature vector, and treating each feature vector as an i.i.d. training instance for a supervised learning algorithm. In addition, we improved on this simple approach with a multi-scale ensemble method that did not need to commit to a single window size and was able to leverage the fact that physical activities produce time series with repetitive patterns and that discriminative features occur at different temporal scales.
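A minimal sketch of the windowed-feature approach described above: the triaxial signal is split into fixed non-overlapping windows, each window is summarised by simple statistics, and an off-the-shelf classifier is trained on the resulting feature vectors. The window size, feature set and classifier are assumptions for illustration and are not the multi-scale ensemble of the paper.

```python
# Windowed feature extraction + supervised classification (illustrative only).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(acc_xyz, labels, window=128):
    """acc_xyz: (n_samples, 3) triaxial signal; labels: per-sample activity label."""
    feats, ys = [], []
    for start in range(0, len(acc_xyz) - window + 1, window):
        w = acc_xyz[start:start + window]
        feats.append(np.concatenate([w.mean(0), w.std(0),
                                     np.abs(np.diff(w, axis=0)).mean(0)]))
        ys.append(np.bincount(labels[start:start + window]).argmax())  # majority label
    return np.array(feats), np.array(ys)

acc = np.random.randn(10_000, 3)                 # synthetic signal
lab = np.random.randint(0, 4, size=10_000)       # synthetic activity labels
X, y = window_features(acc, lab)
clf = RandomForestClassifier(n_estimators=100).fit(X, y)
print(clf.score(X, y))
```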

Relevance: 20.00%

Abstract:

Background Accelerometers have become one of the most common methods of measuring physical activity (PA). Thus, the validity of accelerometer data reduction approaches remains an important research area. Yet, few studies directly compare data reduction approaches and other PA measures in free-living samples. Objective To compare PA estimates provided by 3 accelerometer data reduction approaches, steps, and 2 self-reported estimates: Crouter's 2-regression model, Crouter's refined 2-regression model, the weighted cut-point method adopted in the National Health and Nutrition Examination Survey (NHANES; 2003-2004 and 2005-2006 cycles), steps, IPAQ, and 7-day PA recall. Methods A worksite sample (N = 87) completed online surveys and wore ActiGraph GT1M accelerometers and pedometers (SW-200) during waking hours for 7 consecutive days. Daily time spent in sedentary, light, moderate, and vigorous intensity activity and percentage of participants meeting PA recommendations were calculated and compared. Results Crouter's 2-regression (161.8 +/- 52.3 minutes/day) and refined 2-regression (137.6 +/- 40.3 minutes/day) models provided significantly higher estimates of moderate and vigorous PA and proportions of those meeting PA recommendations (91% and 92%, respectively) as compared with the NHANES weighted cut-point method (39.5 +/- 20.2 minutes/day, 18%). Differences between other measures were also significant. Conclusions When comparing 3 accelerometer cut-point methods, steps, and self-report measures, estimates of PA participation vary substantially.
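As an illustration of a simple cut-point classification of minute-level counts into intensity categories, the sketch below uses commonly cited thresholds; they are assumptions here, not the NHANES weighted cut-points or the Crouter regression models evaluated in the study.

```python
# Cut-point classification of minute-level accelerometer counts (illustrative).
import numpy as np

CUTS = {"sedentary": (0, 99), "light": (100, 2019),        # assumed thresholds
        "moderate": (2020, 5998), "vigorous": (5999, float("inf"))}

def minutes_by_intensity(counts_per_min):
    counts = np.asarray(counts_per_min)
    return {name: int(((counts >= lo) & (counts <= hi)).sum())
            for name, (lo, hi) in CUTS.items()}

print(minutes_by_intensity(np.random.randint(0, 7000, size=1440)))
```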

Relevance: 20.00%

Abstract:

Design process phases of development, evaluation and implementation were used to create a garment to simultaneously collect reliable data of speech production and intensity of movement of toddlers (18-36 months). A series of prototypes were developed and evaluated that housed accelerometer-based motion sensors and a digital transmitter with microphone. The approved test garment was a top constructed from loop-faced fabric with interior pockets to house devices. Extended side panels allowed for sizing. In total, 56 toddlers (28 male; 28 female; 16-36 months of age) participated in the study providing pilot and baseline data. The test garment was effective in collecting data as evaluated for accuracy and reliability using ANOVA for accelerometer data, transcription of video for type of movement, and number and length of utterances for speech production. The data collection garment has been implemented in various studies across disciplines.

Relevance: 20.00%

Abstract:

This paper first presents the benefits and critical challenges of using Bluetooth and Wi-Fi for crowd data collection and monitoring. The major challenges include antenna characteristics, the complexity of the environment, and scanning features. Wi-Fi and Bluetooth are compared in terms of architecture, discovery time, popularity of use and signal strength. The type of antenna used and the complexity of the environment, such as trees in outdoor spaces and partitions in indoor spaces, strongly affect the scanning range. These challenges are empirically evaluated through real experiments using Bluetooth and Wi-Fi scanners. Issues related to antenna characteristics are also highlighted by experimenting with different antenna types. Novel scanning approaches, including the Overlapped Zones and Single Point Multi-Range detection methods, will then be presented and verified by real-world tests. These techniques will be applied to location identification of the captured MAC IDs, which can extract more information about people-movement dynamics.
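A hedged sketch of the Overlapped Zones idea described above: a device detected by two scanners whose coverage areas overlap during the same interval can be localised to the overlap region. The scanner identifiers and data layout below are assumptions for illustration, not the paper's implementation.

```python
# Assign anonymised MAC detections to zones based on which scanners saw them.
from collections import defaultdict

detections = [  # (scanner_id, anonymised_mac) pairs from one scan interval (illustrative)
    ("A", "mac1"), ("B", "mac1"), ("A", "mac2"),
    ("C", "mac3"), ("B", "mac3"),
]

seen_by = defaultdict(set)
for scanner, mac in detections:
    seen_by[mac].add(scanner)

for mac, scanners in seen_by.items():
    if len(scanners) > 1:
        zone = "overlap of " + "+".join(sorted(scanners))   # overlapped-zone localisation
    else:
        zone = f"zone {next(iter(scanners))} only"
    print(mac, "->", zone)
```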

Relevance: 20.00%

Abstract:

This paper investigates the outsourcing of income tax return preparation by Australian accounting firms. It identifies the extent to which firms are currently outsourcing accounting services or considering doing so, with a focus on personal and business income tax return preparation. The motivations and barriers for outsourcing by Australian accounting firms are also considered. Privacy, security of client data, and the competence of the outsourcing provider's staff have been identified as risks associated with outsourcing. An expectation relating to confidentiality of client data is also examined. Statistical analysis of data collected from a random sample of Australian accounting firms using a survey questionnaire provided the empirical data for the paper. The results indicate that the majority of Australian accounting firms are either currently outsourcing or considering outsourcing accounting services, and that firms are outsourcing taxation preparation both onshore and offshore. The results also indicate that firms expect the volume of outsourced work to increase in the future. In contrast to the literature identifying labour arbitrage as the primary driver for organisations choosing to outsource, this study found that the main factors considered by accounting firms in the decision to outsource were to expedite delivery of services to clients and to enable the firm to focus on core competencies. Data from this study also support the literature indicating that not all tax practitioners are adhering to codes of conduct in relation to client confidentiality. Research identifying the extent to which accounting services are outsourced is limited; therefore, significant contributions to the academic literature and the accounting profession are provided by this study.

Relevance: 20.00%

Abstract:

Assurance of learning (AOL) is a quality enhancement and quality assurance process used in higher education. It involves a process of determining programme learning outcomes and standards, and systematically gathering evidence to measure students' performance on these. The systematic assessment of whole-of-programme outcomes provides a basis for curriculum development and management, continuous improvement, and accreditation. To better understand how AOL processes operate, a national study of university practices across one discipline area, business and management, was undertaken. To solicit data on AOL practice, interviews were undertaken with a sample of business school representatives (n = 25). Two key processes emerged: (1) mapping of graduate attributes and (2) collection of assurance data. External drivers such as professional accreditation and government legislation were the primary reasons for undertaking AOL, but intrinsic motivators related to continuous improvement were also evident. The facilitation of academic commitment was achieved through an embedded approach to AOL by the majority of universities in the study. A sustainable and inclusive process of AOL was seen to support wider stakeholder engagement in the development of higher education learning outcomes.

Relevance: 20.00%

Abstract:

Mortality following hip arthroplasty is affected by a large number of confounding variables, each of which must be considered to enable valid interpretation. Relevant variables available from the 2011 NJR data set were included in the Cox model. Mortality rates in hip arthroplasty patients were lower than in the age-matched population across all hip types. Age at surgery, ASA grade, diagnosis, gender, provider type, hip type and lead surgeon grade all had a significant effect on mortality. Schemper's statistic showed that only 18.98% of the variation in mortality was explained by the variables available in the NJR data set. It is inappropriate to use NJR data to study an outcome affected by a multitude of confounding variables when these cannot be adequately accounted for in the available data set.
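For readers unfamiliar with the modelling step, the sketch below fits a Cox proportional hazards model to synthetic patient-level data using the lifelines library; the column names and data are illustrative assumptions, not NJR fields or the study's covariate set.

```python
# Fit a Cox proportional hazards model on synthetic survival data (illustrative).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age_at_surgery": rng.normal(70, 8, n),      # assumed covariates
    "asa_grade": rng.integers(1, 4, n),
    "male": rng.integers(0, 2, n),
    "followup_years": rng.exponential(5, n),     # duration until death or censoring
    "died": rng.integers(0, 2, n),               # event indicator (1 = death observed)
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")
cph.print_summary()    # hazard ratios and confidence intervals per covariate
```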

Relevance: 20.00%

Abstract:

Most existing motorway traffic safety studies that use disaggregate traffic flow data aim to develop models for identifying real-time traffic risks by comparing pre-crash and non-crash conditions. A serious shortcoming of those studies is that non-crash conditions are arbitrarily selected and hence not representative: the selected non-crash data might not be properly comparable with the pre-crash data, and the non-crash/pre-crash ratio is arbitrarily chosen and neglects the abundance of non-crash over pre-crash conditions. Here, we present a methodology for developing a real-time MotorwaY Traffic Risk Identification Model (MyTRIM) using individual vehicle data, meteorological data, and crash data. Non-crash data are clustered into groups called traffic regimes. Pre-crash data are then classified into regimes to match them with the relevant non-crash data. Of the eight traffic regimes obtained, four highly risky regimes were identified, and three regime-based Risk Identification Models (RIM) with sufficient pre-crash data were developed. MyTRIM memorizes the latest risk evolution identified by RIM to predict near-future risks. Traffic practitioners can choose MyTRIM's memory size based on the trade-off between detection and false alarm rates: decreasing the memory size from 5 to 1 increases the detection rate from 65.0% to 100.0% and the false alarm rate from 0.21% to 3.68%. Moreover, critical factors that differentiate pre-crash and non-crash conditions are recognized and usable for developing preventive measures. MyTRIM can be used by practitioners in real time as an independent tool to make online decisions, or it can be integrated with existing traffic management systems.
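A minimal sketch of the regime-matching step described above: non-crash observations are clustered into traffic regimes, and each pre-crash observation is then assigned to its nearest regime so that like is compared with like. The feature set, the number of regimes and the clustering method are assumptions for illustration, not the paper's specification.

```python
# Cluster non-crash traffic data into regimes and assign pre-crash data to them.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
non_crash = rng.normal(size=(5000, 3))   # e.g. [speed, headway, flow] per record (synthetic)
pre_crash = rng.normal(size=(40, 3))     # synthetic pre-crash observations

scaler = StandardScaler().fit(non_crash)
kmeans = KMeans(n_clusters=8, n_init=10, random_state=1).fit(scaler.transform(non_crash))

pre_crash_regime = kmeans.predict(scaler.transform(pre_crash))
for regime in range(kmeans.n_clusters):
    print(f"regime {regime}: {int((pre_crash_regime == regime).sum())} pre-crash records")
```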

Relevance: 20.00%

Abstract:

One cannot help but be impressed by the inroads that digital oilfield technologies have made into the exploration and production (E&P) industry in the past decade. Today’s production systems can be monitored by “smart” sensors that allow engineers to observe almost any aspect of performance in real time. Our understanding of how reservoirs are behaving has improved considerably since the dawn of this revolution, and the industry has been able to move away from point answers to more holistic “big picture” integrated solutions. Indeed, the industry has already reaped the rewards of many of these kinds of investments. Many billions of dollars of value have been delivered by this heightened awareness of what is going on within our assets and the world around them (Van Den Berg et al. 2010).

Relevance: 20.00%

Abstract:

Traffic state estimation in an urban road network remains a challenge for traffic models, and the question of how such a network performs remains a difficult one for traffic operators to answer. Lack of detailed traffic information has long restricted research in this area. The introduction of Bluetooth into the automotive world presented an alternative that has now developed to a stage where large-scale test-beds are becoming available for traffic monitoring and model validation purposes. But how much confidence should we have in such data? This paper aims to give an overview of the usage of Bluetooth, primarily for the city-scale management of urban transport networks, and to encourage researchers and practitioners to take a more cautious look at what is currently understood as a mature technology for monitoring travellers in urban environments. We argue that the full value of this technology is yet to be realised, as the analytical accuracy issues peculiar to the data have yet to be adequately resolved.

Relevance: 20.00%

Abstract:

With the introduction of the Personally Controlled Electronic Health Record (PCEHR), the Australian public is being asked to accept greater responsibility for their healthcare by taking an active role in the management of personal health information. Although the system is well designed, constructed and intentioned, policy and privacy concerns have resulted in an eHealth model that may impact future health information sharing requirements. Hence, as a case study of a consumer eHealth initiative in the Australian context, eHealth-as-a-Service (eHaaS) serves as a disruptive step in the aggregation and transformation of health information for use as real-world knowledge. The strategic value of extending the community Health Record Bank (HRB) model lies in the ability to automatically draw on a multitude of relevant data repositories and sources to create a single source of truth, and to engage market forces to create financial sustainability. The opportunity to transform the beleaguered Australian PCEHR into a realisable and sustainable technology consumption model for patient safety is explored. Moreover, the current clerical focus of healthcare practitioners acting as de facto record keepers is renegotiated to establish a shared knowledge-creation landscape of action for safer patient interventions. Achieving this potential, however, requires a platform that will facilitate efficient and trusted unification of all health information available in real time across the continuum of care. eHaaS provides a sustainable environment and encouragement to realise this potential.