846 results for resistant data


Relevance:

20.00%

Publisher:

Abstract:

A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment by the use of non-deterministic, readily available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as the Network Time Protocol (NTP)). As a result, the clocks of different robots can disagree by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also to timestamp events that occur on General Purpose Input/Output (GPIO) pins, referencing a submillisecond-accurate, wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised, distributed image acquisition system over a large geographic area; a real-world application of this functionality is the creation of a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A ‘best-practice’ design methodology is adopted within the project to ensure the final system operates according to its requirements. Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse System comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for wirelessly synchronising the clocks between the boards makes use of specific hardware and a firmware protocol based on elements of the IEEE 1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results.
Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement: the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of the Slaves from the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond. The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below the target of 1 ms.
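
The wireless synchronisation described above is based on elements of IEEE 1588 PTP. Purely as an illustration of the underlying arithmetic, here is a minimal sketch of the standard PTP two-way exchange used to estimate a slave clock's offset; the timestamp values are made up and this is not the BabelFuse firmware.

```python
# Minimal sketch of the IEEE 1588-style offset calculation referenced in the
# abstract. Timestamps t1..t4 are hypothetical values in seconds; the real
# BabelFuse hardware captures them with dedicated timestamping logic.

def ptp_offset_and_delay(t1, t2, t3, t4):
    """Estimate slave clock offset and one-way path delay.

    t1: master sends Sync            (master clock)
    t2: slave receives Sync          (slave clock)
    t3: slave sends Delay_Req        (slave clock)
    t4: master receives Delay_Req    (master clock)
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0    # mean one-way propagation delay
    return offset, delay

if __name__ == "__main__":
    # Example exchange: the slave clock runs 40 microseconds ahead of the
    # master, over a symmetric 1.5 ms radio path (illustrative numbers only).
    off, dly = ptp_offset_and_delay(t1=10.000000, t2=10.001540,
                                    t3=10.002000, t4=10.003460)
    print(f"estimated offset: {off * 1e6:.1f} us, delay: {dly * 1e6:.1f} us")
```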

Relevance:

20.00%

Publisher:

Abstract:

Mandatory data breach notification laws are a novel statutory solution in relation to organizational protections of personal information. They require organizations which have suffered a breach of security involving personal information to notify those persons whose information may have been affected. These laws originated in the state-based legislatures of the United States during the last decade and have subsequently garnered worldwide legislative interest. Despite their perceived utility, mandatory data breach notification laws have several conceptual and practical concerns that limit the scope of their applicability, particularly in relation to existing information privacy law regimes. We outline these concerns and, in doing so, contend that while mandatory data breach notification laws have many useful facets, their utility as an 'add-on' to remedy the failings of current information privacy law frameworks should not necessarily be taken for granted.

Relevance:

20.00%

Publisher:

Abstract:

This paper reports on a longitudinal study of data modelling across grades 1-3. The activity engaged children in designing, implementing, and analysing a survey about their new playground. Data modelling involves investigations of meaningful phenomena, deciding what is worthy of attention (identifying complex attributes), and then progressing to organising, structuring, visualising, and representing data. The core components of data modelling addressed here are children’s structuring and representing of data, with a focus on their display of metarepresentational competence (diSessa, 2004). Such competence includes students’ abilities to invent or design a variety of new representations, explain their creations, understand the role they play, and critique and compare the adequacy of representations. Reported here are the ways in which the children structured and represented their data, the metarepresentational competence displayed, and the links between their metarepresentational competence and conceptual competence.

Relevance:

20.00%

Publisher:

Abstract:

Background: Cancer outlier profile analysis (COPA) has proven to be an effective approach to analyzing cancer expression data, leading to the discovery of the TMPRSS2 and ETS family gene fusion events in prostate cancer. However, the original COPA algorithm did not identify down-regulated outliers, and the currently available R package implementing the method is similarly restricted to the analysis of over-expressed outliers. Here we present a modified outlier detection method, mCOPA, which contains refinements to the outlier-detection algorithm, identifies both over- and under-expressed outliers, is freely available, and can be applied to any expression dataset. Results: We compare our method to other feature-selection approaches and demonstrate that mCOPA frequently selects more informative features than differential expression or variance-based feature selection approaches do, and is able to recover observed clinical subtypes more consistently. We demonstrate the application of mCOPA to prostate cancer expression data, and explore the use of outliers in clustering, pathway analysis, and the identification of tumour suppressors. We analyse the under-expressed outliers to identify known and novel prostate cancer tumour suppressor genes, validating these against data in Oncomine and the Cancer Gene Index. We also demonstrate how a combination of outlier analysis and pathway analysis can identify molecular mechanisms disrupted in individual tumours. Conclusions: We demonstrate that mCOPA offers advantages, compared to differential expression or variance, in selecting outlier features, and that the features so selected are better able to assign samples to clinically annotated subtypes. Further, we show that the biology explored by outlier analysis differs from that uncovered in differential expression or variance analysis. mCOPA is an important new tool for the exploration of cancer datasets and the discovery of new cancer subtypes, and can be combined with pathway and functional analysis approaches to discover mechanisms underpinning heterogeneity in cancers.
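
As background on the outlier-detection idea, the sketch below shows a generic COPA-style transform: median-centre each gene, scale by the median absolute deviation, then score genes by an upper (over-expressed) or lower (under-expressed) percentile of the transformed values. This is an illustrative reconstruction of the general approach, not the authors' mCOPA implementation; the percentile cut-off and the toy data are assumptions.

```python
import numpy as np

def copa_scores(expr, percentile=90):
    """Return over- and under-expression outlier scores for each gene.

    expr: genes x samples array of expression values (assumed layout).
    """
    med = np.median(expr, axis=1, keepdims=True)
    mad = np.median(np.abs(expr - med), axis=1, keepdims=True)
    mad[mad == 0] = 1e-9                                # guard against zero spread
    z = (expr - med) / mad                              # robust, outlier-resistant scaling
    over = np.percentile(z, percentile, axis=1)         # high tail: over-expressed outliers
    under = np.percentile(z, 100 - percentile, axis=1)  # low tail: under-expressed outliers
    return over, under

# Toy example: 5 genes x 20 samples, with gene 0 over-expressed in 4 samples.
rng = np.random.default_rng(0)
expr = rng.normal(size=(5, 20))
expr[0, :4] += 6.0
over, under = copa_scores(expr)
print(over.round(2), under.round(2))
```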

Relevance:

20.00%

Publisher:

Abstract:

Therapeutic options for malignant pleural mesothelioma (MPM) are limited despite the increasing incidence globally. The vinca alkaloid vinorelbine exhibits clinical activity; however, to date, treatment optimization has not been achieved using biomarkers. BRCA1 regulates sensitivity to microtubule poisons; however, its role in regulating vinorelbine-induced apoptosis in mesothelioma is unknown. Here we demonstrate that BRCA1 plays an essential role in mediating vinorelbine-induced apoptosis, as evidenced by (1) the strong correlation between vinorelbine sensitivity and BRCA1 expression level; (2) induction of resistance to vinorelbine by silencing of BRCA1 using siRNA oligonucleotides; (3) dramatic down-regulation of BRCA1 following selection for vinorelbine resistance; and (4) the re-activation of vinorelbine-induced apoptosis following re-expression of BRCA1 in resistant cells. To determine whether loss of BRCA1 expression in mesothelioma was potentially relevant in vivo, BRCA1 immunohistochemistry was subsequently performed on 144 primary mesothelioma specimens. Loss of BRCA1 protein expression was identified in 38.9% of samples. Together, these data suggest that BRCA1 plays a critical role in mediating apoptosis induced by vinorelbine in mesothelioma, warranting its clinical evaluation as a predictive biomarker. Copyright © 2012 Pathological Society of Great Britain and Ireland. Published by John Wiley & Sons, Ltd.

Relevance:

20.00%

Publisher:

Abstract:

Introduction: Inherent and acquired cisplatin resistance reduces the effectiveness of this agent in the management of non-small cell lung cancer (NSCLC). Understanding the molecular mechanisms underlying this process may result in the development of novel agents to enhance sensitivity to cisplatin. Methods: An isogenic model of cisplatin resistance was generated in a panel of NSCLC cell lines (A549, SKMES-1, MOR, H460). Over a period of twelve months, cisplatin-resistant (CisR) cell lines were derived from original, age-matched parent cells (PT) and subsequently characterized. Proliferation (MTT) and clonogenic survival assays (crystal violet) were carried out between PT and CisR cells. Cellular response to cisplatin-induced apoptosis and cell cycle distribution were examined by FACS analysis. A panel of cancer stem cell and pluripotent markers was examined in addition to the EMT proteins, c-Met and β-catenin. Cisplatin-DNA adduct formation, DNA damage (γH2AX) and cellular platinum uptake (ICP-MS) were also assessed. Results: Characterisation studies demonstrated a decreased proliferative capacity of lung tumour cells in response to cisplatin, increased resistance to cisplatin-induced cell death, accumulation of resistant cells in the G0/G1 phase of the cell cycle and enhanced clonogenic survival ability. Moreover, resistant cells displayed a putative stem-like signature with increased expression of CD133+/CD44+ cells and increased ALDH activity relative to their corresponding parental cells. The stem cell markers, Nanog, Oct-4 and SOX-2, were significantly upregulated, as were the EMT markers, c-Met and β-catenin. Resistant sublines demonstrated decreased uptake of cisplatin in response to treatment, reduced cisplatin-GpG DNA adduct formation and significantly decreased γH2AX foci compared to parental cell lines. Conclusion: Our results identified cisplatin-resistant subpopulations of NSCLC cells with a putative stem-like signature, providing a further understanding of the cellular events associated with the cisplatin resistance phenotype in lung cancer. © 2013 Barr et al.

Relevance:

20.00%

Publisher:

Abstract:

The cancer stem-cell (CSC) hypothesis suggests that there is a small subset of cancer cells that are responsible for tumor initiation and growth, possessing properties such as indefinite self-renewal, slow replication, intrinsic resistance to chemotherapy and radiotherapy, and an ability to give rise to differentiated progeny. Through the use of xenotransplantation assays, putative CSCs have been identified in many cancers, often identified by markers usually expressed in normal stem cells. This is also the case in lung cancer, and the accumulated data on side population cells, CD133, CD166, CD44 and ALDH1 are beginning to clarify the true phenotype of the lung cancer stem cell. Furthermore, it is now clear that many of the pathways of normal stem cells, which guide cellular proliferation, differentiation, and apoptosis, are also prominent in CSCs; the Hedgehog (Hh), Notch, and Wnt signaling pathways being notable examples. The CSC hypothesis suggests that there is a small reservoir of cells within the tumor, which are resistant to many standard therapies, and can give rise to new tumors in the form of metastases or relapses after apparent tumor regression. Therapeutic interventions that target CSC pathways are still in their infancy and clinical data on their efficacy remain limited. However, Smoothened inhibitors, gamma-secretase inhibitors, anti-DLL4 antagonists, Wnt antagonists, and CBP/β-catenin inhibitors have all shown promising anticancer effects in early studies. The evidence to support the emerging picture of a lung cancer CSC phenotype and the development of novel therapeutic strategies to target CSCs are described in this review.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present WebPut, a prototype system that adopts a novel web-based approach to the data imputation problem. Towards this, WebPut utilizes the available information in an incomplete database in conjunction with the data consistency principle. Moreover, WebPut extends effective Information Extraction (IE) methods for the purpose of formulating web search queries that are capable of effectively retrieving missing values with high accuracy. WebPut employs a confidence-based scheme that efficiently leverages our suite of data imputation queries to automatically select the most effective imputation query for each missing value. A greedy iterative algorithm is also proposed to schedule the imputation order of the different missing values in a database, and in turn the issuing of their corresponding imputation queries, for improving the accuracy and efficiency of WebPut. Experiments based on several real-world data collections demonstrate that WebPut outperforms existing approaches.
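
To make the confidence-based, greedy scheduling idea concrete, here is a minimal sketch of one plausible realisation. The functions score_query and run_query are hypothetical stand-ins, not WebPut's actual components, and the re-scoring step is only an assumption about how earlier imputations could sharpen later queries.

```python
def greedy_impute(missing_cells, score_query, run_query):
    """Fill missing cells in order of decreasing imputation confidence.

    missing_cells: list of (row_id, attribute) pairs still missing
    score_query:   callable(cell, filled) -> (confidence, query) for a cell
    run_query:     callable(query) -> extracted value, or None on failure
    """
    filled = {}
    pending = set(missing_cells)
    while pending:
        # Re-score every pending cell: values imputed earlier may make some
        # queries more specific, raising their confidence.
        scored = {cell: score_query(cell, filled) for cell in pending}
        best = max(scored, key=lambda cell: scored[cell][0])
        confidence, query = scored[best]
        value = run_query(query)
        if value is not None:
            filled[best] = value
        pending.remove(best)
    return filled
```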

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an input-orientated data envelopment analysis (DEA) framework which allows the measurement and decomposition of economic, environmental and ecological efficiency levels in agricultural production across different countries. Economic, environmental and ecological optimisations search for optimal input combinations that minimise total costs, the total amount of nutrients, and the total amount of cumulative exergy contained in inputs, respectively. The application of the framework to an agricultural dataset of 30 OECD countries revealed that (i) there was significant scope to make their agricultural production systems more environmentally and ecologically sustainable; (ii) the improvement in environmental and ecological sustainability could be achieved by being more technically efficient and, even more significantly, by changing the input combinations; (iii) the rankings of sustainability varied significantly across OECD countries within the frontier-based environmental and ecological efficiency measures and between the frontier-based measures and indicators.
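
For readers unfamiliar with input-orientated DEA, the sketch below solves the standard constant-returns-to-scale envelopment linear programme with scipy. It only illustrates the kind of optimisation involved; it is not the authors' economic, environmental or ecological models, and the toy data are made up.

```python
import numpy as np
from scipy.optimize import linprog

def dea_input_efficiency(X, Y, o):
    """Input-orientated, CRS efficiency of DMU `o`.  X: inputs (m x n), Y: outputs (s x n)."""
    m, n = X.shape
    s, _ = Y.shape
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    # Input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]   # theta <= 1; theta = 1 means input-efficient

# Toy example: 2 inputs, 1 output, 4 hypothetical countries.
X = np.array([[2.0, 4.0, 6.0, 3.0],
              [3.0, 2.0, 6.0, 4.0]])
Y = np.array([[1.0, 1.0, 2.0, 1.0]])
print([f"{dea_input_efficiency(X, Y, o):.3f}" for o in range(X.shape[1])])
```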

Relevance:

20.00%

Publisher:

Abstract:

The ability to forecast machinery health is vital to reducing maintenance costs, operation downtime and safety hazards. Recent advances in condition monitoring technologies have given rise to a number of prognostic models which attempt to forecast machinery health based on condition data such as vibration measurements. This paper demonstrates how the population characteristics and condition monitoring data (both complete and suspended) of historical items can be integrated for training an intelligent agent to predict asset health multiple steps ahead. The model consists of a feed-forward neural network whose training targets are asset survival probabilities estimated using a variation of the Kaplan–Meier estimator and a degradation-based failure probability density function estimator. The trained network is capable of estimating the future survival probabilities when a series of asset condition readings is provided as input. The output survival probabilities collectively form an estimated survival curve. Pump data from a pulp and paper mill were used for model validation and comparison. The results indicate that the proposed model can predict more accurately, as well as further ahead, than similar models which neglect population characteristics and suspended data. This work presents a compelling concept for longer-range fault prognosis utilising available information more fully and accurately.
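
The survival targets mentioned above come from a Kaplan–Meier-style estimator that uses both failed and suspended (censored) histories. A minimal sketch of such an estimator is shown below; the pump lifetimes are illustrative values, and neither the degradation-based estimator nor the neural network from the paper is reproduced here.

```python
import numpy as np

def kaplan_meier(times, failed):
    """Kaplan-Meier survival estimate from failure and suspended histories.

    times:  event or suspension times for each historical item
    failed: True for a failure, False for a suspension (still running)
    Returns (distinct failure times, survival probabilities at those times).
    """
    times = np.asarray(times, dtype=float)
    failed = np.asarray(failed, dtype=bool)
    t_fail = np.unique(times[failed])
    surv, s = [], 1.0
    for t in t_fail:
        at_risk = np.sum(times >= t)        # suspended items still count as at risk
        d = np.sum((times == t) & failed)   # failures observed at time t
        s *= 1.0 - d / at_risk
        surv.append(s)
    return t_fail, np.array(surv)

# Example: pump histories in operating hours (illustrative values);
# False marks a suspended item whose monitoring ended before failure.
t, s = kaplan_meier([500, 800, 800, 1200, 1500, 1500],
                    [True, True, False, True, False, True])
for ti, si in zip(t, s):
    print(f"S({ti:.0f} h) = {si:.3f}")
```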

Relevance:

20.00%

Publisher:

Abstract:

The launch of the Centre of Research Excellence in Reducing Healthcare Associated Infection (CRE-RHAI) took place in Sydney on Friday 12 October 2012. The mission of the CRE-RHAI is to generate new knowledge about strategies to reduce healthcare associated infections and to provide data on the cost-effectiveness of infection control programs. As well as the launch itself, an important part of this event was a stakeholder Consultation Workshop, which brought together several experts from the Australian infection control community. The aims of this workshop were to establish the research and clinical priorities in Australian infection control, assess the importance of various multi-resistant organisms, and gather information about decision making in infection control. We present here a summary and discussion of the responses we received.

Relevance:

20.00%

Publisher:

Abstract:

Our paper approaches Twitter through the lens of “platform politics” (Gillespie, 2010), focusing in particular on controversies around user data access, ownership, and control. We characterise the different actors in the Twitter data ecosystem: private and institutional end users of Twitter, commercial data resellers such as Gnip and DataSift, data scientists, and finally Twitter, Inc. itself; and describe their conflicting interests. We furthermore study Twitter’s Terms of Service and application programming interface (API) as material instantiations of the regulatory instruments used by the platform provider, and argue for a stronger promotion of data rights and literacy to strengthen the position of end users.

Relevance:

20.00%

Publisher:

Abstract:

The deployment of new emerging technologies, such as cooperative systems, allows the traffic community to foresee relevant improvements in terms of traffic safety and efficiency. Vehicles are able to communicate information on the local traffic state in real time, which could result in an automatic, and therefore better, reaction to the mechanism of traffic jam formation. An upstream single-hop radio broadcast network can improve the perception of each cooperative driver within radio range and hence the traffic stability. The impact of a cooperative law on the appearance of traffic congestion is investigated, analytically and through simulation. NGSIM field data is used to calibrate the Optimal Velocity with Relative Velocity (OVRV) car-following model, and the MOBIL lane-changing model is implemented. Assuming that congestion can be triggered either by a perturbation in the instability domain or by a critical lane-changing behavior, the calibrated car-following behavior is used to assess the impact of a microscopic cooperative law on abnormal lane-changing behavior. The cooperative law helps reduce and delay traffic congestion as it increases traffic flow stability.
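
As background on the car-following rule named above, the sketch below implements a generic OVRV update, acceleration = k·(V(gap) − v) + λ·Δv. The shape of the optimal-velocity function and all parameter values are illustrative assumptions, not the NGSIM-calibrated values used in the study.

```python
import numpy as np

def optimal_velocity(gap, v_max=30.0, gap_half=25.0, form_width=10.0):
    """A smooth, saturating desired-speed curve as a function of spacing (m)."""
    return 0.5 * v_max * (1.0 + np.tanh((gap - gap_half) / form_width))

def ovrv_acceleration(gap, speed, rel_speed, k=0.6, lam=0.5):
    """OVRV rule: a = k * (V(gap) - v) + lam * (v_leader - v_follower)."""
    return k * (optimal_velocity(gap) - speed) + lam * rel_speed

# One Euler step for a follower 30 m behind a leader travelling 2 m/s faster.
dt, gap, v, dv = 0.1, 30.0, 20.0, 2.0
a = ovrv_acceleration(gap, v, dv)
v_next = v + a * dt
print(f"acceleration: {a:.2f} m/s^2, next speed: {v_next:.2f} m/s")
```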