935 results for "Data access"


Relevance: 30.00%

Abstract:

AIMS: A registry mandated by the European Society of Cardiology collects data on trends in interventional cardiology within Europe. Special interest focuses on relative increases and ratios of new techniques and their distribution across Europe. We report the data through 2004 and give an overview of the development of coronary interventions since the first data collection in 1992. METHODS AND RESULTS: Questionnaires were distributed yearly to delegates of all national societies of cardiology represented in the European Society of Cardiology. The goal was to collect the case numbers of all local institutions and operators. The overall number of coronary angiographies increased from 684 000 in 1992 to 2 238 000 in 2004 (from 1250 to 3930 per million inhabitants). The respective numbers for percutaneous coronary interventions (PCIs) and coronary stenting procedures increased from 184 000 to 885 000 (from 335 to 1550 per million) and from 3000 to 770 000 (from 5 to 1350 per million), respectively. Germany was the most active country, with 712 000 angiographies (8600 per million), 249 000 angioplasties (3000 per million), and 200 000 stenting procedures (2400 per million) in 2004. The indication has shifted towards acute coronary syndromes, as demonstrated by rising rates of interventions for acute myocardial infarction over the last decade. The procedures are more readily performed and perceived as safer, as shown by the increasing rate of "ad hoc" PCIs and the decreasing need for emergency coronary artery bypass grafting (CABG). In 2004, the use of drug-eluting stents continued to rise; however, enormous variability is reported, with the highest rate in Switzerland (70%). If the rate of progression remains constant until 2010, the projected number of coronary angiographies will be over three million, and the number of PCIs about 1.5 million, with a stenting rate of almost 100%. CONCLUSION: Interventional cardiology in Europe is ever expanding. New coronary revascularization procedures, alternative or complementary to balloon angioplasty, have come and gone; only stenting has stood the test of time and matured into the default technique. Facilitated access to PCI and more complete, earlier detection of coronary artery disease promise continued growth of the procedure despite the uncontested success of prevention.

Relevance: 30.00%

Abstract:

The stashR package (a Set of Tools for Administering SHared Repositories) for R implements a simple key-value style database where character string keys are associated with data values. The key-value databases can be either stored locally on the user's computer or accessed remotely via the Internet. Methods specific to the stashR package allow users to share data repositories or access previously created remote data repositories. In particular, methods are available for the S4 classes localDB and remoteDB to insert, retrieve, or delete data from the database as well as to synchronize local copies of the data to the remote version of the database. Users efficiently access information from a remote database by retrieving only the data files indexed by user-specified keys and caching this data in a local copy of the remote database. The local and remote counterparts of the stashR package offer the potential to enhance reproducible research by allowing users of Sweave to cache their R computations for a research paper in a localDB database. This database can then be stored on the Internet as a remoteDB database. When readers of the research paper wish to reproduce the computations involved in creating a specific figure or calculating a specific numeric value, they can access the remoteDB database and obtain the R objects involved in the computation.
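
To illustrate the local/remote caching idea in general terms, here is a minimal Python analogue (deliberately not the stashR R API): a remote key-value store is mirrored by a local cache that fetches only the keys actually requested and serves repeated requests from disk. All class and key names are invented for this sketch.

```python
import os
import pickle


class RemoteStore:
    """Stand-in for a remote key-value repository (hypothetical)."""

    def __init__(self):
        self._data = {}

    def insert(self, key, value):
        self._data[key] = value

    def fetch(self, key):
        return self._data[key]


class CachedLocalStore:
    """Local mirror that fetches and caches remote values on demand."""

    def __init__(self, remote, cache_dir):
        self.remote = remote
        self.cache_dir = cache_dir
        os.makedirs(cache_dir, exist_ok=True)

    def _path(self, key):
        return os.path.join(self.cache_dir, f"{key}.pkl")

    def fetch(self, key):
        path = self._path(key)
        if os.path.exists(path):            # cache hit: no remote access needed
            with open(path, "rb") as fh:
                return pickle.load(fh)
        value = self.remote.fetch(key)      # cache miss: pull only this key
        with open(path, "wb") as fh:
            pickle.dump(value, fh)
        return value


if __name__ == "__main__":
    remote = RemoteStore()
    remote.insert("figure1_data", [1.2, 3.4, 5.6])   # e.g. cached computations for a paper
    local = CachedLocalStore(remote, "cache")
    print(local.fetch("figure1_data"))               # fetched remotely, then cached
    print(local.fetch("figure1_data"))               # served from the local cache
```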

Relevance: 30.00%

Abstract:

As the performance gap between microprocessors and memory continues to increase, main memory accesses result in long latencies that become a factor limiting system performance. Previous studies show that main memory access streams contain significant locality and that SDRAM devices provide parallelism through multiple banks and channels. This locality and parallelism have not been exploited thoroughly by conventional memory controllers. In this thesis, SDRAM address mapping techniques and memory access reordering mechanisms are studied and applied to memory controller design with the goal of reducing observed main memory access latency. The proposed bit-reversal address mapping attempts to distribute main memory accesses evenly in the SDRAM address space to enable bank parallelism. As memory accesses to distinct banks are interleaved, the access latencies are partially hidden and therefore reduced. By taking cache conflict misses into consideration, bit-reversal address mapping is able to direct potential row conflicts to different banks, further improving performance. The proposed burst scheduling is a novel access reordering mechanism that creates bursts by clustering accesses directed to the same rows of the same banks. Subject to a threshold, reads are allowed to preempt writes, and qualified writes are piggybacked at the end of the bursts. A sophisticated access scheduler selects accesses based on priorities and interleaves accesses to maximize SDRAM data bus utilization. Consequently, burst scheduling reduces the row conflict rate, increasing and exploiting the available row locality. Using revised SimpleScalar and M5 simulators, both techniques are evaluated and compared with existing academic and industrial solutions. With SPEC CPU2000 benchmarks, bit-reversal reduces execution time by 14% on average over traditional page interleaving address mapping. Burst scheduling also achieves a 15% reduction in execution time over conventional bank-in-order scheduling. Working constructively together, bit-reversal and burst scheduling achieve a 19% speedup across the simulated benchmarks.
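
To make the mapping idea concrete, the sketch below contrasts conventional page interleaving with a bit-reversed variant in plain Python; the field widths (10 column bits, 2 bank bits, 13 row bits) and the exact placement of the reversed field are illustrative assumptions, not the thesis's actual configuration.

```python
def reverse_bits(value, width):
    """Reverse the lowest `width` bits of `value`."""
    out = 0
    for _ in range(width):
        out = (out << 1) | (value & 1)
        value >>= 1
    return out


COLUMN_BITS = 10   # assumed field widths, for illustration only
BANK_BITS = 2
ROW_BITS = 13


def map_address(addr, bit_reversal=False):
    """Split a physical address into (row, bank, column).

    Conventional page interleaving: | row | bank | column |.
    The bit-reversal variant reverses the row+bank field before splitting it,
    so addresses that differ only in high-order bits (typical cache conflicts)
    are steered to different banks instead of colliding in one bank.
    """
    column = addr & ((1 << COLUMN_BITS) - 1)
    upper = addr >> COLUMN_BITS
    if bit_reversal:
        upper = reverse_bits(upper, BANK_BITS + ROW_BITS)
    bank = upper & ((1 << BANK_BITS) - 1)
    row = upper >> BANK_BITS
    return row, bank, column


if __name__ == "__main__":
    # Two addresses a large power-of-two stride apart: same bank (row conflict)
    # under page interleaving, different banks once the field is bit-reversed.
    stride = 1 << (COLUMN_BITS + BANK_BITS + ROW_BITS - 1)
    for a in (0x0, stride):
        print(hex(a), "conventional:", map_address(a),
              "bit-reversed:", map_address(a, bit_reversal=True))
```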

Relevance: 30.00%

Abstract:

Turrialba is one of the largest and most active stratovolcanoes in the Central Cordillera of Costa Rica and an excellent target for validating satellite data with ground-based measurements, owing to its high elevation, relative ease of access, and persistent elevated SO2 degassing. The Ozone Monitoring Instrument (OMI) aboard the Aura satellite makes daily global observations of atmospheric trace gases and is used in this investigation to obtain volcanic SO2 retrievals in the Turrialba volcanic plume. We present and evaluate the relative accuracy of two OMI SO2 data analysis procedures, the automatic Band Residual Index (BRI) technique and the manual Normalized Cloud-mass (NCM) method. We find a linear correlation and good quantitative agreement between SO2 burdens derived from the BRI and NCM techniques, with an improved correlation when wet-season data are excluded. We also present the first comparisons between volcanic SO2 emission rates obtained from ground-based mini-DOAS measurements at Turrialba and three new OMI SO2 data analysis techniques: the MODIS smoke estimation, OMI SO2 lifetime, and OMI SO2 transect techniques. A robust validation of OMI SO2 retrievals was made, with both qualitative and quantitative agreement under specific atmospheric conditions, demonstrating the utility of satellite measurements for estimating accurate SO2 emission rates and monitoring passively degassing volcanoes.
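
The BRI-versus-NCM comparison reduces to regressing one set of retrieved SO2 burdens against the other; the short sketch below shows such a comparison on made-up burden values (placeholders only, not the Turrialba retrievals).

```python
import numpy as np

# Hypothetical daily SO2 burdens (tonnes) retrieved by the two OMI procedures;
# placeholder values only, not the Turrialba measurements.
bri = np.array([210.0, 340.0, 180.0, 450.0, 290.0, 510.0, 230.0])
ncm = np.array([190.0, 360.0, 200.0, 430.0, 310.0, 540.0, 210.0])

# Pearson correlation quantifies the linear agreement between the retrievals.
r = np.corrcoef(bri, ncm)[0, 1]

# A least-squares fit ncm ~ slope * bri + intercept; a slope near 1 and a small
# intercept would indicate quantitative agreement, not just correlation.
slope, intercept = np.polyfit(bri, ncm, 1)

print(f"Pearson r = {r:.3f}, fit: NCM = {slope:.2f} * BRI + {intercept:.1f}")
```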

Relevance: 30.00%

Abstract:

BACKGROUND: We have shown that selective antegrade cerebral perfusion improves mid-term quality of life in patients undergoing surgical repair for acute type A aortic dissection and aortic aneurysms. The aim of this study was to assess the impact of continuous cerebral perfusion through the right subclavian artery on immediate outcome and quality of life. METHODS: Perioperative data of 567 consecutive patients who underwent surgery of the aortic arch using deep hypothermic circulatory arrest were analyzed. Patients were divided into three groups according to the management of cerebral protection: 387 patients (68.3%) had deep hypothermic circulatory arrest with pharmacologic protection with pentothal only, 91 (16.0%) had selective antegrade cerebral perfusion and pentothal, and 89 (15.7%) had continuous cerebral perfusion through the right subclavian artery and pentothal. All in-hospital data were assessed, and quality of life was analyzed prospectively 2.4 +/- 1.2 years after surgery with the Short Form-36 Health Survey Questionnaire. RESULTS: Major perioperative cerebrovascular injuries were observed in 1.1% of the patients with continuous cerebral perfusion through the right subclavian artery, compared with 9.8% with selective antegrade cerebral perfusion (p < 0.001) and 6.5% in the group with no antegrade cerebral perfusion (p = 0.007). Average quality of life after an arrest time between 30 and 50 minutes was significantly better with continuous cerebral perfusion through the right subclavian artery than with selective antegrade cerebral perfusion (90.2 +/- 12.1 versus 74.4 +/- 40.7; p = 0.015). CONCLUSIONS: Continuous cerebral perfusion through the right subclavian artery considerably improves perioperative brain protection during deep hypothermic circulatory arrest. Irreversible perioperative neurologic complications can be significantly reduced, and the duration of deep hypothermic circulatory arrest can be extended up to 50 minutes without impairment of quality of life.

Relevance: 30.00%

Abstract:

This report is a case study of how the Mwangalala community accesses water and how that access is maintained. The Mwangalala community is located at the northern tip of Karonga district in Malawi, Africa. The case study evaluates how close the community is to meeting Target 10 of the Millennium Development Goals, sustainable access to safe drinking water, and evaluates the current water system through Human Centered Design's criteria of desirability, feasibility, and viability. It also makes recommendations to improve water security in the Mwangalala community. Data were collected through two years of immersive observation, interviews with 30 families, and observation of two wells on three separate occasions. The 30 interviews provided a sample of over 10% of the community's population. Participants were initially self-selected and then invited to participate in the research: I walked along community pathways and accepted invitations to join casual conversations in family compounds, and after conversing I asked the family members whether they would be willing to participate in my research by talking with me about water. Data collected from the interviews and the observations of the two wells were compared and analyzed for common themes. Shallow wells or open wells were the primary water source for 93% of interview participants. Boreholes were also present in the community but produced unpalatable water due to high concentrations of dissolved iron and were not used as primary water sources. During observations, 75% of community members who used the shallow wells, which were primarily used for consumptive purposes such as cooking or drinking, were female. Boreholes were primarily used for non-consumptive purposes such as watering crops or bathing, and 77% of their users were male. Shallow wells could remain in disrepair for two months because the repairman was a volunteer who was not compensated for the skilled labor required to repair the wells. Community members thought the maintenance fee went towards his salary, so they did not compensate the repairman when he performed work. This miscommunication gave the repairman no incentive to make well repairs a priority and left community members frustrated with untimely repairs. Shallow wells with functional pumps failed to provide water when water table levels dropped during the dry season, forcing community members to seek secondary or tertiary water sources. Open wells, converted from shallow wells after community members did not pay for repairs to the pump, represented 44% of the wells originally installed with Mark V hand pumps. The wells whose pumps were not repaired were located in fields, with one beside a church; the functional wells were all located on school grounds or in family compounds, where responsibility for each well's maintenance is clearly defined. The Mwangalala community fails to meet the Millennium Development Goals because the wells used by the community do not provide sustainable access to safe drinking water. Open wells, used by half the participants in the study, lack a top covering to prevent contamination from debris and wildlife. Shallow-well repair times are unsustainable, taking longer than two weeks, primarily because the repair persons are expected to provide skilled labor without compensation. Water security in Mwangalala can be improved by shortening repair times for shallow wells and by making water from boreholes palatable. There is currently no incentive for a volunteer repair person to fix wells in a timely manner; repair times can be improved by reducing the number of wells each repair person is responsible for and compensating the person for the skilled labor provided. Water security would be further improved by removing iron particulates from borehole water, rendering it palatable. This is possible through point-of-use filtration using ceramic candles, which would make pumped water available year-round.

Relevance: 30.00%

Abstract:

We use electronic communication networks for more than simply traditional telecommunications: we access the news, buy goods online, file our taxes, contribute to public debate, and more. As a result, a wider array of privacy interests is implicated for users of electronic communications networks and services. This development calls into question the scope of electronic communications privacy rules. This paper analyses the scope of these rules, taking into account the rationale and the historic background of the European electronic communications privacy framework. We develop a framework for analysing the scope of electronic communications privacy rules using three approaches: (i) a service-centric approach, (ii) a data-centric approach, and (iii) a value-centric approach. We discuss the strengths and weaknesses of each approach. The current e-Privacy Directive contains a complex blend of the three approaches, which does not seem to be based on a thorough analysis of their strengths and weaknesses. The upcoming review of the directive announced by the European Commission provides an opportunity to improve the scoping of the rules.

Relevance: 30.00%

Abstract:

OBJECTIVES: Percutaneous closure of the transapical (TA) access site for large-calibre devices is an unsolved issue. We report the first experimental data on the TA PLUG device for true percutaneous closure following large apical access for transcatheter aortic valve implantation. METHODS: The TA PLUG, a self-sealing full-core closure device, was implanted in an acute animal study in six pigs (60.2 ± 0.7 kg). All the pigs received 100 IU/kg of heparin; the activated clotting time was then left to normalize spontaneously. After accessing the left ventricular apex with a 39 French introducer, the closure plug device was delivered into the apex with a 33 French over-the-wire system under fluoroscopic guidance. Time to full haemostasis and rate of bleeding were recorded. Self-anchoring properties were assessed by haemodynamic push stress under adrenalin challenge. An additional feasibility study was conducted in four pigs (58.4 ± 1.1 kg) with full surgical exposure of the apex, and assessed device anchoring by pull-force measurements in 0.5 newton (N) increments. All the animals were electively sacrificed. Post-mortem analysis of the heart was performed and the renal embolic index assessed. RESULTS: Of six apical closure devices, five were correctly inserted and fully deployed at the first attempt. One became blocked in the delivery system and was placed successfully at the second attempt. In all the animals, complete haemostasis was immediate and no leak was recorded during the 5-h observation period. Neither leak nor device dislodgement was observed under haemodynamic push stress with repeated left ventricular peak pressures of up to 220 mmHg. In the feasibility study assessing pull stress, device migration occurred at a force of 3.3 ± 0.5 N, corresponding to 247.5 mmHg. Post-mortem analyses confirmed full expansion of all devices at the intended target. No macroscopic damage was identified in the surrounding myocardium. The renal embolic index was zero. CONCLUSIONS: True percutaneous left ventricular apex closure following large access is feasible with the self-sealing TA PLUG. The device allows for immediate haemostasis and reliable anchoring in the acute animal setting. This is the first report of a true percutaneous closure of large-calibre transcatheter aortic valve implantation access.
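
The reported equivalence of 3.3 N with 247.5 mmHg is consistent with dividing the pull force by an effective apex area of about 1 cm²; the short check below makes that arithmetic explicit (the 1 cm² effective area is an inference, not a value stated in the abstract).

```python
# Convert the pull force at which the device migrated into an equivalent
# pressure, assuming an effective area of 1 cm^2 (an inferred assumption).
force_n = 3.3                    # newtons
area_m2 = 1.0e-4                 # 1 cm^2 expressed in m^2
pa_per_mmhg = 133.322            # pascals per mmHg

pressure_pa = force_n / area_m2            # 33 000 Pa
pressure_mmhg = pressure_pa / pa_per_mmhg  # ~247.5 mmHg, matching the reported value

print(f"{pressure_mmhg:.1f} mmHg")
```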

Relevance: 30.00%

Abstract:

Background: Statistical shape models are widely used in biomedical research. They are routinely implemented for automatic image segmentation or object identification in medical images. In these fields, however, the acquisition of the large training datasets required to develop these models is usually a time-consuming process. Even after this effort, the collections of datasets are often lost or mishandled, resulting in duplication of work. Objective: To solve these problems, the Virtual Skeleton Database (VSD) is proposed as a centralized storage system where the data necessary to build statistical shape models can be stored and shared. Methods: The VSD provides an online repository system tailored to the needs of the medical research community. The processing of the most common image file types, a statistical shape model framework, and an ontology-based search provide the generic tools to store, exchange, and retrieve digital medical datasets. The hosted data are accessible to the community, and collaborative research catalyzes their productive use. Results: To illustrate the need for an online repository for medical research, three exemplary projects of the VSD are presented: (1) an international collaboration to improve cochlear surgery and implant optimization, (2) a population-based analysis of femoral fracture risk between genders, and (3) an online application developed for the evaluation and comparison of brain tumor segmentations. Conclusions: The VSD is a novel system for scientific collaboration in the medical image community, with a data-centric concept and a semantically driven search option for anatomical structures. The repository has proven to be a useful tool for collaborative model building, as a resource for biomechanical population studies, and for enhancing segmentation algorithms.
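
Since the VSD exists to pool training shapes for statistical shape models, a compact reminder of what such a model is may help: given aligned landmark sets, the mean shape and the principal modes of variation are obtained by PCA. The sketch below uses random placeholder landmarks and plain NumPy; it is a conceptual illustration, not the VSD's own shape-model framework.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder training set: 20 shapes, each described by 50 aligned 3-D
# landmarks, flattened to vectors of length 150 (real data would come from
# segmented, co-registered anatomy).
n_shapes, n_landmarks = 20, 50
shapes = rng.normal(size=(n_shapes, n_landmarks * 3))

# Statistical shape model: mean shape plus principal modes of variation.
mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape
_, singular_values, modes = np.linalg.svd(centered, full_matrices=False)
variances = singular_values**2 / (n_shapes - 1)

# A new shape instance: perturb the mean along the first mode by one
# standard deviation of that mode.
new_shape = mean_shape + np.sqrt(variances[0]) * modes[0]
print(new_shape.reshape(n_landmarks, 3)[:3])   # first three landmarks
```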

Relevance: 30.00%

Abstract:

HYPOTHESIS: A previously developed image-guided robot system can safely drill a tunnel from the lateral mastoid surface, through the facial recess, to the middle ear, as a viable alternative to conventional mastoidectomy for cochlear electrode insertion. BACKGROUND: Direct cochlear access (DCA) provides a minimally invasive tunnel from the lateral surface of the mastoid, through the facial recess, to the middle ear for cochlear electrode insertion. A safe and effective tunnel drilled through the narrow facial recess requires a highly accurate image-guided surgical system. Previous attempts have relied on patient-specific templates and robotic systems to guide drilling tools. In this study, we report on improvements made to an image-guided surgical robot system developed specifically for this purpose and the resulting accuracy achieved in vitro. MATERIALS AND METHODS: The proposed image-guided robotic DCA procedure was carried out bilaterally on 4 whole-head cadaver specimens. Specimens were implanted with titanium fiducial markers and imaged with cone-beam CT. A preoperative plan was created using a custom software package in which the relevant anatomical structures of the facial recess were segmented and a drill trajectory targeting the round window was defined. Patient-to-image registration was performed with the custom robot system to reference the preoperative plan, and the DCA tunnel was drilled in 3 stages with progressively longer drill bits. The position of the drilled tunnel was defined as a line fitted to a point cloud of the segmented tunnel using principal component analysis (the PCA function in MATLAB). The accuracy of the DCA was then assessed by coregistering preoperative and postoperative image data and measuring the deviation of the drilled tunnel from the plan. The final step of electrode insertion was also performed through the DCA tunnel after manual removal of the promontory through the external auditory canal. RESULTS: Drilling error was defined as the lateral deviation of the tool in the plane perpendicular to the drill axis (excluding depth error). Errors of 0.08 ± 0.05 mm and 0.15 ± 0.08 mm were measured on the lateral mastoid surface and at the target on the round window, respectively (n = 8). Full electrode insertion was possible in 7 cases; in 1 case, the electrode was partially inserted with 1 contact pair external to the cochlea. CONCLUSION: The purpose-built robot system was able to perform a safe and reliable DCA for cochlear implantation. The workflow implemented in this study mimics the envisioned clinical procedure, showing the feasibility of future clinical implementation.
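
The accuracy metric described above, fitting a line to the segmented tunnel's point cloud and measuring lateral deviation from the planned axis, can be reproduced with a few lines of NumPy; the point cloud below is synthetic and the planned trajectory is an assumed example, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic point cloud of a drilled tunnel: points scattered around a line
# through `true_origin` with direction `true_dir` (placeholder geometry, mm).
true_origin = np.array([10.0, 5.0, 0.0])
true_dir = np.array([0.3, 0.1, 1.0]) / np.linalg.norm([0.3, 0.1, 1.0])
t = np.linspace(0.0, 25.0, 200)
cloud = true_origin + np.outer(t, true_dir) + rng.normal(scale=0.05, size=(200, 3))

# Fit a line to the cloud via PCA: centroid + first principal direction.
centroid = cloud.mean(axis=0)
_, _, vt = np.linalg.svd(cloud - centroid)
fitted_dir = vt[0]

# Lateral deviation of a planned target point from the fitted axis
# (the component perpendicular to the drill direction, as in the study).
planned_target = np.array([17.5, 7.5, 25.0])      # assumed planned point
offset = planned_target - centroid
lateral = offset - np.dot(offset, fitted_dir) * fitted_dir
print(f"lateral deviation: {np.linalg.norm(lateral):.3f} mm")
```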

Relevance: 30.00%

Abstract:

Brain tumor is one of the most aggressive types of cancer in humans, with an estimated median survival time of 12 months and only 4% of patients surviving more than 5 years after diagnosis. Until recently, brain tumor prognosis has been based only on clinical information such as tumor grade and patient age, but there are reports indicating that molecular profiling of gliomas can reveal subgroups of patients with distinct survival rates. We hypothesize that coupling molecular profiling of brain tumors with clinical information might improve predictions of patient survival time and, consequently, better guide future treatment decisions. To evaluate this hypothesis, the general goal of this research is to build models for survival prediction of glioma patients using DNA molecular profiles (U133 Affymetrix gene expression microarrays) along with clinical information. First, a predictive Random Forest model is built for binary outcomes (i.e., short- vs. long-term survival) and a small subset of genes whose expression values can be used to predict survival time is selected. Next, a new statistical methodology is developed for predicting time-to-death outcomes using Bayesian ensemble trees. Because of the large heterogeneity observed within the prognostic classes obtained by the Random Forest model, prediction can be improved by relating time-to-death to the gene expression profile directly. We propose a Bayesian ensemble model for survival prediction that is appropriate for high-dimensional data such as gene expression data. Our approach is based on the ensemble "sum-of-trees" model, which is flexible enough to incorporate additive and interaction effects between genes. We specify a fully Bayesian hierarchical approach and illustrate our methodology for the CPH, Weibull, and AFT survival models. We overcome the lack of conjugacy using a latent variable formulation to model the covariate effects, which decreases computation time for model fitting. Our proposed models also provide a model-free way to select important predictive prognostic markers based on controlling false discovery rates. We compare the performance of our methods with baseline reference survival methods and apply our methodology to an unpublished dataset of brain tumor survival times and gene expression data, selecting genes potentially related to the development of the disease under study. A closing discussion compares the results obtained by the Random Forest and Bayesian ensemble methods from biological/clinical perspectives and highlights the statistical advantages and disadvantages of the new methodology in the context of DNA microarray data analysis.
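
As a sketch of the first modelling step (a Random Forest on dichotomized survival, with gene selection by importance), the example below uses scikit-learn on synthetic expression values; the data, gene count, and survival cut-off are placeholders, not the study's.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for microarray data: 120 patients x 500 genes, with a
# binary label for short- vs long-term survival (placeholder, not real data).
n_patients, n_genes = 120, 500
expression = rng.normal(size=(n_patients, n_genes))
survival_days = rng.exponential(scale=365, size=n_patients)
survival_days += 200 * expression[:, 0] - 150 * expression[:, 1]  # two informative genes
labels = (survival_days > np.median(survival_days)).astype(int)   # 1 = long-term

forest = RandomForestClassifier(n_estimators=500, random_state=0)
print("CV accuracy:", cross_val_score(forest, expression, labels, cv=5).mean())

# Rank genes by importance and keep a small predictive subset.
forest.fit(expression, labels)
top_genes = np.argsort(forest.feature_importances_)[::-1][:10]
print("top genes by importance:", top_genes)
```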

Relevance: 30.00%

Abstract:

The current state of health and biomedicine includes an enormous number of heterogeneous data ‘silos’, collected for different purposes and represented differently, that are presently impossible to share or analyze in toto. The greatest challenge for large-scale and meaningful analyses of health-related data is to achieve a uniform data representation for data extracted from heterogeneous source representations. Based upon an analysis and categorization of heterogeneities, a process for achieving comparable data content by using a uniform terminological representation is developed. This process addresses the types of representational heterogeneities that commonly arise in healthcare data integration problems. Specifically, this process uses a reference terminology and associated "maps" to transform heterogeneous data to a standard representation for comparability and secondary use. Capturing the quality and precision of the “maps” between local terms and reference terminology concepts enhances the meaning of the aggregated data, empowering end users with better-informed queries for subsequent analyses. A data integration case study in the domain of pediatric asthma illustrates the development and use of a reference terminology for creating comparable data from heterogeneous source representations. The contribution of this research is a generalized process for the integration of data from heterogeneous source representations, and this process can be applied and extended to other problems where heterogeneous data need to be merged.
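
The core operation described, transforming locally coded values to a shared reference terminology via curated "maps" that carry a quality/precision rating, can be illustrated with a small dictionary-based Python example; the site names, local codes, reference concept IDs, and precision labels are all invented for illustration.

```python
# Hypothetical maps from two source systems' local asthma-related codes to a
# single reference-terminology concept; each map records how precise the match is.
TERMINOLOGY_MAPS = {
    "siteA": {"ASTH-01": {"concept": "REF:0001", "precision": "exact"},
              "WHZ-9":   {"concept": "REF:0002", "precision": "broader-than"}},
    "siteB": {"J45-x":   {"concept": "REF:0001", "precision": "exact"}},
}


def to_reference(site, local_code):
    """Translate one local code to the reference concept, keeping the map's
    precision so downstream queries can filter on mapping quality."""
    entry = TERMINOLOGY_MAPS.get(site, {}).get(local_code)
    if entry is None:
        return {"concept": None, "precision": "unmapped"}
    return entry


records = [("siteA", "ASTH-01"), ("siteB", "J45-x"), ("siteA", "XYZ")]
for site, code in records:
    mapped = to_reference(site, code)
    print(site, code, "->", mapped["concept"], f"({mapped['precision']})")
```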

Relevance: 30.00%

Abstract:

People often use tools to search for information. In order to improve the quality of an information search, it is important to understand how internal information, which is stored in the user’s mind, and external information, represented by the interface of tools, interact with each other. How information is distributed between internal and external representations significantly affects information search performance. However, few studies have examined the relationship between types of interface and types of search task in the context of information search. For a distributed information search task, how data are distributed, represented, and formatted significantly affects user search performance in terms of response time and accuracy. Guided by UFuRT (User, Function, Representation, Task), a human-centered process, I propose a search model and a task taxonomy. The model defines its relationship with other existing information models, and the taxonomy clarifies the legitimate operations for each type of search task over relational data. Based on the model and taxonomy, I have also developed prototype interfaces for search tasks over relational data; these prototypes were used for the experiments. The experiments described in this study follow a within-subject design with a sample of 24 participants recruited from the graduate schools located in the Texas Medical Center. Participants performed one-dimensional nominal search tasks over nominal, ordinal, and ratio displays, and searched one-dimensional nominal, ordinal, interval, and ratio tasks over table and graph displays. Participants also performed the same task and display combinations for two-dimensional searches. Distributed cognition theory has been adopted as the theoretical framework for analyzing and predicting search performance over relational data. It has been shown that the representation dimensions and data scales, as well as the search task types, are the main factors determining search efficiency and effectiveness. In particular, the more external representations are used, the better the search task performance, and the results suggest that ideal search performance occurs when the question type and the corresponding data scale representation match. The implications of the study lie in contributing to the effective design of search interfaces for relational data, especially laboratory results, which are often used in healthcare activities.

Relevance: 30.00%

Abstract:

High-throughput assays, such as the yeast two-hybrid system, have generated a huge amount of protein-protein interaction (PPI) data in the past decade. This tremendously increases the need for reliable methods to systematically and automatically suggest protein functions and relationships between proteins. With the available PPI data, it is now possible to study functions and relationships in the context of a large-scale network. To date, several network-based schemes have been proposed to effectively annotate protein functions on a large scale. However, due to the inherent noise in high-throughput data generation, new methods and algorithms should be developed to increase the reliability of functional annotations. Previous work in a yeast PPI network (Samanta and Liang, 2003) has shown that the local connection topology, particularly for two proteins sharing an unusually large number of neighbors, can predict functional associations between proteins and hence suggest their functions. One advantage of that work is that the algorithm is not sensitive to noise (false positives) in high-throughput PPI data. In this study, we improved their prediction scheme by developing a new algorithm and new methods, which we applied to a human PPI network to make a genome-wide functional inference. We used the new algorithm to measure and reduce the influence of hub proteins on detecting functionally associated proteins. We used the annotations of the Gene Ontology (GO) and the Kyoto Encyclopedia of Genes and Genomes (KEGG) as independent and unbiased benchmarks to evaluate our algorithms and methods within the human PPI network. We showed that, compared with the previous work by Samanta and Liang, the algorithm and methods developed in this study improved the overall quality of functional inferences for human proteins. By applying the algorithms to the human PPI network, we obtained 4,233 significant functional associations among 1,754 proteins. Further comparison of their KEGG and GO annotations allowed us to assign 466 KEGG pathway annotations to 274 proteins and 123 GO annotations to 114 proteins, with estimated false discovery rates of <21% for KEGG and <30% for GO. We clustered 1,729 proteins by their functional associations and performed pathway analysis to identify several subclusters that are highly enriched in certain signaling pathways. In particular, we performed a detailed analysis of a subcluster enriched in the transforming growth factor β signaling pathway (P < 10⁻⁵⁰), which is important in cell proliferation and tumorigenesis. Analysis of another four subclusters also suggested potential new players in six signaling pathways worthy of further experimental investigation. Our study gives clear insight into the common-neighbor-based prediction scheme and provides a reliable method for large-scale functional annotations in this post-genomic era.
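
The common-neighbor idea underlying the prediction scheme, scoring a protein pair by how improbably many interaction partners they share, can be sketched with a hypergeometric tail probability; the toy network and protein names below are illustrative, not the human PPI network used in the study.

```python
from itertools import combinations
from math import comb

# Toy PPI network: protein -> set of interaction partners (illustrative only).
network = {
    "P1": {"A", "B", "C", "D"},
    "P2": {"A", "B", "C", "E"},
    "P3": {"F", "G"},
}
N = 20  # total number of proteins assumed in the toy network


def shared_neighbor_pvalue(n1, n2, shared, total):
    """P(X >= shared) for a hypergeometric X: the chance that two proteins
    with n1 and n2 partners share at least `shared` of them at random."""
    tail = sum(comb(n1, k) * comb(total - n1, n2 - k)
               for k in range(shared, min(n1, n2) + 1))
    return tail / comb(total, n2)


# Pairs with an unusually small p-value are predicted to be functionally associated.
for a, b in combinations(network, 2):
    shared = len(network[a] & network[b])
    p = shared_neighbor_pvalue(len(network[a]), len(network[b]), shared, N)
    print(f"{a}-{b}: {shared} shared partners, p = {p:.3g}")
```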

Relevance: 30.00%

Abstract:

Cloud computing provides a promising solution to the genomics data deluge resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can gain access to scalable and cost-effective computational resources. However, the large size of NGS data causes significant data transfer latency from the client’s site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, in which the NGS data are processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks in which the NGS sequences can be processed independently of one another. We also provide the elastream package, which supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
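
The streaming idea, start analyzing reads as chunks arrive instead of waiting for the whole upload, can be sketched with a producer/consumer pipeline; this is a conceptual Python illustration and does not reflect the elastream package's actual interface.

```python
import queue
import threading
import time


def transfer(q, n_chunks=5, reads_per_chunk=1000, delay=0.2):
    """Simulate the network transfer: chunks of reads arrive one by one."""
    for i in range(n_chunks):
        time.sleep(delay)                       # per-chunk transfer latency
        q.put([f"read_{i}_{j}" for j in range(reads_per_chunk)])
    q.put(None)                                 # end-of-stream marker


def process_chunk(chunk):
    """Placeholder per-read analysis; the scheme applies whenever reads can be
    processed independently of one another."""
    return sum(len(read) for read in chunk)


if __name__ == "__main__":
    q = queue.Queue()
    threading.Thread(target=transfer, args=(q,), daemon=True).start()

    # Consume and analyze chunks while later chunks are still in transit,
    # hiding much of the transfer latency behind computation.
    total = 0
    while (chunk := q.get()) is not None:
        total += process_chunk(chunk)
    print("aggregate result:", total)
```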