854 results for Concept-based Retrieval


Relevance:

30.00%

Publisher:

Abstract:

Background: There are limited data concerning endoscopist-directed deep sedation for endoscopic retrograde cholangiopancreatography. The aim of this study was to establish the safety of, and the risk factors for, difficult sedation in daily practice. Patients and methods: Hospital-based, frequency-matched case-control study. All patients were identified from a database of 1,008 patients treated between 2014 and 2015. The cases were those with difficult sedation, defined as any combination of high doses of midazolam or propofol, poor tolerance, use of reversal agents, or sedation-related adverse events. The presence of different factors was evaluated to determine whether they predicted difficult sedation. Results: One hundred and eighty-nine patients (63 cases, 126 controls) were included. Cases were classified in terms of high-dose requirements (n = 35, 55.56%), sedation-related adverse events (n = 14, 22.22%), the use of reversal agents (n = 13, 20.63%) and agitation/discomfort (n = 8, 12.7%). Concerning adverse events, the total rate was 1.39%, including clinically relevant hypoxemia (n = 11), severe hypotension (n = 2) and paradoxical reactions to midazolam (n = 1). The rate of hypoxemia was higher in patients receiving propofol combined with midazolam than in patients receiving propofol alone (2.56% vs. 0.8%, p < 0.001). Alcohol consumption (OR: 2.674 [95% CI: 1.098-6.515], p = 0.030), opioid consumption (OR: 2.713 [95% CI: 1.096-6.716], p = 0.031) and the consumption of other psychoactive drugs (OR: 2.015 [95% CI: 1.017-3.991], p = 0.045) were confirmed to be independent risk factors for difficult sedation. Conclusions: Endoscopist-directed deep sedation during endoscopic retrograde cholangiopancreatography is safe. The presence of certain factors should be assessed before the procedure to identify patients at high risk of difficult sedation.

Relevance:

30.00%

Publisher:

Abstract:

Single-cell functional proteomics assays can connect genomic information to biological function through quantitative and multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past five years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single-cell functional proteomics, focusing on the development of single-cell barcode chips (SCBCs) with applications in fundamental and translational cancer research.

The discussion begins with a microchip designed to simultaneously quantify a panel of secreted, cytoplasmic and membrane proteins from single cells; this chip is the prototype for subsequent proteomic microchips with more sophisticated designs for preclinical cancer research or clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or defined numbers of cells, within microchambers, each of which is equipped with a large antibody microarray (the barcode), with between a few hundred and ten thousand microchambers included within a single microchip. Functional proteomics assays at single-cell resolution yield unique pieces of information that significantly shape how we think about cancer research. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, follows.

The SCBC is a powerful tool for resolving the functional heterogeneity of cancer cells. It has the capacity to extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We demonstrate this point by applying the SCBCs to three isogenic cell lines of glioblastoma multiforme (GBM).

The cancer cell population is highly heterogeneous, with high-amplitude fluctuations at the single-cell level, which in turn grant robustness to the entire population. The concept of a stable population existing in the presence of random fluctuations is reminiscent of many physical systems that have been successfully understood using statistical physics. Thus, tools derived from that field can probably be applied to using fluctuations to determine the nature of signaling networks. In the second part of the thesis, we focus on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia: single-cell proteomics assays coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics yield detailed and surprising predictions, which were found to be correct in both a cell line and a primary tumor model.

The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study GBM cancer cell resistance to molecular targeted therapy. Physical approaches to anticipating therapy resistance and identifying effective therapy combinations will be discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to perturbation by targeted inhibitors. Strongly coupled protein-protein interactions constitute most signaling cascades. A physical analogy of such a system is the strongly coupled atom-atom interactions in a crystal lattice. Similar to decomposing the atomic interactions into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can also be achieved by diagonalizing protein-protein correlation or covariance matrices to decompose the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e. independent signaling modes). By doing so, two independent signaling modes were resolved: one associated with mTOR signaling and a second associated with ERK/Src signaling. These modes in turn allow us to anticipate resistance and to design combination therapies that are effective, as well as to identify those therapies and therapy combinations that will be ineffective. We validated our predictions in mouse tumor models and all predictions were borne out.
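As a minimal illustrative sketch of this mode decomposition (not the thesis's actual analysis pipeline), diagonalizing a protein-protein covariance matrix with a standard eigendecomposition yields the independent signaling modes; the protein panel and the synthetic data below are hypothetical.

```python
import numpy as np

# Hypothetical single-cell phosphoprotein measurements: rows = cells, columns = proteins.
proteins = ["p-mTOR", "p-S6K", "p-ERK", "p-Src"]   # assumed panel, for illustration only
rng = np.random.default_rng(0)
data = rng.lognormal(mean=2.0, sigma=0.5, size=(500, len(proteins)))

# Protein-protein covariance matrix estimated from the single-cell fluctuations.
cov = np.cov(data, rowvar=False)

# Diagonalize: eigenvectors are "independent signaling modes" (linear combinations of proteins),
# eigenvalues give the amount of correlated fluctuation each mode carries.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]                  # strongest modes first
for rank, idx in enumerate(order, start=1):
    weights = ", ".join(f"{p}: {w:+.2f}" for p, w in zip(proteins, eigvecs[:, idx]))
    print(f"mode {rank} (variance {eigvals[idx]:.2f}): {weights}")
```

Each eigenvector is a linear combination of the measured phosphoproteins; in the thesis, the leading modes play the role of the mTOR- and ERK/Src-associated modes described above.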

In the last part, some preliminary results on the clinical translation of single-cell proteomics chips are presented. The successful demonstration of our work on human-derived xenografts provides the rationale to extend our current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could potentially yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians within a clinically relevant time scale. The technical challenges of clinical translation will be presented, and our solutions to address them will be discussed as well. A clinical case study will then follow, in which preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor are presented to demonstrate the general protocol and workflow of the proposed clinical studies.

Relevance:

30.00%

Publisher:

Abstract:

[EN] A new concept for fluid flow manipulation in microfluidic paper-based analytical devices (µPADs) is presented by introducing ionogel materials as passive pumps. µPADs were fabricated using a new double-sided contact stamping process, and ionogels were precisely photopolymerised at the inlet of the µPADs. The ionogels remain mainly on the surface of the paper and are absorbed into the superficial paper fibers, allowing the liquid to flow easily from the ionogel into the paper. As a proof of concept, the fluid flow and mixing behaviour of µPADs with two different ionogels were compared with non-treated µPADs. It was demonstrated that both ionogels strongly affect the fluid flow, delaying it as a result of their different physical and chemical properties and water-holding capacities.

Relevance:

30.00%

Publisher:

Abstract:

This Document is Protected by copyright and was first published by Frontiers. All rights reserved. It is reproduced with permission.

Relevance:

30.00%

Publisher:

Abstract:

Compressed covariance sensing using quadratic samplers is gaining increasing interest in the recent literature. The covariance matrix often plays the role of a sufficient statistic in many signal and information processing tasks. However, owing to the large dimension of the data, it may become necessary to obtain a compressed sketch of the high-dimensional covariance matrix to reduce the associated storage and communication costs. Nested sampling has been proposed in the past as an efficient sub-Nyquist sampling strategy that enables perfect reconstruction of the autocorrelation sequence of Wide-Sense Stationary (WSS) signals, as though the signal were sampled at the Nyquist rate. The key idea behind nested sampling is to exploit properties of the difference set that naturally arises in the quadratic measurement model associated with covariance compression. In this thesis, we focus on developing novel versions of nested sampling for low-rank Toeplitz covariance estimation and phase retrieval, where the latter problem finds many applications in high-resolution optical imaging, X-ray crystallography and molecular imaging. The problem of low-rank compressive Toeplitz covariance estimation is first shown to be fundamentally related to that of line spectrum recovery. In the absence of noise, this connection can be exploited to develop a particular kind of sampler, called the Generalized Nested Sampler (GNS), that can achieve optimal compression rates. In the presence of bounded noise, we develop a regularization-free algorithm that provably leads to stable recovery of the high-dimensional Toeplitz matrix from its order-wise minimal sketch acquired using a GNS. Contrary to existing TV-norm and nuclear-norm based reconstruction algorithms, our technique does not use any tuning parameters, which can be of great practical value. The idea of nested sampling also finds a surprising use in the problem of phase retrieval, which has been of great interest in recent times owing to its convex formulation via PhaseLift. By using another modified version of nested sampling, namely the Partial Nested Fourier Sampler (PNFS), we show that, with probability one, it is possible to achieve a certain conjectured lower bound on the necessary measurement size. Moreover, for sparse data, an l1-minimization based algorithm is proposed that can lead to stable phase retrieval using an order-wise minimal number of measurements.
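To make the difference-set idea concrete, the sketch below builds a standard two-level nested index set (an assumption for illustration; it is not the GNS developed in the thesis) and checks that its pairwise differences cover every autocorrelation lag up to N2(N1+1) - 1 while using only N1 + N2 samples.

```python
import numpy as np

def nested_indices(n1: int, n2: int) -> np.ndarray:
    """Two-level nested sampling positions: dense level {1..n1}, sparse level {(n1+1)*k, k=1..n2}."""
    dense = np.arange(1, n1 + 1)
    sparse = (n1 + 1) * np.arange(1, n2 + 1)
    return np.concatenate([dense, sparse])

n1, n2 = 4, 5
idx = nested_indices(n1, n2)                 # only n1 + n2 = 9 samples
diffs = np.unique(np.abs(idx[:, None] - idx[None, :]).ravel())

# The difference set covers every lag 0 .. n2*(n1+1) - 1, so the autocorrelation of a WSS
# signal can be estimated at Nyquist-rate lag resolution from far fewer physical samples.
max_lag = n2 * (n1 + 1) - 1
assert np.array_equal(diffs[: max_lag + 1], np.arange(max_lag + 1))
print(f"{len(idx)} samples give all lags 0..{max_lag}")
```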

Relevance:

30.00%

Publisher:

Abstract:

The new generation of artificial satellites is providing a huge amount of Earth observation images whose exploitation can yield invaluable benefits, both economic and environmental. However, only a small fraction of this data volume has been analyzed, mainly due to the large human resources needed for that task. In this sense, the development of unsupervised methodologies for the analysis of these images is a priority. In this work, a new unsupervised segmentation algorithm for satellite images is proposed. This algorithm is based on rough-set theory and is inspired by a previous segmentation algorithm defined in the RGB color domain. The main contributions of the new algorithm are: (i) the original algorithm is extended to four spectral bands; (ii) the concept of the superpixel is used to define the neighborhood similarity of a pixel, adapted to the local characteristics of each image; and (iii) two new region-merging strategies are proposed and evaluated in order to establish the final number of regions in the segmented image. The experimental results show that the proposed approach improves the results provided by the original method when both are applied to satellite images with different spectral and spatial resolutions.
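The sketch below illustrates the superpixel idea on a synthetic four-band image using scikit-image's SLIC (assuming scikit-image >= 0.19 for the channel_axis argument), with a naive spectral-distance merge standing in for the paper's rough-set-based strategies; the image, band count and threshold are placeholders.

```python
import numpy as np
from skimage.segmentation import slic

# Hypothetical 4-band (e.g. R, G, B, NIR) tile with values in [0, 1]; shape: rows x cols x bands.
rng = np.random.default_rng(0)
img = rng.random((128, 128, 4)).astype(np.float32)

# Superpixels define each pixel's "neighborhood", adapted to the local structure of the image.
labels = slic(img, n_segments=300, compactness=0.1, channel_axis=-1, start_label=0)

# Mean 4-band spectral signature of every superpixel.
n = labels.max() + 1
means = np.array([img[labels == k].mean(axis=0) for k in range(n)])

# Naive region merging: fuse adjacent superpixels whose mean spectra are within a threshold.
parent = np.arange(n)
def find(a):
    while parent[a] != a:
        parent[a] = parent[parent[a]]
        a = parent[a]
    return a

threshold = 0.05
pairs = set()
pairs.update(zip(labels[:, :-1].ravel(), labels[:, 1:].ravel()))   # horizontal neighbours
pairs.update(zip(labels[:-1, :].ravel(), labels[1:, :].ravel()))   # vertical neighbours
for a, b in pairs:
    if a != b and np.linalg.norm(means[a] - means[b]) < threshold:
        parent[find(a)] = find(b)

merged = np.vectorize(find)(labels)
print("regions before/after merging:", n, len(np.unique(merged)))
```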

Relevance:

30.00%

Publisher:

Abstract:

In the past few years, human facial age estimation has drawn a lot of attention in the computer vision and pattern recognition communities because of its important applications in age-based image retrieval, security control and surveillance, biometrics, human-computer interaction (HCI) and social robotics. In connection with these investigations, estimating the age of a person from the numerical analysis of his/her face image is a relatively new topic. Also, in problems such as image classification, deep neural networks have given the best results in some areas, including age estimation. In this work we use three hand-crafted features as well as five deep features that can be obtained from pre-trained deep convolutional neural networks, and we present a comparative study of the age estimation results obtained with these features.
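A minimal sketch of the deep-feature pipeline, assuming torchvision >= 0.13 and a pre-trained ResNet-50 as the feature extractor; the backbone, the SVR regressor and the file names are illustrative choices, not necessarily those used in the paper.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms
from sklearn.svm import SVR
from PIL import Image

# Pre-trained ResNet-50 with the classification head removed: the 2048-d pooled
# activations serve as "deep features" for each face image.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
feature_extractor = nn.Sequential(*list(backbone.children())[:-1]).eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def deep_features(paths):
    feats = []
    with torch.no_grad():
        for p in paths:
            x = preprocess(Image.open(p).convert("RGB")).unsqueeze(0)
            feats.append(feature_extractor(x).flatten().numpy())
    return feats

# Hypothetical file list and age labels; a support-vector regressor maps features to age.
train_paths, train_ages = ["face_001.jpg", "face_002.jpg"], [23, 47]
regressor = SVR(kernel="rbf", C=10.0).fit(deep_features(train_paths), train_ages)
print(regressor.predict(deep_features(["face_003.jpg"])))
```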

Relevance:

30.00%

Publisher:

Abstract:

This study explores the effects of modeling instruction on student learning in physics. Multiple representations grounded in physical contexts were employed by students to analyze the results of inquiry lab investigations. Class whiteboard discussions geared toward a class consensus following Socratic dialogue were implemented throughout the modeling cycle. Lab investigations designed to address student preconceptions related to Newton’s Third Law were implemented. Student achievement was measured based on normalized gains on the Force Concept Inventory. Normalized FCI gains achieved by students in this study were comparable to those achieved by students of other novice modelers. Physics students who had taken a modeling Intro to Physics course scored significantly higher on the FCI posttest than those who had not. The FCI results also provided insight into deeply rooted student preconceptions related to Newton’s Third Law. Implications for instruction and the design of lab investigations related to Newton’s Third Law are discussed.
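For reference, the normalized gain reported on the FCI is conventionally the Hake gain computed from class-average pre- and post-test percentages (a standard definition assumed here, since the abstract does not restate it):

```latex
\[
  \langle g \rangle \;=\; \frac{\%\langle \text{post} \rangle - \%\langle \text{pre} \rangle}{100 - \%\langle \text{pre} \rangle}
\]
```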

Relevance:

30.00%

Publisher:

Abstract:

Intracochlear trauma from surgical insertion of bulky electrode arrays and inadequate pitch perception are areas of concern with current hand-assembled commercial cochlear implants. Parylene thin-film arrays with higher electrode densities and lower profiles are a potential solution, but lack rigidity and hence depend on manually fabricated, permanently attached polyethylene terephthalate (PET) tubing based bulky backing devices. As a solution, we investigated a new backing device with two sub-systems. The first sub-system is a thin poly(lactic acid) (PLA) stiffener that will be embedded in the parylene array. The second sub-system is an attaching and detaching mechanism, utilizing a poly(N-vinylpyrrolidone)-block-poly(d,l-lactide) (PVP-b-PDLLA) copolymer-based biodegradable and water-soluble adhesive, that will help to retract the PET insertion tool after implantation. As a proof-of-concept of sub-system one, a microfabrication process for patterning PLA stiffeners embedded in parylene has been developed. Conventional hot embossing, mechanical micromachining, and standard cleanroom processes were integrated for patterning fully released and discrete stiffeners coated with parylene. The released embedded stiffeners were thermoformed to demonstrate that imparting perimodiolar shapes to stiffener-embedded arrays will be possible. The developed process, when integrated with the array fabrication process, will allow fabrication of stiffener-embedded arrays in a single process. As a proof-of-concept of sub-system two, the feasibility of the attaching and detaching mechanism was demonstrated by adhering 1x and 1.5x scale PET tube-based insertion tools and PLA stiffeners embedded in parylene using the copolymer adhesive. The attached devices survived qualitative adhesion tests, thermoforming, and flexing. The viability of the detaching mechanism was tested by aging the assemblies in vitro in phosphate buffer solution. The average detachment times, 2.6 minutes and 10 minutes for 1x and 1.5x scale devices respectively, were found to be clinically relevant with respect to the reported array insertion times during surgical implantation. Eventually, the stiffener-embedded arrays would not need to be permanently attached to current insertion tools, which are left behind after implantation and congest the cochlear scala tympani chamber. Finally, a simulation-based approach for accelerated failure analysis of PLA stiffeners and characterization of the PVP-b-PDLLA copolymer adhesive has been explored. The residual functional life of embedded PLA stiffeners exposed to body fluid, and thereby subjected to degradation and erosion, has been estimated by simulating PLA stiffeners with different parylene coating failure types and different PLA types for a given parylene coating failure type. For characterizing the PVP-b-PDLLA copolymer adhesive, several formulations of the copolymer adhesive were simulated and compared based on the insertion-tool detachment times predicted from the dissolution, degradation, and erosion behavior of the simulated adhesive formulations. Results indicate that these simulation-based approaches could be used to reduce the total number of time-consuming and expensive in vitro tests that must be conducted.

Relevance:

30.00%

Publisher:

Abstract:

Traditional engineering design methods are based on Simon's (1969) use of the concept of function, and as such collectively suffer from both theoretical and practical shortcomings. Researchers in the field of affordance-based design have borrowed from ecological psychology in an attempt to address the blind spots of function-based design, developing alternative ontologies and design processes. This dissertation presents function and affordance theory as both compatible and complementary. We first present a hybrid approach to design for technology change, followed by a reconciliation and integration of function and affordance ontologies for use in design. We explore the integration of a standard function-based design method with an affordance-based design method, and demonstrate how affordance theory can guide the early application of function-based design. Finally, we discuss the practical and philosophical ramifications of embracing affordance theory's roots in ecology and ecological psychology, and explore the insights and opportunities made possible by an ecological approach to engineering design. The primary contribution of this research is the development of an integrated ontology for describing and designing technological systems using both function- and affordance-based methods.

Relevance:

30.00%

Publisher:

Abstract:

SpicA FAR infrared Instrument (SAFARI) is one of the instruments planned for the SPICA mission. The SPICA mission is the next great leap forward in space-based far-infrared astronomy and will study the evolution of galaxies, stars and planetary systems. SPICA will utilize a deeply cooled 2.5 m-class telescope, provided by European industry, to realize zodiacal-background-limited performance and high spatial resolution. SAFARI is a cryogenic grating-based point-source spectrometer working in the wavelength domain 34 to 230 μm, providing a spectral resolving power from 300 to at least 2000. The instrument shall provide low- and high-resolution spectroscopy in four spectral bands. The low-resolution mode is the native instrument mode, while the high-resolution mode is achieved by means of a Martin-Puplett interferometer. The optical system is all-reflective and consists of three main modules: an input optics module, followed by the Band and Mode Distributing Optics and the grating modules. The instrument utilizes Nyquist-sampled filled linear arrays of very sensitive TES detectors. The work presented in this paper describes the optical design architecture and a design concept compatible with the current instrument performance and volume design drivers.

Relevance:

30.00%

Publisher:

Abstract:

Selling devices in retail stores comes with the big challenge of grabbing the customer's attention. Nowadays people have a lot of offers at their disposal, and new marketing techniques must emerge to differentiate products. When it comes to smartphones and tablets, the devices can make the difference by themselves if we use their computing power and capabilities to create something unique and interactive. With that in mind, three prototypes were developed during an internship: a face-recognition-based Customer Detection, a face-tracking solution with an Avatar, and interactive cross-app Guides. All three revealed potential to be differentiating solutions in a retail store, not only raising the chance of a customer taking notice of the device but also of interacting with it to learn more about its features. The results were meant to be only proofs of concept and therefore were not tested in the real world.
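As a rough sketch of how a Customer Detection trigger might work (detection only, using OpenCV's bundled Haar cascade; this is not the internship's actual face-recognition implementation), a face appearing in the device camera feed switches the display into an interactive mode:

```python
import cv2

# Haar-cascade face detector shipped with OpenCV; a detected face in the camera feed is
# treated as the "customer detected" event in this sketch.
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

camera = cv2.VideoCapture(0)          # default device camera
while True:
    ok, frame = camera.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5, minSize=(60, 60))
    if len(faces) > 0:
        print(f"customer detected ({len(faces)} face(s)) - switch display to interactive mode")
    if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
        break
camera.release()
```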

Relevance:

30.00%

Publisher:

Abstract:

Security defects are common in large software systems because of their size and complexity. Although efficient development processes, testing, and maintenance policies are applied to software systems, a large number of vulnerabilities can still remain despite these measures. Some vulnerabilities stay in a system from one release to the next because they cannot be easily reproduced through testing. These vulnerabilities endanger the security of the systems. We propose vulnerability classification and prediction frameworks based on vulnerability reproducibility. The frameworks are effective in identifying the types and locations of vulnerabilities at an earlier stage, and they improve the security of the software in subsequent versions (referred to as releases). We expand an existing concept of software bug classification to vulnerability classification (easily reproducible and hard to reproduce) to develop a classification framework for differentiating between these vulnerabilities based on code fixes and textual reports. We then investigate the potential correlations between the vulnerability categories and classical software metrics, as well as other runtime environmental factors of reproducibility, to develop a vulnerability prediction framework. The classification and prediction frameworks help developers adopt corresponding mitigation or elimination actions and develop appropriate test cases. The vulnerability prediction framework also helps security experts focus their effort on the top-ranked vulnerability-prone files. As a result, the frameworks decrease the number of attacks that exploit security vulnerabilities in the next versions of the software. To build the classification and prediction frameworks, different machine learning techniques (C4.5 Decision Tree, Random Forest, Logistic Regression, and Naive Bayes) are employed. The effectiveness of the proposed frameworks is assessed based on collected software security defects of Mozilla Firefox.
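A minimal sketch of the text-based side of such a classification framework, using a TF-IDF representation of vulnerability reports and a random forest (one of the classifiers named above); the example reports and labels are fabricated placeholders, not data from the study.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline

# Hypothetical vulnerability reports and labels (1 = easily reproducible, 0 = hard to reproduce);
# real inputs would be textual reports mined from a tracker such as Bugzilla.
reports = [
    "crash when parsing malformed certificate, reproducible with attached test case",
    "intermittent use-after-free under heavy load, no reliable steps to reproduce",
    "buffer overflow triggered by crafted URL, steps to reproduce included",
    "race condition in cache eviction observed once on nightly build",
]
labels = [1, 0, 1, 0]

# TF-IDF over report text feeding a random forest classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=1),
                      RandomForestClassifier(n_estimators=200, random_state=0))
model.fit(reports, labels)

new_report = ["heap corruption with deterministic proof-of-concept exploit attached"]
print("easily reproducible" if model.predict(new_report)[0] == 1 else "hard to reproduce")
```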

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to propose a theoretical framework, based on contemporary philosophical aesthetics, from which principled assessments of the aesthetic value of information organization frameworks may be conducted.

Design/methodology/approach – This paper identifies appropriate discourses within the field of philosophical aesthetics and constructs from them a framework for assessing aesthetic properties of information organization frameworks. This framework is then applied in two case studies examining the Library of Congress Subject Headings (LCSH) and Sexual Nomenclature: A Thesaurus.

Findings – In both information organization frameworks studied, the aesthetic analysis was useful in identifying judgments of the frameworks as aesthetic judgments, in promoting discovery of further areas of aesthetic judgments, and in prompting reflection on the nature of these aesthetic judgments.

Research limitations/implications – This study provides proof-of-concept for the aesthetic evaluation of information organization frameworks. Areas of future research are identified as the role of cultural relativism in such aesthetic evaluation and the identification of appropriate aesthetic properties of information organization frameworks.

Practical implications – By identifying a subset of judgments of information organization frameworks as aesthetic judgments, aesthetic evaluation of such frameworks can be made explicit and principled. Aesthetic judgments can be separated from questions of economic feasibility, functional requirements, and user-orientation. Design and maintenance of information organization frameworks can be based on these principles.

Originality/value – This study introduces a new evaluative axis for information organization frameworks based on philosophical aesthetics. By improving the evaluation of such novel frameworks, design and maintenance can be guided by these principles.

Keywords: Evaluation, Research methods, Analysis, Bibliographic systems, Indexes, Retrieval languages

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to propose a theoretical framework, based on contemporary philosophical aesthetics, from which principled assessments of the aesthetic value of information organization frameworks may be conducted.

Design/methodology/approach – This paper identifies appropriate discourses within the field of philosophical aesthetics and constructs from them a framework for assessing aesthetic properties of information organization frameworks. This framework is then applied in two case studies examining the Library of Congress Subject Headings (LCSH) and Sexual Nomenclature: A Thesaurus.

Findings – In both information organization frameworks studied, the aesthetic analysis was useful in identifying judgments of the frameworks as aesthetic judgments, in promoting discovery of further areas of aesthetic judgments, and in prompting reflection on the nature of these aesthetic judgments.

Research limitations/implications – This study provides proof-of-concept for the aesthetic evaluation of information organization frameworks. Areas of future research are identified as the role of cultural relativism in such aesthetic evaluation and the identification of appropriate aesthetic properties of information organization frameworks.

Practical implications – By identifying a subset of judgments of information organization frameworks as aesthetic judgments, aesthetic evaluation of such frameworks can be made explicit and principled. Aesthetic judgments can be separated from questions of economic feasibility, functional requirements, and user-orientation. Design and maintenance of information organization frameworks can be based on these principles.

Originality/value – This study introduces a new evaluative axis for information organization frameworks based on philosophical aesthetics. By improving the evaluation of such novel frameworks, design and maintenance can be guided by these principles.

Keywords: Evaluation, Analysis, Bibliographic systems, Indexes, Retrieval languages, Philosophy