Abstract:
In today's big data world, data is produced in massive volumes, at great velocity, and from a variety of sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (the Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are increasingly used to derive value from this big data. A large portion of this data is stored and processed in the Cloud due to the several advantages the Cloud provides, such as scalability, elasticity, availability, low cost of ownership and overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments.

In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime.

In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing size (progressive samples) for exploratory querying, which provides data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! delivers early results using significantly fewer resources, thereby substantially reducing the cost incurred during such analytics.

Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, and link prediction. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
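To make the neighborhood-centric programming model concrete, here is a minimal sketch in Python with networkx. This illustrates the general idea only and is not NSCALE's actual API: the user specifies the subgraphs of interest (here, 1-hop ego networks) and supplies a function that is executed once per subgraph, rather than once per vertex.

```python
# Illustrative sketch of neighborhood-centric analysis (not NSCALE's API):
# extract an ego network around every vertex and run a user function on each.
import networkx as nx

def ego_networks(graph, radius=1):
    """Yield the multi-hop neighborhood (ego network) around every vertex."""
    for v in graph.nodes:
        yield v, nx.ego_graph(graph, v, radius=radius)

def local_clustering(subgraph):
    """Example neighborhood-level computation: density of the ego network."""
    return nx.density(subgraph)

graph = nx.karate_club_graph()
results = {v: local_clustering(sg) for v, sg in ego_networks(graph)}
print(results[0])
```

A distributed system in this style would additionally partition the extracted subgraphs across machines and deduplicate overlapping neighborhoods, which is where the resource and communication savings described above come from.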
Abstract:
Incidental findings on low-dose CT images obtained during hybrid imaging are an increasing phenomenon as CT technology advances. Understanding the diagnostic value of incidental findings, along with the technical limitations, is important when reporting image results and recommending follow-up, which may result in an additional radiation dose from further diagnostic imaging and an increase in patient anxiety. This study assessed lesions incidentally detected on CT images acquired for attenuation correction on two SPECT/CT systems. Methods: An anthropomorphic chest phantom containing simulated lesions of varying size and density was imaged on an Infinia Hawkeye 4 and a Symbia T6 using the low-dose CT settings applied for attenuation-correction acquisitions in myocardial perfusion imaging. Twenty-two interpreters assessed 46 images from each SPECT/CT system (15 normal images and 31 abnormal images; 41 lesions). Data were evaluated using a jackknife alternative free-response receiver-operating-characteristic (JAFROC) analysis. Results: JAFROC analysis showed a significant difference (P < 0.0001) in lesion detection, with figures of merit of 0.599 (95% confidence interval, 0.568-0.631) for the Infinia Hawkeye 4 and 0.810 (95% confidence interval, 0.781-0.839) for the Symbia T6. Lesion detection on the Infinia Hawkeye 4 was generally limited to larger, higher-density lesions. The Symbia T6 allowed improved detection rates for midsized lesions and some lower-density lesions. However, interpreters struggled to detect small (5 mm) lesions on both image sets, irrespective of density. Conclusion: Lesion detection is more reliable on low-dose CT images from the Symbia T6 than from the Infinia Hawkeye 4. This phantom-based study gives an indication of potential lesion detection in the clinical context on two commonly used SPECT/CT systems, which may assist the clinician in determining whether further diagnostic imaging is justified.
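For readers unfamiliar with the JAFROC figure of merit, the sketch below illustrates one common variant: the probability that a true lesion is rated higher than the highest-rated mark on a lesion-free image. The ratings are invented, and this is an illustration of the metric, not the study's analysis code.

```python
# Illustrative JAFROC-style figure of merit with made-up ratings.
import numpy as np

lesion_ratings = np.array([4.1, 3.7, 2.9, 4.8])    # ratings given to true lesions
normal_image_max = np.array([2.5, 3.9, 1.8])       # highest mark on each normal image

# Compare every lesion with every normal image's highest-rated mark;
# ties conventionally contribute 0.5.
greater = lesion_ratings[:, None] > normal_image_max[None, :]
ties = lesion_ratings[:, None] == normal_image_max[None, :]
fom = (greater + 0.5 * ties).mean()
print(f"Figure of merit: {fom:.3f}")   # 1.0 = perfect separation, 0.5 = chance
```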
Abstract:
In this work, we perform a first approach to emotion recognition from single-channel EEG signals extracted in four mother-child dyad experiments in developmental psychology. The single-channel EEG signals are analyzed and processed using several window sizes, performing a statistical analysis over features in the time and frequency domains. Finally, a neural network achieved an average classification accuracy of 99% for two emotional states, happiness and sadness.
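As an illustration of the pipeline described above, the following sketch slides a window over a single-channel EEG trace and extracts simple time- and frequency-domain features that a classifier could consume. The sampling rate, band definitions, and placeholder signal are assumptions, not values from the study.

```python
# Minimal windowed feature extraction for a single-channel EEG signal (sketch).
import numpy as np

fs = 128                                   # assumed sampling rate (Hz)
signal = np.random.randn(fs * 60)          # placeholder 60 s EEG trace

def window_features(x, fs, win_s=2.0, step_s=1.0):
    win, step = int(win_s * fs), int(step_s * fs)
    feats = []
    for start in range(0, len(x) - win + 1, step):
        seg = x[start:start + win]
        power = np.abs(np.fft.rfft(seg)) ** 2
        freqs = np.fft.rfftfreq(win, 1 / fs)
        alpha = power[(freqs >= 8) & (freqs < 13)].sum()    # alpha-band power
        beta = power[(freqs >= 13) & (freqs < 30)].sum()    # beta-band power
        feats.append([seg.mean(), seg.std(), alpha, beta])  # time + frequency features
    return np.array(feats)

X = window_features(signal, fs)
print(X.shape)   # one feature row per window, ready for a neural network
```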
Abstract:
Objective: In the setting of the increasing use of closed systems for the reconstitution and preparation of cytostatic drugs, we intend to analyze the correct use of these systems in the Hospital Pharmacy, with the objective of minimizing the risks of exposure not only for those professionals directly involved, but also for all the staff in the unit, while also taking efficiency criteria into account. Method: Since some systems protect against aerosol formation but not against vapours, we decided to review which cytostatics should be prepared using a spike with an air inlet valve, in order to implement a new working procedure. We reviewed the formulations available in our hospital using the following criteria: method of administration, excipients, and potential hazard for the staff handling them. We measured the diameters of the vials. We selected drugs with Level 1 risk and also those including alcohol-based excipients, which could generate vapours. Results: Out of the 66 reviewed formulations, we concluded that 11 drugs should be reconstituted with this type of spike: busulfan, cabazitaxel, carmustine, cyclophosphamide, eribulin, etoposide, fotemustine, melphalan, paclitaxel, temsirolimus and thiotepa; these represented 18% of the total volume of formulations. Conclusions: The selection of healthcare products must be done at the Hospital Pharmacy, because using a system with an air inlet valve only for the selected drugs led to savings and a more efficient use of materials. In our experience, we confirmed that use of the needle could only be avoided when the spike could adapt to the different formulations of cytostatics, which is only possible when different types of spikes are available. Besides, connections were only truly closed when a single spike was used for each vial. The change in working methodology when handling these drugs, as a result of this study, will allow us to start different studies on environmental contamination as a future line of work.
Abstract:
The study of quantum degenerate gases has many applications in topics such as condensed matter dynamics, precision measurements and quantum phase transitions. We built an apparatus to create 87Rb Bose-Einstein condensates (BECs) and generated, via optical and magnetic interactions, novel quantum systems in which we studied the contained phase transitions. In our first experiment, we quenched multi-spin-component BECs from a miscible to a dynamically unstable immiscible state. The transition rapidly amplifies any spin fluctuations, with a coherent growth process driving the formation of numerous spin-polarized domains. At much longer times these domains coarsen as the system approaches equilibrium. In our second experiment, we explored the magnetic phases present in a spin-1 spin-orbit-coupled BEC and the quantum phase transitions it contains. We observed ferromagnetic and unpolarized phases, which are stabilized by the spin-orbit coupling's explicit locking between spin and motion. These two phases are separated by a critical curve containing both first-order and second-order transitions joined at a critical point. The narrow first-order transition gives rise to long-lived metastable states. In our third experiment, we prepared independent BECs in a double-well potential, with an artificial magnetic field between the BECs. We transitioned to a single BEC by lowering the barrier while expanding the region of artificial field to cover the resulting single BEC. We compared the vortex distributions nucleated via conventional dynamics to those produced by our procedure, showing that our dynamical process populates vortices much more rapidly and in larger numbers than conventional nucleation.
Abstract:
With the continued miniaturization and increasing performance of electronic devices, new technical challenges have arisen. One such issue is delamination occurring at critical interfaces inside the device. This major reliability issue can occur during the manufacturing process or during normal use of the device. Proper evaluation of the adhesion strength of critical interfaces early in the product development cycle can help reduce reliability issues and the time-to-market of the product. However, conventional adhesion strength testing is inherently limited in the face of package miniaturization, which brings further technical challenges in quantifying design integrity and reliability. Although there are many different interfaces in today's advanced electronic packages, they can be generalized into two main categories: 1) rigid-to-rigid connections with a thin flexible polymeric layer in between, or 2) a thin film membrane on a rigid structure. Since every technique has its own advantages and disadvantages, multiple testing methods must be enhanced and developed to accommodate all the interfaces encountered in emerging electronic packaging technologies. For evaluating high-adhesion-strength interfaces in thin multilayer structures, a novel adhesion test configuration called the "single cantilever adhesion test" (SCAT) is proposed and implemented for an epoxy molding compound (EMC) and photo solder resist (PSR) interface. The test method is then shown to be capable of comparing and selecting the stronger of two potential EMC/PSR material sets. Additionally, a theoretical approach for establishing the applicable testing domain of a four-point bending test method is presented. For evaluating polymeric films on rigid substrates, the major testing challenges are reducing testing scatter and factoring in the potentially degrading effect of environmental conditioning on the material properties of the film. An advanced blister test with a predefined area was developed that considers an elasto-plastic analytical solution, and it was implemented for a conformal coating used to prevent tin whisker growth. This method was then extended with a numerical approach for evaluating the adhesion strength when the film properties of the polymer are unknown.
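For orientation, cantilever-style adhesion tests of this kind are commonly analyzed through the compliance-based Irwin-Kies relation, shown below together with its simple-beam-theory specialization for a single bending cantilever. This is standard fracture-mechanics background offered as context; the thesis's actual SCAT analysis may differ.

```latex
% Irwin--Kies relation and its specialization for a single bending cantilever
% (standard background; the SCAT analysis in the thesis may differ).
\[
  G \;=\; \frac{P^2}{2b}\,\frac{\mathrm{d}C}{\mathrm{d}a}
  \qquad\Longrightarrow\qquad
  G \;\approx\; \frac{6\,P^2 a^2}{E\,b^2 h^3}
\]
% P: applied load, b: specimen width, C: specimen compliance, a: crack length,
% E: elastic modulus and h: thickness of the bending beam. The right-hand form
% follows from cantilever beam theory, C = 4a^3/(E b h^3).
```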
Abstract:
Users need to be able to address in-air gesture systems, which means finding where to perform gestures and how to direct them towards the intended system. This is necessary for input to be sensed correctly and without unintentionally affecting other systems. This thesis investigates novel interaction techniques which allow users to address gesture systems properly, helping them find where and how to gesture. It also investigates audio, tactile and interactive light displays for multimodal gesture feedback; these can be used by gesture systems with limited output capabilities (like mobile phones and small household controls), allowing the interaction techniques to be used by a variety of device types. It investigates tactile and interactive light displays in greater detail, as these are not as well understood as audio displays. Experiments 1 and 2 explored tactile feedback for gesture systems, comparing an ultrasound haptic display to wearable tactile displays at different body locations and investigating feedback designs. These experiments found that tactile feedback improves the user experience of gesturing by reassuring users that their movements are being sensed. Experiment 3 investigated interactive light displays for gesture systems, finding this novel display type effective for giving feedback and presenting information. It also found that interactive light feedback is enhanced by audio and tactile feedback. These feedback modalities were then used alongside audio feedback in two interaction techniques for addressing gesture systems: sensor strength feedback and rhythmic gestures. Sensor strength feedback is multimodal feedback that tells users how well they can be sensed, encouraging them to find where to gesture through active exploration. Experiment 4 found that users can do this with 51 mm accuracy, with combinations of audio and interactive light feedback leading to the best performance. Rhythmic gestures are continuously repeated gesture movements which can be used to direct input. Experiment 5 investigated the usability of this technique, finding that users can match rhythmic gestures well and with ease. Finally, these interaction techniques were combined, resulting in a single new technique for addressing gesture systems. Using this technique, users could direct their input with rhythmic gestures while using sensor strength feedback to find a good location for addressing the system. Experiment 6 studied the effectiveness and usability of this combined technique, as well as the design space for combining the two types of feedback. It found that the combined technique was successful, with users matching 99.9% of rhythmic gestures at 80 mm accuracy from target points. The findings show that gesture systems could successfully use this interaction technique to allow users to address them. Novel design recommendations for using rhythmic gestures and sensor strength feedback were created, informed by the experiment findings.
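As a toy illustration of the sensor strength feedback concept (not the thesis software), the sketch below maps how well a hand is currently sensed onto audio and light feedback intensity, so that active exploration guides the user toward a well-sensed location. The sensing model and sweet-spot parameters are invented.

```python
# Toy model of sensor strength feedback: feedback intensity tracks how well
# the user's hand is sensed, guiding exploration toward a good gesture spot.
def sensor_strength(position, sweet_spot=(0.0, 0.0, 0.5), falloff=0.25):
    """Invented sensing model: strength decays with distance from a sweet spot (m)."""
    dist = sum((p - s) ** 2 for p, s in zip(position, sweet_spot)) ** 0.5
    return max(0.0, 1.0 - dist / falloff)

def feedback_levels(strength):
    """Map sensed strength onto audio volume and light brightness (0..1)."""
    return {"audio_volume": strength, "light_brightness": strength}

print(feedback_levels(sensor_strength((0.05, 0.0, 0.45))))
```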
Abstract:
This thesis focuses on the digital equalization of nonlinear fiber impairments for coherent optical transmission systems. Building from well-known physical models of signal propagation in single-mode optical fibers, novel nonlinear equalization techniques are proposed, numerically assessed and experimentally demonstrated. The structure of the proposed algorithms is strongly driven by optimization of the performance-versus-complexity tradeoff, targeting near-future practical application in commercial real-time transceivers. The work initially focuses on the mitigation of intra-channel nonlinear impairments, relying on the concept of digital backpropagation (DBP) associated with Volterra-based filtering. After a comprehensive analysis of the third-order Volterra kernel, a set of critical simplifications is identified, culminating in the development of reduced-complexity nonlinear equalization algorithms formulated in both the time and frequency domains. The implementation complexity of the proposed techniques is analytically described in terms of computational effort and processing latency, by determining the number of real multiplications per processed sample and the number of serial multiplications, respectively. The equalization performance is numerically and experimentally assessed through bit error rate (BER) measurements. Finally, the problem of inter-channel nonlinear compensation is addressed within the context of 400 Gb/s (400G) superchannels for long-haul and ultra-long-haul transmission. Different superchannel configurations and nonlinear equalization strategies are experimentally assessed, demonstrating that inter-subcarrier nonlinear equalization can provide an enhanced signal reach while requiring only marginal added complexity.
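To make the DBP concept concrete, here is a minimal split-step sketch: the received field is propagated through a "virtual" fiber with inverted signs, alternating dispersion and Kerr-nonlinearity steps. The fiber parameters are assumed typical values, attenuation is omitted, sign conventions vary between references, and this plain split-step form is an illustration rather than the reduced-complexity Volterra-based algorithms developed in the thesis.

```python
# Minimal single-channel digital backpropagation (split-step sketch).
import numpy as np

def dbp(rx_field, fs, length_km, steps, beta2=-21.7e-27, gamma=1.3e-3):
    """Backpropagate a received complex baseband field through a virtual fiber."""
    dz = (length_km * 1e3) / steps                          # step size (m)
    w = 2 * np.pi * np.fft.fftfreq(len(rx_field), 1 / fs)   # angular frequency grid
    lin = np.exp(-0.5j * beta2 * w**2 * dz)                 # dispersion step, sign inverted
    field = np.asarray(rx_field, dtype=complex)
    for _ in range(steps):
        field = np.fft.ifft(np.fft.fft(field) * lin)        # linear step (frequency domain)
        field *= np.exp(-1j * gamma * np.abs(field)**2 * dz)  # inverted Kerr phase rotation
    return field
```

For example, dbp(rx, fs=64e9, length_km=800, steps=160) would backpropagate an 800 km link in 5 km steps; the per-sample multiplication count of such schemes is exactly the complexity metric the thesis analyzes.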
Abstract:
Small particles and their dynamics are of widespread interest due both to their unique properties and to their ubiquity. Here, we investigate several classes of small particles: colloids, polymers, and liposomes. Because their sizes are on the order of microns, all these particles exhibit significant similarity: they are large enough to be visualized in microscopes, but small enough to be significantly influenced by thermal (or Brownian) motion. Further, similar optical microscopy and experimental techniques are commonly employed to investigate all these particles. In this work, we develop single-particle tracking techniques that allow thorough characterization of individual particle dynamics, observing many behaviors that would be overlooked by methods that time- or ensemble-average. The various particle systems are also similar in that the signal-to-noise ratio was frequently a significant concern. In many cases, development of image analysis and particle tracking methods optimized for low signal-to-noise was critical to performing the experimental observations. The simplest particles studied, in terms of their interaction potentials, were chemically homogeneous (though optically anisotropic) hard-sphere colloids. Using these spheres, we explored the comparatively underdeveloped conjunction of translation, rotation, and particle hydrodynamics. Building on this, the dynamics of clusters of spherical colloids were investigated, exploring how shape anisotropy influences translation and rotation. Transitioning away from uniform hard-sphere potentials, the interactions of amphiphilic colloidal particles were explored, observing the effects of hydrophilic and hydrophobic interactions upon pattern assembly and inter-particle dynamics. Interaction potentials were altered in a different fashion by working with suspensions of liposomes, which, while homogeneous, introduce the possibility of deformation. Further degrees of freedom were introduced by observing the interactions of particles, and then polymers, within polymer suspensions or along lipid tubules. Throughout, examination of the trajectories revealed that, while by some measures the averaged behaviors accorded with expectation, closer examination made possible by single-particle tracking often revealed novel and unexpected phenomena.
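A representative single-particle-tracking computation is the time-averaged mean squared displacement (MSD) of a trajectory, which distinguishes Brownian from anomalous dynamics. The sketch below is standard background rather than the thesis code; for pure Brownian motion the MSD grows linearly with lag time.

```python
# Time-averaged mean squared displacement of a single tracked trajectory.
import numpy as np

def msd(trajectory):
    """trajectory: (T, d) array of positions; returns lag times and MSD values."""
    T = len(trajectory)
    lags = np.arange(1, T)
    out = np.empty(len(lags))
    for i, lag in enumerate(lags):
        disp = trajectory[lag:] - trajectory[:-lag]   # all displacements at this lag
        out[i] = (disp ** 2).sum(axis=1).mean()       # average squared displacement
    return lags, out

# Example: a 2D random walk, for which the MSD should grow linearly with lag.
steps = np.random.randn(1000, 2)
lags, m = msd(np.cumsum(steps, axis=0))
print(m[:5])
```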
Abstract:
Herbicide runoff from cropping fields has been identified as a threat to the Great Barrier Reef ecosystem. A field investigation was carried out to monitor the changes in runoff water quality resulting from four different sugarcane cropping systems that included different herbicides and contrasting tillage and trash management practices. These were: (i) Conventional - tillage of beds and inter-rows, with residual herbicides used; (ii) Improved - only the beds tilled (zonal), with reduced residual herbicide use; (iii) Aspirational - minimum tillage (one pass of a single-tine ripper before planting) with trash mulch, no residual herbicides, and a legume intercrop after cane establishment; and (iv) New Farming System (NFS) - minimum tillage as in the Aspirational practice, with a grain legume rotation and a combination of residual and knockdown herbicides. Results suggest that soil and trash management had a larger effect on herbicide losses in runoff than the physico-chemical properties of the herbicides. Improved practices, with 30% lower atrazine application rates than conventional systems, reduced runoff volumes by 40% and atrazine loss by 62%. There was a 2-fold variation in atrazine loads and a >10-fold variation in metribuzin loads in runoff water between reduced tillage systems differing in soil disturbance and surface residue cover from the previous rotation crops, despite the same herbicide application rates. The elevated risk of offsite losses from herbicides was illustrated by the high concentrations of diuron (14 µg L-1) recorded in runoff that occurred >2.5 months after herbicide application in a first ratoon crop. A cropping system employing less persistent non-selective herbicides and an inter-row soybean mulch resulted in no residual herbicide contamination in runoff water, but recorded a 12.3% lower yield compared to the Conventional practice. These findings reveal a trade-off between achieving good water quality with minimal herbicide contamination and maintaining farm profitability with good weed control.
Abstract:
Background: Radiotherapy represents an important treatment option for malignant neoplasms and has advanced considerably over recent decades. Among these advances is stereotactic radiosurgery (SRS), which is characterized by the single application of focused, high radiation doses within a clearly defined period. SRS is of particular importance for the treatment of brain metastases. Objective: The aim of this HTA report is to provide a comprehensive overview of the current literature on the treatment of brain metastases, comparing radiosurgery, as a standalone therapy or in combination with therapeutic alternatives, with respect to medical efficacy, safety and cost-effectiveness, as well as ethical, social and legal aspects. Methods: Relevant publications in German and English were identified through a structured database search and a manual search covering January 2002 to August 2007. The target population comprises patients with one or more brain metastases. Methodological quality was assessed according to criteria of evidence-based medicine (EBM). Results: Of a total of 1,495 hits, 15 studies met the medical inclusion criteria. Overall, study quality is severely limited; apart from two randomized controlled trials (RCTs) and two meta-analyses, only historical cohort studies were identified. The investigation of relevant endpoints is inconsistent. High-quality studies show that adding whole-brain radiotherapy (WBRT) to SRS, and adding SRS to WBRT, is associated with improved local tumor control and functional status. However, only in comparison with WBRT alone does the combination of SRS and WBRT result in improved survival, and only for patients with single brain metastases, RPA class 1 (RPA = recursive partitioning analysis) and certain primary tumors. In both cases, treatment safety shows no clear differences between the intervention groups. Methodologically weaker studies find no clear differences between SRS and WBRT, SRS and neurosurgery (NC), or SRS and hypofractionated radiotherapy (HCSRT). Quality of life is not examined in any study. The database search identified 320 publications for the economic domain, of which five were used for the present health technology assessment (HTA) report. The quality of these publications varies. Regarding the cost-effectiveness of different device alternatives, and assuming equal efficacy, the result depends strongly on the number of patients treated. If the two device alternatives are used solely for SRS, there is evidence that the Gamma Knife can be less expensive; otherwise, it is very likely that the more flexible modified linear accelerator is less expensive. According to one HTA, the total costs of a Gamma Knife and a dedicated linear accelerator are approximately equal, while a modified linear accelerator is cheaper. No relevant publications were identified for ethical, legal and social questions. Discussion: Overall, both the quality and the quantity of the identified studies are severely limited. It is evident, however, that the prognosis of patients with brain metastases remains poor even with the most modern therapeutic options. Sufficiently strong evidence exists only for the investigation of WBRT added to SRS and of SRS added to WBRT. A direct comparison of SRS and WBRT, SRS and NC, or SRS and HCSRT is, by contrast, not possible. The cost-effectiveness of the different device alternatives depends on patient numbers and the indications treated. For dedicated systems operating at full capacity, there is evidence that they can be less expensive; with more flexible use, modified systems appear economically advantageous. These statements rest on the unverified assumption of equal efficacy of the alternatives. The treatment precision of the devices may influence the choice of device. No studies are yet available on newer device alternatives such as the CyberKnife. The economically advantageous high utilization, however, implies a limited number of devices within a given region, which may hinder equitable access to this technology close to patients' homes. Conclusions: The combination of SRS and WBRT is associated with improved local tumor control and functional status compared with either therapy alone. Only for patients with a single metastasis does this translate into a survival benefit. High-quality studies are needed to compare SRS directly with WBRT and NC. Furthermore, quality of life in particular should be considered in future studies. Regarding the type of device used, cost-effectiveness depends markedly on the achievable utilization: high patient numbers favor specialized systems, while at lower patient numbers the flexibility of modified systems is advantageous. Further studies, e.g. on the CyberKnife, are desirable. Overall, the available evidence, particularly for the German health-care system, is very deficient.
Abstract:
In the context of this work, we evaluated a multisensory, noninvasive prototype platform for shake flask cultivations by monitoring three basic parameters (pH, pO2 and biomass). The focus is on the evaluation of the biomass sensor, which is based on backward light scattering. The application spectrum was expanded to four new organisms in addition to E. coli K12 and S. cerevisiae [1]. It could be shown that the sensor is appropriate for a wide range of standard microorganisms, e.g., L. zeae, K. pastoris, A. niger and CHO-K1. The biomass sensor signal could successfully be correlated and calibrated with well-known measurement methods such as OD600, cell dry weight (CDW) and cell concentration. Logarithmic and Bleasdale-Nelder-derived functions were adequate for data fitting. Measurements at low cell concentrations proved critical in terms of signal-to-noise ratio, but the integration of a custom-made light shade in the shake flask improved these measurements significantly. This sensor-based measurement method has high potential to initiate a new generation of online bioprocess monitoring. Metabolic studies in particular will benefit from the multisensory data acquisition. The sensor is already used in lab-scale experiments for shake flask cultivations.
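As an illustration of the reported calibration approach, the sketch below fits a Bleasdale-Nelder-type function to invented sensor-signal/OD600 pairs; scipy's generic least-squares fitter stands in for whatever fitting tool was actually used, and the data and starting values are not from the study.

```python
# Calibrating a biomass sensor signal against OD600 with a
# Bleasdale--Nelder-type model, y = (a + b*x)**(-1/theta)  (illustrative).
import numpy as np
from scipy.optimize import curve_fit

def bleasdale_nelder(x, a, b, theta):
    return (a + b * x) ** (-1.0 / theta)

od600 = np.array([0.1, 0.5, 1.0, 2.0, 4.0, 8.0])          # reference measurements
signal = np.array([7.69, 4.00, 2.50, 1.43, 0.77, 0.40])   # invented sensor values

params, _ = curve_fit(bleasdale_nelder, od600, signal, p0=(0.1, 0.3, 1.0))
print(dict(zip(("a", "b", "theta"), params)))
```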
Abstract:
Part 17: Risk Analysis
Abstract:
Cardiovascular diseases (CVDs) have reached epidemic proportions in the US and worldwide, with serious consequences in terms of human suffering and economic impact. More than one third of American adults suffer from CVDs. The total direct and indirect costs of CVDs are more than $500 billion per year. Therefore, there is an urgent need to develop noninvasive diagnostic methods, to design minimally invasive assist devices, and to develop economical and easy-to-use monitoring systems for cardiovascular diseases. In order to achieve these goals, it is necessary to gain a better understanding of the subsystems that constitute the cardiovascular system. The aorta is one of these subsystems, and its role in cardiovascular functioning has been underestimated. Traditionally, the aorta and its branches have been viewed as resistive conduits connected to an active pump (the left ventricle of the heart). However, this perception fails to explain many observed physiological results. My goal in this thesis is to demonstrate the subtle but important role of the aorta as a system, with a focus on the wave dynamics in the aorta.
The operation of a healthy heart is based on an optimized balance between its pumping characteristics and the hemodynamics of the aorta and vascular branches. The delicate balance between the aorta and heart can be impaired by aging, smoking, or disease. The heart generates pulsatile flow that produces pressure and flow waves as blood enters the compliant aorta. These aortic waves propagate and reflect from reflection sites (bifurcations and tapering). They can act constructively and assist the blood circulation, or they may act destructively, promoting disease or initiating sudden cardiac death. These waves also carry information about diseases of the heart, vascular disease, and the coupling of heart and aorta. In order to elucidate the role of the aorta as a dynamic system, the interplay between the dominant wave dynamic parameters is investigated in this study. These parameters are the heart rate, the aortic compliance (wave speed), and the locations of reflection sites. Both computational and experimental approaches have been used in this research. In some cases, the results are further explained using theoretical models.
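For background, the aortic compliance (wave speed) parameter mentioned above is commonly tied to arterial wall properties through the Moens-Korteweg relation. The thesis does not state this formula here; it is offered only as standard context for why wall stiffening changes the wave dynamics.

```latex
% Moens--Korteweg relation (standard hemodynamics background, not from the thesis):
\[
  c \;=\; \sqrt{\frac{E\,h}{2\,\rho\,R}}
\]
% c: pulse wave speed, E: elastic modulus of the arterial wall, h: wall
% thickness, rho: blood density, R: vessel radius. Wall stiffening (larger E)
% raises c and therefore shifts the timing of wave reflections.
```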
The main findings of this study are as follows: (i) developing a physiologically realistic outflow boundary condition for blood flow modeling in a compliant vasculature; (ii) demonstrating that pulse pressure as a single index cannot predict the true level of pulsatile workload on the left ventricle; (iii) proving that there is an optimum heart rate at which the pulsatile workload of the heart is minimized, and that this optimum heart rate shifts to a higher value as aortic rigidity increases; (iv) introducing a simple bio-inspired device for the correction and optimization of aortic wave reflection that reduces the workload on the heart; (v) deriving a non-dimensional number that can predict the optimum wave dynamic state in a mammalian cardiovascular system; (vi) demonstrating that waves can create a pumping effect in the aorta; (vii) introducing a system parameter and a new medical index, the Intrinsic Frequency, that can be used for noninvasive diagnosis of heart and vascular diseases; and (viii) proposing a new medical hypothesis for sudden cardiac death in young athletes.
Abstract:
Single-cell functional proteomics assays can connect genomic information to biological function through quantitative and multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past 5 years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single cell functional proteomics, focusing on the development of the single cell barcode chips (SCBCs) with applications in fundamental and translational cancer research.
The thesis begins with the microchip designed to simultaneously quantify a panel of secreted, cytoplasmic and membrane proteins from single cells, which is the prototype for subsequent proteomic microchips with more sophisticated designs for preclinical cancer research or clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or defined numbers of cells, within microchambers, each of which is equipped with a large antibody microarray (the barcode), with between a few hundred and ten thousand microchambers included within a single microchip. Functional proteomics assays at single-cell resolution yield unique pieces of information that significantly shape thinking in cancer research. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, follows.
The SCBC is a powerful tool to resolve the functional heterogeneity of cancer cells. It has the capacity to extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We will demonstrate this point through applying the SCBCs to investigate three isogenic cell lines of glioblastoma multiforme (GBM).
The cancer cell population is highly heterogeneous, with high-amplitude fluctuations at the single-cell level, which in turn grant robustness to the entire population. The concept of a stable population existing in the presence of random fluctuations is reminiscent of many physical systems that are successfully understood using statistical physics. Thus, tools derived from that field can plausibly be applied, using fluctuations to determine the nature of signaling networks. In the second part of the thesis, we focus on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia: single-cell proteomics assays coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics yield detailed and surprising predictions, which were found to be correct in both cell line and primary tumor models.
The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study GBM cancer cell resistance to molecular targeted therapy. Physical approaches to anticipate therapy resistance and to identify effective therapy combinations are discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to perturbation by a targeted inhibitor. Strongly coupled protein-protein interactions constitute most signaling cascades; a physical analogy of such a system is the strongly coupled atom-atom interactions in a crystal lattice. Just as atomic interactions can be decomposed into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can be achieved by diagonalizing protein-protein correlation or covariance matrices, decomposing the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e., independent signaling modes). By doing so, two independent signaling modes - one associated with mTOR signaling and a second associated with ERK/Src signaling - have been resolved, which in turn allows us to anticipate resistance and to design combination therapies that are effective, as well as to identify those therapies and therapy combinations that will be ineffective. We validated our predictions in mouse tumor models, and all predictions were borne out.
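The mode decomposition described above is, in essence, an eigendecomposition of the protein-protein covariance matrix. The sketch below uses illustrative synthetic data (not the thesis analysis) to show how a planted pairwise correlation separates into an independent linear combination of proteins.

```python
# Diagonalizing a protein-protein covariance matrix to extract independent
# "signaling modes" (illustrative synthetic data).
import numpy as np

rng = np.random.default_rng(0)
# Illustrative single-cell data: 500 cells x 6 phosphoproteins.
data = rng.normal(size=(500, 6))
data[:, 1] += 0.8 * data[:, 0]            # plant one correlated protein pair

cov = np.cov(data, rowvar=False)          # 6 x 6 protein-protein covariance
eigvals, eigvecs = np.linalg.eigh(cov)    # diagonalization

# Eigenvectors with the largest eigenvalues define the dominant independent
# modes; each column gives the weight of every protein in that mode.
order = np.argsort(eigvals)[::-1]
print("mode variances:", eigvals[order][:2])
print("leading mode weights:", eigvecs[:, order[0]])
```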
In the last part, some preliminary results on the clinical translation of single-cell proteomics chips are presented. The successful demonstration of our work on human-derived xenografts provides the rationale to extend our current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could potentially yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians within a clinically relevant time scale. The technical challenges of clinical translation are presented, along with our solutions to address them. A clinical case study then follows, in which preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor are presented to demonstrate the general protocol and workflow of the proposed clinical studies.