970 results for synchrotron-based techniques


Relevance: 100.00%

Publisher:

Abstract:

Reuse distance analysis, the prediction of how many distinct memory addresses will be accessed between two accesses to a given address, has been established as a useful technique in profile-based compiler optimization, but the cost of collecting the memory reuse profile has been prohibitive for some applications. In this report, we propose using the hardware monitoring facilities available in existing CPUs to gather an approximate reuse distance profile. The difficulties associated with this monitoring technique are discussed, most importantly that there is no obvious link between the reuse profile produced by hardware monitoring and the actual reuse behavior. Potential applications which would be made viable by a reliable hardware-based reuse distance analysis are identified.
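For context, reuse distance is straightforward to compute exactly from a full address trace; the report's point is that collecting such a trace is what makes profiling expensive. A minimal sketch of the exact computation (a hypothetical illustration, not the report's tooling; a hardware-sampled profile would approximate the histogram this produces):

    from collections import OrderedDict

    def reuse_distances(trace):
        """For each access, yield the number of distinct addresses
        touched since the previous access to the same address
        (None for first-time accesses)."""
        lru = OrderedDict()  # addresses from least to most recently used
        for addr in trace:
            if addr in lru:
                keys = list(lru)
                # Distinct addresses accessed since the last access to addr.
                yield len(keys) - 1 - keys.index(addr)
                lru.move_to_end(addr)
            else:
                lru[addr] = True
                yield None

    # 'a' is re-accessed after two distinct addresses (b, c), so its
    # reuse distance is 2.
    print(list(reuse_distances(['a', 'b', 'c', 'a'])))  # [None, None, None, 2]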

Relevance: 100.00%

Publisher:

Abstract:

The binding of the immune inhibitory receptor Programmed Death 1 (PD-1) on T cells to its ligand PD-L1 has been implicated as a major contributor to tumor-induced immune suppression. Clinical trials of PD-L1 blockade have proven effective in unleashing therapeutic anti-tumor immune responses in a subset of patients with advanced melanoma, yet current response rates are low for reasons that remain unclear. Hypothesizing that the PD-1/PD-L1 pathway regulates T cell surveillance within the tumor microenvironment, we employed intravital microscopy to investigate the in vivo impact of PD-L1 blocking antibody upon tumor-associated immune cell migration. However, current analytical methods for intravital dynamic microscopy data lack the ability to identify the cellular targets of T cell interactions in vivo, a crucial means for discovering which interactions are modulated by therapeutic intervention. By developing novel imaging techniques that allowed us to better analyze tumor progression and T cell dynamics in the microenvironment, we were able to explore the impact of PD-L1 blockade upon the migratory properties of tumor-associated immune cells, including T cells and antigen presenting cells, in lung tumor progression. Our results demonstrate that early changes in tumor morphology may be indicative of responsiveness to anti-PD-L1 therapy. We show that immune cells in the tumor microenvironment, as well as tumors themselves, express PD-L1, but that immune phenotype alone is not a predictive marker of effective anti-tumor responses. Through a novel method for quantifying T cell interactions, we show that T cells are largely engaged in interactions with dendritic cells in the tumor microenvironment. Additionally, we show that during PD-L1 blockade, non-activated T cells are recruited in greater numbers into the tumor microenvironment and engage preferentially with dendritic cells. We further show that during PD-L1 blockade, activated T cells engage in more confined, immune synapse-like interactions with dendritic cells, as opposed to the more dynamic, kinapse-like interactions observed when PD-L1 is free to bind its receptor. By advancing the contextual analysis of anti-tumor immune surveillance in vivo, this study implicates the interaction between T cells and tumor-associated dendritic cells as a possible modulator in targeting PD-L1 for anti-tumor immunotherapy.
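The thesis's interaction-quantification method is not detailed in this abstract; one standard motility metric that separates confined, synapse-like behavior from motile, kinapse-like behavior is the confinement ratio of a cell track. A minimal sketch under that assumption:

    import math

    def confinement_ratio(track):
        """Net displacement divided by total path length for a 2D track
        given as a list of (x, y) positions; values near 0 indicate
        confined (synapse-like) motion, values near 1 directed motion."""
        path = sum(math.dist(track[i], track[i + 1]) for i in range(len(track) - 1))
        net = math.dist(track[0], track[-1])
        return net / path if path > 0 else 0.0

    # A cell oscillating around one spot scores low (about 0.33 here)...
    print(confinement_ratio([(0, 0), (1, 0), (0, 0), (1, 0)]))
    # ...while a cell translocating steadily scores 1.0.
    print(confinement_ratio([(0, 0), (1, 0), (2, 0), (3, 0)]))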

Relevance: 100.00%

Publisher:

Abstract:

Road traffic is one of the main causes of noise in our cities. The noise generated by vehicles is due not only to the engine; there are several other noise sources, among which tyre/road noise stands out. Coherence techniques and array-based techniques have been used in several studies to locate the causes of noise and identify its main sources. Nevertheless, the application of these techniques in the automotive sector is not common in the existing literature. This thesis starts from the premise that these measurement techniques can be used on cars, and demonstrates their feasibility and their value for evaluating noise sources under two different conditions: when the car is stationary and when it is in motion. Selective Intensity is chosen as the coherence technique, and is used to evaluate the coherence between the noise reaching the driver's ears and the intensity radiated by different points of the engine. For noise source localization, array-based techniques give the best results. Statistically Optimized Near-field Acoustical Holography (SONAH) is the technique chosen for locating and characterizing engine noise sources at low frequency, whereas Beamforming is the technique selected for the medium-to-high frequency range and for evaluating noise sources while the car is in motion. The proposed techniques can not only be used in real measurements, but also provide abundant information and great versatility for characterizing noise sources.
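As an illustration of the localization principle behind the array techniques mentioned above, here is a minimal narrowband delay-and-sum beamformer sketch (a generic textbook formulation, not the thesis's SONAH or Beamforming processing chain; the array geometry and signals are hypothetical):

    import numpy as np

    def delay_and_sum_map(signals, mic_xy, grid_xy, fs, f0, c=343.0):
        """Narrowband delay-and-sum beamforming power map.
        signals: (n_mics, n_samples) microphone signals
        mic_xy:  (n_mics, 2) microphone positions [m]
        grid_xy: (n_points, 2) candidate source positions [m]
        fs: sample rate [Hz]; f0: analysis frequency [Hz] (f0 < fs/2);
        c: speed of sound [m/s]."""
        # Narrowband spectra at f0 (one DFT bin per microphone).
        n = signals.shape[1]
        k = int(round(f0 * n / fs))
        spectra = np.fft.rfft(signals, axis=1)[:, k]          # (n_mics,)
        # Steering vectors: phase compensation for the propagation delays
        # from each candidate point to each microphone.
        dists = np.linalg.norm(grid_xy[:, None, :] - mic_xy[None, :, :], axis=2)
        steer = np.exp(2j * np.pi * f0 * dists / c)           # (n_points, n_mics)
        # Coherent sum over microphones; peaks mark likely source locations.
        return np.abs(steer @ spectra) ** 2 / len(mic_xy) ** 2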

Relevance: 100.00%

Publisher:

Abstract:

The influence of the sample introduction system on the signals obtained with different tin compounds in inductively coupled plasma (ICP) based techniques, i.e., ICP atomic emission spectrometry (ICP–AES) and ICP mass spectrometry (ICP–MS), has been studied. Signals for test solutions prepared from four different tin compounds (i.e., tin tetrachloride, monobutyltin, dibutyltin and di-tert-butyltin) in different solvents (methanol 0.8% (w/w), i-propanol 0.8% (w/w) and various acid matrices) have been measured by ICP–AES and ICP–MS. The results demonstrate a noticeable influence of the volatility of the tin compounds on the signals measured with both techniques. Thus, in agreement with compound volatility, the highest signals are obtained for tin tetrachloride, followed by di-tert-butyltin/monobutyltin and dibutyltin. The sample introduction system exerts an important effect on the amount of solution loading the plasma and, hence, on the relative signals afforded by the tin compounds in ICP-based techniques. Thus, when working with a pneumatic concentric nebulizer, the use of spray chambers affording high solvent transport efficiency to the plasma (such as cyclonic and single-pass designs) or high spray chamber temperatures is recommended to minimize the influence of the tin chemical compound. Nevertheless, even when the conventional pneumatic nebulizer is coupled to the best spray chamber design (i.e., a single-pass spray chamber), the signals obtained for di-tert-butyltin/monobutyltin and dibutyltin are still around 10% and 30% lower, respectively, than the corresponding signal for tin tetrachloride. When operating with a pneumatic microconcentric nebulizer coupled to a cinnabar spray chamber thermostated at 50 °C, all the organotin compounds studied provide similar emission signals, although about 60% lower than those obtained for tin tetrachloride. The use of an ultrasonic nebulizer coupled to a desolvation device provides the largest differences in the emission signals among all tested systems.

Relevance: 100.00%

Publisher:

Abstract:

The elemental analysis of Spanish palm dates by inductively coupled plasma atomic emission spectrometry and inductively coupled plasma mass spectrometry is reported for the first time. To complete the information about the mineral composition of the samples, C, H, and N are determined by elemental analysis. Dates from Israel, Tunisia, Saudi Arabia, Algeria and Iran have also been analyzed. The elemental composition has been used in multivariate statistical analysis to discriminate the dates according to their geographical origin. A total of 23 elements (As, Ba, C, Ca, Cd, Co, Cr, Cu, Fe, H, In, K, Li, Mg, Mn, N, Na, Ni, Pb, Se, Sr, V, and Zn) at concentrations from major to ultra-trace levels have been determined in 13 date samples (flesh and seeds). A careful inspection of the results indicates that the Spanish samples show higher concentrations of Cd, Co, Cr, and Ni than the remaining ones. Multivariate statistical analysis of the results, both for flesh and for seed, indicates that the proposed approach can be successfully applied to discriminate the Spanish date samples from the rest of the samples tested.
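The abstract does not name the exact multivariate method; a typical workflow for this kind of origin discrimination (standardize the element concentrations, then inspect principal components or fit a supervised discriminant) might look like the following sketch, with stand-in data in place of the measured concentrations:

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Hypothetical data: rows = date samples, columns = the 23 measured
    # elements (As, Ba, C, ...); labels = geographical origin.
    rng = np.random.default_rng(0)
    X = rng.lognormal(size=(13, 23))          # stand-in for concentrations
    y = ["Spain"] * 5 + ["Tunisia"] * 4 + ["Israel"] * 4

    X_std = StandardScaler().fit_transform(X)  # element scales differ wildly

    # Unsupervised view: project onto the first two principal components.
    scores = PCA(n_components=2).fit_transform(X_std)

    # Supervised discrimination by origin (toy data; expect a collinearity
    # warning with more features than samples).
    lda = LinearDiscriminantAnalysis().fit(X_std, y)
    print(lda.score(X_std, y))  # resubstitution accuracy (optimistic)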

Relevance: 100.00%

Publisher:

Abstract:

In this chapter we present the relevant mathematical background to address two well-defined signal and image processing problems: the problem of structured noise filtering and the problem of interpolation of missing data. The former is addressed by recourse to techniques based on oblique projections, whilst the latter, which can be considered equivalent to impulsive noise filtering, is tackled by appropriate interpolation methods.
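For reference, a standard construction of the oblique projector used in structured noise filtering (general background; the chapter's own notation may differ): if the columns of H span the signal subspace and the columns of S span the structured-noise subspace, the projector onto range(H) along range(S) can be written as

    E_{HS} = H \left( H^{\top} P_S^{\perp} H \right)^{-1} H^{\top} P_S^{\perp},
    \qquad
    P_S^{\perp} = I - S \left( S^{\top} S \right)^{-1} S^{\top},

so that E_{HS} H = H and E_{HS} S = 0: applying E_{HS} to an observation keeps the component lying in the signal subspace and annihilates the structured noise.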

Relevance: 100.00%

Publisher:

Abstract:

Spectral CT using a photon counting x-ray detector (PCXD) shows great potential for measuring material composition based on energy dependent x-ray attenuation. Spectral CT is especially suited for imaging with K-edge contrast agents to address the otherwise limited contrast in soft tissues. We have developed a micro-CT system based on a PCXD. This system enables full spectrum CT, in which the energy thresholds of the PCXD are swept to sample the full energy spectrum for each detector element and projection angle. Measurements provided by the PCXD, however, are distorted due to undesirable physical effects in the detector and are very noisy due to photon starvation. In this work, we propose two methods based on machine learning to address the spectral distortion issue and to improve the material decomposition. The first approach is to model the distortions using an artificial neural network (ANN) and compensate for them in a statistical reconstruction. The second approach is to correct for the distortion directly in the projections. Both techniques can be implemented as a calibration process in which the neural network is trained on data from 3D printed phantoms to learn the distortion model or the correction model for the spectral distortion. This replaces the need for the synchrotron measurements required by the conventional technique to derive the distortion model parametrically, which can be costly and time consuming. The results demonstrate the experimental feasibility and potential advantages of ANN-based distortion modeling and correction for more accurate K-edge imaging with a PCXD. Given the computational efficiency with which the ANN can be applied to projection data, the proposed scheme can be readily integrated into existing CT reconstruction pipelines.
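A minimal sketch of the projection-domain correction idea, assuming hypothetical calibration arrays (measured distorted spectra paired with known phantom spectra); the work's actual network architecture and training setup are not specified in this abstract:

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Hypothetical calibration data from 3D printed phantoms:
    # each row is one detector measurement across energy bins.
    rng = np.random.default_rng(1)
    ideal = rng.uniform(size=(5000, 32))            # known phantom spectra
    distorted = ideal @ rng.uniform(size=(32, 32))  # stand-in detector response
    distorted += rng.normal(scale=0.05, size=distorted.shape)  # photon noise

    # Learn the inverse mapping: distorted measurement -> corrected spectrum.
    ann = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
    ann.fit(distorted, ideal)

    # At scan time, correct each projection before reconstruction.
    corrected = ann.predict(distorted[:10])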

Relevance: 100.00%

Publisher:

Abstract:

Secure Multi-party Computation (MPC) enables a set of parties to collaboratively compute, using cryptographic protocols, a function over their private data in such a way that the participants do not see each other's data; they see only the final output. Typical MPC examples include statistical computations over joint private data, private set intersection, and auctions. While these applications are examples of monolithic MPC, richer MPC applications move between "normal" (i.e., per-party local) and "secure" (i.e., joint, multi-party secure) modes repeatedly, resulting overall in mixed-mode computations. For example, we might use MPC to implement the role of the dealer in a game of mental poker -- the game will be divided into rounds of local decision-making (e.g., bidding) and joint interaction (e.g., dealing). Mixed-mode computations are also used to improve performance over monolithic secure computations. Starting with the Fairplay project, several MPC frameworks have been proposed in the last decade to help programmers write MPC applications in a high-level language, while the toolchain manages the low-level details. However, these frameworks are either not expressive enough to allow writing mixed-mode applications, or they lack formal specification and reasoning capabilities, thereby diminishing the parties' trust in such tools and in the programs written using them. Furthermore, none of the frameworks provides a verified toolchain to run the MPC programs, leaving open the potential for security holes that can compromise the privacy of the parties' data. This dissertation presents language-based techniques to make MPC more practical and trustworthy. First, it presents the design and implementation of a new MPC domain-specific language, called Wysteria, for writing rich mixed-mode MPC applications. Wysteria provides several benefits over previous languages, including a conceptual single thread of control, generic support for more than two parties, high-level abstractions for secret shares, and a fully formalized type system and operational semantics. Using Wysteria, we have implemented several MPC applications, including, for the first time, a card dealing application. The dissertation next presents Wys*, an embedding of Wysteria in F*, a full-featured verification-oriented programming language. Wys* improves on Wysteria along three lines: (a) it enables programmers to formally verify the correctness and security properties of their programs -- as far as we know, Wys* is the first language to provide verification capabilities for MPC programs; (b) it provides a partially verified toolchain to run MPC programs; and finally (c) it enables MPC programs to use, with no extra effort, standard language constructs from the host language F*, thereby making it more usable and scalable. Finally, the dissertation develops static analyses that help optimize monolithic MPC programs into mixed-mode MPC programs, while providing privacy guarantees similar to those of the monolithic versions.
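As generic background for the "secure mode" such languages compile to (not Wysteria code; Wysteria's actual backend and abstractions differ), additive secret sharing over a prime field illustrates how a joint value can be computed without any party seeing the others' inputs:

    import secrets

    P = 2**61 - 1  # prime modulus for the share arithmetic

    def share(x, n=3):
        """Split secret x into n additive shares that sum to x mod P."""
        parts = [secrets.randbelow(P) for _ in range(n - 1)]
        parts.append((x - sum(parts)) % P)
        return parts

    def reveal(shares):
        return sum(shares) % P

    # Each party contributes a private input; shares of a sum can be
    # added locally, so the joint sum is computed without revealing
    # either input until the final output is opened.
    a, b = share(20), share(22)
    joint = [(sa + sb) % P for sa, sb in zip(a, b)]
    print(reveal(joint))  # 42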

Relevance: 100.00%

Publisher:

Abstract:

Polymer solar cells are promising in that they are inexpensive to produce and, owing to their mechanical flexibility, have the potential for use in applications not possible for more traditional types of solar cells. The performance of polymer solar cells depends strongly on the distribution of electron donor and acceptor material in the active layer. Understanding the connection between morphology and performance, as well as how to control the morphology, is therefore of great importance. Furthermore, improving the lifetime of polymer solar cells has become at least as important as improving the efficiency. In this thesis, the relation between morphology and solar cell performance, as well as the material stability, is studied for blend films of the thiophene-quinoxaline copolymer TQ1 and the fullerene derivatives PCBM and PC70BM. Atomic force microscopy (AFM) and scanning transmission X-ray microscopy (STXM) are used to investigate the lateral morphology, secondary ion mass spectrometry (SIMS) to measure the vertical morphology, and near-edge X-ray absorption fine structure (NEXAFS) spectroscopy to determine the surface composition. Lateral phase-separated domains are observed whose size is correlated with the solar cell performance, while the observed TQ1 surface enrichment does not affect the performance. Changes to the unoccupied molecular orbitals as a result of illumination in ambient air are observed by NEXAFS spectroscopy for PCBM, but not for TQ1. The NEXAFS spectrum of PCBM in a blend with TQ1 changes more than that of pristine PCBM. Solar cells in which the active layer has been illuminated in air prior to the deposition of the top electrode exhibit greatly reduced electrical performance. The valence band and absorption spectrum of TQ1 are affected by illumination in air, but the effects are not large enough to account for the losses in solar cell performance, which are mainly attributed to PCBM degradation at the active layer surface.

Relevance: 100.00%

Publisher:

Abstract:

Texture-based techniques for the visualisation of unsteady vector fields have been applied to the visualisation of a finite volume model for variably saturated groundwater flow through porous media. This model has been developed by staff in the School of Mathematical Sciences at QUT for the study of salt water intrusion into coastal aquifers. This presentation discusses the implementation and effectiveness of the Image Based Flow Visualization (IBFV) algorithm in the context of visualising the groundwater simulation outputs.
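The core of IBFV is a short loop: warp the running texture along the flow, then blend in a small fraction of fresh noise. A minimal CPU sketch on a regular grid (the technique is typically implemented on the GPU via mesh distortion; this nearest-neighbour version is only illustrative):

    import numpy as np

    def ibfv_frames(u, v, steps=100, alpha=0.1, seed=0):
        """Minimal IBFV on a regular grid: advect the running image by the
        vector field (u, v), then blend in a fraction alpha of fresh noise.
        u, v: (H, W) velocity components in pixels per frame."""
        rng = np.random.default_rng(seed)
        h, w = u.shape
        img = rng.uniform(size=(h, w))
        yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
        # Semi-Lagrangian backtrace: each pixel samples its upstream
        # neighbour (nearest-neighbour lookup, clamped at the borders).
        src_y = np.clip(np.rint(yy - v), 0, h - 1).astype(int)
        src_x = np.clip(np.rint(xx - u), 0, w - 1).astype(int)
        for _ in range(steps):
            img = (1 - alpha) * img[src_y, src_x] + alpha * rng.uniform(size=(h, w))
            yield img  # streak patterns aligned with the flow emerge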

Relevance: 100.00%

Publisher:

Abstract:

Stereo vision is a method of depth perception, in which depth information is inferred from two (or more) images of a scene, taken from different perspectives. Practical applications for stereo vision include aerial photogrammetry, autonomous vehicle guidance, robotics and industrial automation. The initial motivation behind this work was to produce a stereo vision sensor for mining automation applications. For such applications, the input stereo images would consist of close range scenes of rocks. A fundamental problem faced by matching algorithms is the matching or correspondence problem. This problem involves locating corresponding points or features in two images. For this application, speed, reliability, and the ability to produce a dense depth map are of foremost importance. This work implemented a number of area-based matching algorithms to assess their suitability for this application. Area-based techniques were investigated because of their potential to yield dense depth maps, their amenability to fast hardware implementation, and their suitability to textured scenes such as rocks. In addition, two non-parametric transforms, the rank and the census, were compared. Both the rank and the census transforms were found to result in improved reliability of matching in the presence of radiometric distortion, a significant result since radiometric distortion is a problem which commonly arises in practice. In addition, they have low computational complexity, making them amenable to fast hardware implementation. Therefore, it was decided that matching algorithms using these transforms would be the subject of the remainder of the thesis. An analytic expression for the process of matching using the rank transform was derived from first principles. This work resulted in a number of important contributions. Firstly, the derivation process resulted in one constraint which must be satisfied for a correct match. This was termed the rank constraint. The theoretical derivation of this constraint contrasts with the existing matching constraints, which have little theoretical basis. Experimental work with actual and contrived stereo pairs has shown that the new constraint is capable of resolving ambiguous matches, thereby improving match reliability. Secondly, a novel matching algorithm incorporating the rank constraint has been proposed. This algorithm was tested using a number of stereo pairs. In all cases, the modified algorithm consistently resulted in an increased proportion of correct matches. Finally, the rank constraint was used to devise a new method for identifying regions of an image where the rank transform, and hence matching, are more susceptible to noise. The rank constraint was also incorporated into a new hybrid matching algorithm, where it was combined with a number of other ideas. These included the use of an image pyramid for match prediction and a method of edge localisation to improve match accuracy in the vicinity of edges. Experimental results obtained from the new algorithm showed that the algorithm is able to remove a large proportion of invalid matches, and improve match accuracy.
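For illustration, a minimal census transform and Hamming-distance matching cost (the standard textbook formulation; the thesis's rank-constraint algorithm builds on top of primitives like these):

    import numpy as np

    def census(img, r=1):
        """Census transform: encode each pixel as a bit string comparing
        it with every neighbour in a (2r+1)x(2r+1) window (r <= 3 so the
        result fits in a uint64)."""
        h, w = img.shape
        out = np.zeros((h, w), dtype=np.uint64)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                if dy == 0 and dx == 0:
                    continue
                shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
                out = (out << np.uint64(1)) | (shifted < img).astype(np.uint64)
        return out

    def hamming(a, b):
        """Per-pixel Hamming distance between two census images."""
        x = a ^ b
        d = np.zeros(x.shape, dtype=np.uint64)
        while x.any():
            d += x & np.uint64(1)
            x >>= np.uint64(1)
        return d

    # Matching cost for one candidate disparity d: shift the right census
    # image by d columns and take the Hamming distance against the left;
    # the disparity with the lowest cost per pixel wins.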

Relevance: 100.00%

Publisher:

Abstract:

Vector field visualisation is one of the classic sub-fields of scientific data visualisation. The need for effective visualisation of flow data arises in many scientific domains, ranging from medical sciences to aerodynamics. Though there has been much research on the topic, the question of how to communicate flow information effectively in real, practical situations is still largely an unsolved problem. This is particularly true for complex 3D flows. In this presentation we give a brief introduction and background to vector field visualisation and comment on the effectiveness of the most common solutions. We then give some examples of current development in texture-based techniques, and give practical examples of their use in CFD research and hydrodynamic applications.

Relevance: 100.00%

Publisher:

Abstract:

Detect and Avoid (DAA) technology is widely acknowledged as a critical enabler for unsegregated Remotely Piloted Aircraft (RPA) operations, particularly Beyond Visual Line of Sight (BVLOS). Image-based DAA, in the visible spectrum, is a promising technological option for addressing the challenges DAA presents. Two impediments to progress for this approach are the scarcity of video footage available to train and test algorithms, and the lack of testing regimes and specifications that facilitate repeatable, statistically valid performance assessment. This paper includes three key contributions undertaken to address these impediments. In the first instance, we detail our progress towards the creation of a large hybrid collision and near-collision encounter database. Second, we explore the suitability of techniques employed by the biometric research community (Speaker Verification and Language Identification) for DAA performance optimisation and assessment. These techniques include Detection Error Trade-off (DET) curves, Equal Error Rates (EER), and the Detection Cost Function (DCF). Finally, the hybrid database and the speech-based techniques are combined and employed in the assessment of a contemporary, image-based DAA system. This system includes stabilisation, morphological filtering and a Hidden Markov Model (HMM) temporal filter.
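As a sketch of how the EER metric works on detection scores (a generic formulation under hypothetical score distributions, not the paper's evaluation code):

    import numpy as np

    def equal_error_rate(target_scores, nontarget_scores):
        """EER: the error rate at the threshold where the miss rate
        (targets rejected) equals the false-alarm rate (non-targets
        accepted). Higher score means 'more likely a target'."""
        thresholds = np.sort(np.concatenate([target_scores, nontarget_scores]))
        miss = np.array([(target_scores < t).mean() for t in thresholds])
        fa = np.array([(nontarget_scores >= t).mean() for t in thresholds])
        i = np.argmin(np.abs(miss - fa))  # closest crossing of the two curves
        return (miss[i] + fa[i]) / 2

    # Synthetic detection scores: frames containing an aircraft score
    # higher on average than clutter-only frames.
    rng = np.random.default_rng(2)
    tgt = rng.normal(2.0, 1.0, 1000)   # hypothetical collision-course scores
    non = rng.normal(0.0, 1.0, 10000)  # hypothetical clutter scores
    print(equal_error_rate(tgt, non))  # ~0.16 for unit variance, 2-sigma gap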