971 results for semi-implicit projection method


Relevance: 30.00%

Abstract:

Today we know that ordinary matter accounts for only a small fraction of the total mass content of the Universe. The hypothesis of the existence of Dark Matter, a new kind of matter that interacts only gravitationally and, perhaps, through the weak force, has been supported by numerous pieces of evidence on both galactic and cosmological scales. Efforts devoted to the search for so-called WIMPs (Weakly Interacting Massive Particles), the generic name given to Dark Matter particles, have multiplied over recent years. The XENON1T experiment, currently under construction at the Laboratori Nazionali del Gran Sasso (LNGS) and expected to start taking data by the end of 2015, will mark a significant step forward in the direct search for Dark Matter, which is based on the detection of elastic collisions on target nuclei. XENON1T is the current phase of the XENON project, which has already carried out the XENON10 (2005) and XENON100 (2008, still running) experiments and also foresees a further upgrade, called XENONnT. The XENON1T detector uses about 3 tonnes of liquid xenon (LXe) and is based on a dual-phase Time Projection Chamber (TPC). Detailed Monte Carlo simulations of the detector geometry, together with dedicated measurements of material radioactivity and estimates of the purity of the xenon used, have made it possible to accurately predict the expected background. In this thesis we present the study of the expected sensitivity of XENON1T, carried out with the statistical method known as the Profile Likelihood (PL) Ratio, which, within a frequentist approach, allows a proper treatment of systematic uncertainties. As a first step, the sensitivity was estimated using the simplified Likelihood Ratio method, which does not account for any systematics. This made it possible to assess the impact of the main systematic uncertainty for XENON1T, namely that on the scintillation light emission of xenon for low-energy nuclear recoils. The final results obtained with the PL method indicate that XENON1T will be able to significantly improve the current WIMP exclusion limits; the maximum sensitivity reaches a cross section σ = 1.2·10⁻⁴⁷ cm² for a WIMP mass of 50 GeV/c² and a nominal exposure of 2 tonne·years. These results are in line with XENON1T's ambitious goal of lowering the current limits on the WIMP cross section σ by two orders of magnitude. With such performance, and considering 1 tonne of LXe as fiducial mass, XENON1T will be able to surpass the current limits (LUX experiment, 2013) after only 5 days of data taking.
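The simplified likelihood-ratio step described above can be illustrated for a one-bin Poisson counting experiment; this is a minimal Python sketch with invented numbers, not the analysis code of the thesis (the full profile likelihood additionally maximizes over nuisance parameters describing the systematics):

```python
import math

def nll(mu, s, b, n):
    """Negative log-likelihood for a Poisson counting experiment:
    expected events = mu*s + b, observed = n (constant terms dropped)."""
    lam = mu * s + b
    return lam - n * math.log(lam)

def q_mu(mu, s, b, n):
    """Likelihood-ratio test statistic comparing signal strength mu
    to the best-fit strength mu_hat (no systematics, as in the
    simplified method)."""
    mu_hat = max(0.0, (n - b) / s)   # analytic maximum for this model
    return 2.0 * (nll(mu, s, b, n) - nll(mu_hat, s, b, n))

# Background-only observation (n = b): larger signals are more disfavoured.
assert q_mu(0.0, s=10.0, b=5.0, n=5) == 0.0
assert q_mu(1.0, s=10.0, b=5.0, n=5) > q_mu(0.5, s=10.0, b=5.0, n=5)
```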

Relevance: 30.00%

Abstract:

The Scilla rock avalanche occurred on 6 February 1783 along the coast of the Calabria region (southern Italy), close to the Messina Strait. It was triggered by a mainshock of the Terremoto delle Calabrie seismic sequence, and it induced a tsunami wave responsible for more than 1500 casualties along the neighboring Marina Grande beach. The main goal of this work is the application of semi-analytical and numerical models to simulate this event. The first one is a MATLAB code expressly created for this work that solves the equations of motion for sliding particles on a two-dimensional surface through a fourth-order Runge-Kutta method. The second one is a code developed by the Tsunami Research Team of the Department of Physics and Astronomy (DIFA) of the Bologna University that describes a slide as a chain of blocks able to interact while sliding down over a slope and adopts a Lagrangian point of view. A broad description of landslide phenomena, and in particular of landslides induced by earthquakes and with tsunamigenic potential, is proposed in the first part of the work. Subsequently, the physical and mathematical background is presented; in particular, a detailed study on the discretization of derivatives is provided. Later on, a description of the dynamics of a point mass sliding on a surface is proposed, together with several applications of numerical and analytical models over ideal topographies. In the last part, the dynamics of points sliding on a surface and interacting with each other is proposed. Similarly, different applications on an ideal topography are shown. Finally, the applications to the 1783 Scilla event are shown and discussed.
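The integration scheme named above is the classical fourth-order Runge-Kutta method. A minimal Python sketch follows (the thesis code itself is in MATLAB; the frictionless incline here is an illustrative toy problem with a known analytic solution, not the Scilla topography):

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k1)])
    k3 = f(t + h/2, [yi + h/2*ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h*ki for yi, ki in zip(y, k3)])
    return [yi + h/6*(a + 2*b + 2*c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

g, theta = 9.81, math.radians(30.0)    # frictionless 30-degree incline

def slide(t, state):
    s, v = state                       # position and speed along the slope
    return [v, g * math.sin(theta)]

state, t, h = [0.0, 0.0], 0.0, 0.01
for _ in range(100):                   # integrate to t = 1 s
    state = rk4_step(slide, t, state, h)
    t += h

# Analytic result at t = 1 s: s = 0.5 * g * sin(theta) * t^2
assert abs(state[0] - 0.5 * g * math.sin(theta)) < 1e-9
```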

Relevance: 30.00%

Abstract:

By measuring the total crack lengths (TCL) along a gunshot wound channel simulated in ordnance gelatine, one can calculate the energy transferred by a projectile to the surrounding tissue along its course. Visual quantitative TCL analysis of cut slices in ordnance gelatine blocks is unreliable due to the poor visibility of cracks and the likely introduction of secondary cracks resulting from slicing. Furthermore, gelatine TCL patterns are difficult to preserve because of the deterioration of the internal structures of gelatine with age and the tendency of gelatine to decompose. By contrast, using computed tomography (CT) software for TCL analysis in gelatine, cracks on 1-cm thick slices can be easily detected, measured and preserved. In this experiment, CT TCL analyses were applied to gunshots fired into gelatine blocks by three different ammunition types (9-mm Luger full metal jacket, .44 Remington Magnum semi-jacketed hollow point and 7.62 × 51 RWS Cone-Point). The resulting TCL curves reflected the three projectiles' capacity to transfer energy to the surrounding tissue very accurately and showed clearly the typical energy transfer differences. We believe that CT is a useful tool in evaluating gunshot wound profiles using the TCL method and is indeed superior to conventional methods applying physical slicing of the gelatine.
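The underlying assumption — energy transferred in each gelatine slice taken as proportional to the total crack length measured there — can be sketched as follows; the calibration constant `k` is hypothetical, not a value from the study:

```python
def energy_from_tcl(tcl_cm, k=0.65):
    """Sketch of the TCL wound-profile principle: energy deposited in a
    slice is taken proportional to the total crack length measured there.
    k (J per cm of crack) is a hypothetical calibration constant."""
    return k * tcl_cm

# TCL per 1-cm slice along a wound channel; the peak marks the slice
# with the largest local energy transfer.
profile = [energy_from_tcl(t) for t in (2.0, 10.0, 25.0, 12.0, 3.0)]
assert profile.index(max(profile)) == 2
```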

Relevance: 30.00%

Abstract:

The calculation of projection structures (PSs) from Protein Data Bank (PDB)-coordinate files of membrane proteins is not well-established. Reports on such attempts exist but are rare. In addition, the different procedures are barely described and thus difficult if not impossible to reproduce. Here we present a simple, fast and well-documented method for the calculation and visualization of PSs from PDB-coordinate files of membrane proteins: the projection structure visualization (PSV)-method. The PSV-method was successfully validated using the PS of aquaporin-1 (AQP1) from 2D crystals and cryo-transmission electron microscopy, and the PDB-coordinate file of AQP1 determined from 3D crystals and X-ray crystallography. Besides AQP1, which is a relatively rigid protein, we also studied a flexible membrane transport protein, i.e. the L-arginine/agmatine antiporter AdiC. Comparison of PSs calculated from the existing PDB-coordinate files of substrate-free and L-arginine-bound AdiC indicated that conformational changes are detected in projection. Importantly, structural differences were found between the PSV-method calculated PSs of the detergent-solubilized AdiC proteins and the PS from cryo-TEM of membrane-embedded AdiC. These differences are particularly exciting since they may reflect a different conformation of AdiC induced by the lateral pressure in the lipid bilayer.
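The source does not detail the PSV-method's computation, but the basic idea of a projection structure — collapsing a 3D atomic model along the membrane normal onto a 2D density map — can be sketched like this (toy coordinates, not AQP1 or AdiC data):

```python
import numpy as np

def projection_map(coords, grid=32, extent=40.0):
    """Histogram atom positions onto the xy plane (projection along z),
    a crude stand-in for a low-resolution projection structure."""
    h, _, _ = np.histogram2d(coords[:, 0], coords[:, 1],
                             bins=grid, range=[[-extent/2, extent/2]] * 2)
    return h

# Toy 'protein': atoms scattered on a barrel-like ring (hypothetical).
rng = np.random.default_rng(0)
z = rng.uniform(-10, 10, 500)
ang = rng.uniform(0, 2 * np.pi, 500)
coords = np.stack([8 * np.cos(ang), 8 * np.sin(ang), z], axis=1)

ps = projection_map(coords)
assert ps.sum() == 500          # every atom lands in the map
```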

Relevance: 30.00%

Abstract:

A novel microfluidic method is proposed for studying diffusion of small molecules in a hydrogel. Microfluidic devices were prepared with semi-permeable microchannels defined by crosslinked poly(ethylene glycol) (PEG). Uptake of dye molecules from aqueous solutions flowing through the microchannels was observed optically, and diffusion of the dye into the hydrogel was quantified. To complement the diffusion measurements from the microfluidic studies, nuclear magnetic resonance (NMR) characterization of the diffusion of dye in the PEG hydrogels was performed. The diffusion of small molecules in a hydrogel is relevant to applications such as drug delivery and modeling transport for tissue-engineering applications. The diffusion of small molecules in a hydrogel depends on the extent of crosslinking within the gel, the gel structure, and interactions between the diffusing species and the hydrogel network. These effects were studied in a model environment (semi-infinite slab) at the hydrogel-fluid boundary in a microfluidic device. The microfluidic devices containing PEG microchannels were fabricated using photolithography. The unsteady diffusion of small molecules (dyes) within the microfluidic device was monitored and recorded using a digital microscope. The information was analyzed with techniques drawn from digital microscopy and image analysis to obtain concentration profiles over time. Using a diffusion model to fit this concentration-versus-position data, a diffusion coefficient was obtained. This diffusion coefficient was compared to those from complementary NMR analysis, in which a pulsed field gradient (PFG) method was used to investigate and quantify small-molecule diffusion in hydrogels. There is good agreement between the diffusion coefficients obtained from the microfluidic methods and those found from the NMR studies.
The microfluidic approach used in this research enables the study of diffusion at length scales that approach those of vasculature, facilitating models for studying drug elution from hydrogels in blood-contacting applications.
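The fitting step can be illustrated with the standard semi-infinite-slab solution c(x,t) = c₀·erfc(x / (2√(Dt))). The brute-force grid search below is a simplified stand-in for the paper's actual fitting procedure, with invented units and values:

```python
import math

def conc(x, t, D, c0=1.0):
    """Semi-infinite-slab diffusion with constant surface concentration:
    c(x, t) = c0 * erfc(x / (2*sqrt(D*t)))."""
    return c0 * math.erfc(x / (2.0 * math.sqrt(D * t)))

def fit_D(xs, cs, t, c0=1.0):
    """Grid-search least-squares fit of the diffusion coefficient."""
    best, best_err = None, float("inf")
    for D in (i * 1e-7 for i in range(1, 200)):
        err = sum((conc(x, t, D, c0) - c) ** 2 for x, c in zip(xs, cs))
        if err < best_err:
            best, best_err = D, err
    return best

# Synthetic profile with a known D, then recover D from the data.
D_true, t = 5e-6, 100.0
xs = [i * 1e-3 for i in range(1, 20)]
cs = [conc(x, t, D_true) for x in xs]
assert abs(fit_D(xs, cs, t) - D_true) < 1e-7
```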

Relevance: 30.00%

Abstract:

Several non-invasive and novel aids for the detection of (and in some cases monitoring of) caries lesions have been introduced in the field of 'caries diagnostics' over the last 15 years. This chapter focusses on those available to dentists at the time of writing; continuing research is bound to lead to further developments in the coming years. Laser fluorescence is based on measurements of back-scattered fluorescence of a 655-nm light source. It enhances occlusal and (potentially) approximal lesion detection and enables semi-quantitative caries monitoring. Systematic reviews have identified false-positive results as a limitation. Quantitative light-induced fluorescence is another sensitive method to quantitatively detect and measure mineral loss both in enamel and some dentine lesions; again, the trade-offs with lower specificity when compared with clinical visual detection must be considered. Subtraction radiography is based on the principle of digitally superimposing two radiographs with exactly the same projection geometry. This method is applicable for approximal surfaces and occlusal caries involving dentine but is not yet widely available. Electrical caries measurements gather either site-specific or surface-specific information of teeth and tooth structure. Fixed-frequency devices perform best for occlusal dentine caries but the method has also shown promise for lesions in enamel and other tooth surfaces with multi-frequency approaches. All methods require further research and further validation in well-designed clinical trials. In the future, they could have useful applications in clinical practice as part of a personalized, comprehensive caries management system.

Relevance: 30.00%

Abstract:

The central question for this paper is how to improve the production process by closing the gap between industrial designers and software engineers of television(TV)-based User Interfaces (UI) in an industrial environment. Software engineers are highly interested in whether one UI design can be converted into several fully functional UIs for TV products with different screen properties. The aim of the software engineers is to apply automatic layout and scaling in order to speed up and improve the production process. However, the question is whether a UI design lends itself to such automatic layout and scaling. This is investigated by analysing a prototype UI design done by industrial designers. In a first requirements study, industrial designers had created meta-annotations on top of their UI design in order to disclose their design rationale for discussions with software engineers. In a second study, five (out of ten) industrial designers assessed the potential of four different meta-annotation approaches. The question was which annotation method industrial designers would prefer and whether it could satisfy the technical requirements of the software engineering process. One main result is that the industrial designers preferred the method they were already familiar with, which therefore seems to be the most effective one, although the main objective of automatic layout and scaling could still not be achieved.

Relevance: 30.00%

Abstract:

We consider the problem of approximating the 3D scan of a real object through an affine combination of examples. Common approaches depend either on the explicit estimation of point-to-point correspondences or on 2-dimensional projections of the target mesh; both present drawbacks. We follow an approach similar to [IF03] by representing the target via an implicit function, whose values at the vertices of the approximation are used to define a robust cost function. The problem is approached in two steps, by approximating first a coarse implicit representation of the whole target, and then finer, local ones; the local approximations are then merged together with a Poisson-based method. We report the results of applying our method on a subset of 3D scans from the Face Recognition Grand Challenge v.1.0.
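An affine combination of examples (weights summing to one) fitted by least squares can be sketched as follows; this toy version matches vectors directly rather than minimizing the paper's implicit-function cost, and the example data are invented:

```python
import numpy as np

def affine_fit(examples, target):
    """Least-squares affine combination: find weights w with sum(w) = 1
    minimizing ||sum_i w_i * examples[i] - target||. The constraint is
    enforced by substituting w_n = 1 - sum(w_1 .. w_{n-1})."""
    E = np.asarray(examples, float)          # (n_examples, n_points)
    base = E[-1]
    A = (E[:-1] - base).T                    # differences span the affine set
    w_partial, *_ = np.linalg.lstsq(A, np.asarray(target, float) - base,
                                    rcond=None)
    return np.append(w_partial, 1.0 - w_partial.sum())

# A target built from known weights is recovered exactly.
ex = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
w = affine_fit(ex, [0.2, 0.3, 0.5])
assert abs(w.sum() - 1.0) < 1e-9
assert np.allclose(w, [0.2, 0.3, 0.5])
```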

Relevance: 30.00%

Abstract:

The motion of lung tumors during respiration makes the accurate delivery of radiation therapy to the thorax difficult because it increases the uncertainty of target position. The adoption of four-dimensional computed tomography (4D-CT) has allowed us to determine how a tumor moves with respiration for each individual patient. Using information acquired during a 4D-CT scan, we can define the target, visualize motion, and calculate dose during the planning phase of the radiotherapy process. One image data set that can be created from the 4D-CT acquisition is the maximum-intensity projection (MIP). The MIP can be used as a starting point to define the volume that encompasses the motion envelope of the moving gross target volume (GTV). Because of the close relationship that exists between the MIP and the final target volume, we investigated four MIP data sets created with different methodologies (3 using various 4D-CT sorting implementations, and one using all available cine CT images) to compare target delineation. It has been observed that changing the 4D-CT sorting method will lead to the selection of a different collection of images; however, the clinical implications of changing the constituent images on the resultant MIP data set are not clear. There has not been a comprehensive study that compares target delineation based on different 4D-CT sorting methodologies in a patient population. We selected a collection of patients who had previously undergone thoracic 4D-CT scans at our institution, and who had lung tumors that moved at least 1 cm. We then generated the four MIP data sets and automatically contoured the target volumes. In doing so, we identified cases in which the MIP generated from a 4D-CT sorting process under-represented the motion envelope of the target volume by more than 10% compared with the MIP generated from all of the cine CT images. The 4D-CT methods suffered from duplicate image selection and might not choose maximum-extent images.
Based on our results, we suggest utilization of a MIP generated from the full cine CT data set to ensure a representative inclusive tumor extent, and to avoid geometric miss.
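The MIP construction itself is simply a per-voxel maximum over the constituent images; a one-dimensional toy illustration (invented values, not patient data):

```python
import numpy as np

def mip(volumes):
    """Maximum-intensity projection across a stack of CT phase images:
    each voxel keeps its maximum value over all phases, so a moving
    high-density target traces out its full motion envelope."""
    return np.max(np.stack(volumes), axis=0)

# Toy 1D 'phases': a bright target (value 5) moving across positions 1-3.
phases = [np.array([0, 5, 0, 0, 0]),
          np.array([0, 0, 5, 0, 0]),
          np.array([0, 0, 0, 5, 0])]
envelope = mip(phases)
assert envelope.tolist() == [0, 5, 5, 5, 0]
```

Dropping one phase from the stack shrinks the envelope, which is the kind of under-representation the comparison above measures.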

Relevance: 30.00%

Abstract:

In contrast to preoperative brain tumor segmentation, the problem of postoperative brain tumor segmentation has been rarely approached so far. We present a fully-automatic segmentation method using multimodal magnetic resonance image data and patient-specific semi-supervised learning. The idea behind our semi-supervised approach is to effectively fuse information from both pre- and postoperative image data of the same patient to improve segmentation of the postoperative image. We pose image segmentation as a classification problem and solve it by adopting a semi-supervised decision forest. The method is evaluated on a cohort of 10 high-grade glioma patients, with segmentation performance and computation time comparable or superior to a state-of-the-art brain tumor segmentation method. Moreover, our results confirm that the inclusion of preoperative MR images leads to better performance regarding postoperative brain tumor segmentation.
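The paper's semi-supervised decision forest is not reproduced here, but the general self-training idea — letting a classifier absorb its own confident predictions on unlabelled data — can be sketched with a toy nearest-centroid model on synthetic points:

```python
import numpy as np

def self_train(X_lab, y_lab, X_unlab, rounds=5):
    """Toy self-training loop standing in for a semi-supervised forest:
    a nearest-centroid classifier is re-fit after absorbing its most
    confident predictions on the unlabelled data."""
    X_lab, y_lab = X_lab.copy(), y_lab.copy()
    for _ in range(rounds):
        cents = np.array([X_lab[y_lab == c].mean(axis=0) for c in (0, 1)])
        d = np.linalg.norm(X_unlab[:, None] - cents[None], axis=2)
        pred = d.argmin(axis=1)
        conf = np.abs(d[:, 0] - d[:, 1])       # margin between the classes
        keep = conf > np.median(conf)          # absorb the confident half
        X_lab = np.vstack([X_lab, X_unlab[keep]])
        y_lab = np.concatenate([y_lab, pred[keep]])
        X_unlab = X_unlab[~keep]
        if len(X_unlab) == 0:
            break
    return X_lab, y_lab

# Two well-separated synthetic clusters, one labelled seed point each.
rng = np.random.default_rng(1)
X_unlab = np.vstack([rng.normal(0.0, 0.3, (20, 2)),
                     rng.normal(2.0, 0.3, (20, 2))])
Xl, yl = self_train(np.array([[0.0, 0.0], [2.0, 2.0]]),
                    np.array([0, 1]), X_unlab)
assert len(yl) > 2                 # unlabelled points were absorbed
```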

Relevance: 30.00%

Abstract:

OBJECTIVE Intense alcohol consumption is a risk factor for a number of health problems. Dual-process models assume that self-regulatory behavior such as drinking alcohol is guided by both reflective and impulsive processes. Evidence suggests that (a) impulsive processes such as implicit attitudes are more strongly associated with behavior when executive functioning abilities are low, and (b) higher neural baseline activation in the lateral prefrontal cortex (PFC) is associated with better inhibitory control. The present study integrates these 2 strands of research to investigate how individual differences in neural baseline activation in the lateral PFC moderate the association between implicit alcohol attitudes and drinking behavior. METHOD Baseline cortical activation was measured with resting electroencephalography (EEG) in 89 moderate drinkers. In a subsequent behavioral testing session they completed measures of implicit alcohol attitudes and self-reported drinking behavior. RESULTS Implicit alcohol attitudes were related to self-reported alcohol consumption. Most centrally, implicit alcohol attitudes were more strongly associated with drinking behavior in individuals with low as compared with high baseline activation in the right lateral PFC. CONCLUSIONS These findings are in line with predictions made on the basis of dual-process models. They provide further evidence that individual differences in neural baseline activation in the right lateral PFC may contribute to executive functioning abilities such as inhibitory control. Moreover, individuals with strongly positive implicit alcohol attitudes coupled with a low baseline activation in the right lateral PFC may be at greater risk of developing unhealthy drinking patterns than others.
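The reported moderation corresponds to an interaction term in a regression of drinking behavior on implicit attitudes and baseline activation. A sketch with simulated data and hypothetical effect sizes (not the study's coefficients):

```python
import numpy as np

def moderation_fit(attitude, baseline, drinking):
    """OLS with an interaction term:
    drinking ~ attitude + baseline + attitude*baseline.
    A negative interaction coefficient means the attitude-drinking
    link weakens as baseline activation rises."""
    X = np.column_stack([np.ones_like(attitude), attitude, baseline,
                         attitude * baseline])
    coef, *_ = np.linalg.lstsq(X, drinking, rcond=None)
    return coef

rng = np.random.default_rng(2)
att = rng.normal(size=200)
base = rng.normal(size=200)
# Simulated pattern resembling the study's finding: attitudes predict
# drinking mainly when baseline activation is low (invented effect sizes).
drink = 0.8 * att - 0.5 * att * base + rng.normal(scale=0.1, size=200)
b0, b_att, b_base, b_int = moderation_fit(att, base, drink)
assert b_att > 0.5 and b_int < -0.3
```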

Relevance: 30.00%

Abstract:

A number of liquid argon time projection chambers (LAr TPCs) are being built or are proposed for neutrino experiments on long- and short-baseline beams. For these detectors, a distortion in the drift field due to geometrical or physics reasons can affect the reconstruction of the events. Depending on the TPC geometry and electric drift field intensity, this distortion could be of the same magnitude as the drift field itself. Recently, we presented a method to calibrate the drift field and correct for these possible distortions. While straight cosmic-ray muon tracks could be used for calibration, multiple Coulomb scattering and momentum uncertainties allow only a limited resolution. A UV laser, instead, can create straight ionization tracks in liquid argon and allows one to map the drift field along different paths in the TPC inner volume. Here we present a UV laser feed-through design with a steerable UV mirror immersed in liquid argon that can point the laser beam at many locations through the TPC. The straight ionization paths are sensitive to drift-field distortions; a fit of these distortions to the straight optical path allows one to extract the drift field, and by using such laser tracks throughout the TPC volume one can obtain a 3D drift-field map. The UV laser feed-through assembly is a prototype of the system that will be used for the MicroBooNE experiment at the Fermi National Accelerator Laboratory (FNAL).
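The calibration principle — laser tracks are known to be straight, so deviations of reconstructed points from a fitted line expose field distortions — can be sketched like this (synthetic track, hypothetical distortion shape):

```python
import numpy as np

def track_residuals(points):
    """Fit a straight line to reconstructed laser-track points and return
    the residuals; in a distortion-free drift field the residuals vanish,
    so nonzero residuals map the apparent displacement along the track."""
    x, y = points[:, 0], points[:, 1]
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# Straight track plus a hypothetical localized bump from a field distortion.
x = np.linspace(0.0, 1.0, 50)
bump = 0.05 * np.exp(-((x - 0.5) ** 2) / 0.01)
distorted = np.stack([x, 2.0 * x + bump], axis=1)
straight = np.stack([x, 2.0 * x], axis=1)

assert np.abs(track_residuals(distorted)).max() > 0.01   # distortion visible
assert np.abs(track_residuals(straight)).max() < 1e-9    # clean track: none
```

Repeating this fit for tracks steered along many paths yields the pointwise displacements from which the 3D drift-field map is built.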

Relevance: 30.00%

Abstract:

BACKGROUND Record linkage of existing individual health care data is an efficient way to answer important epidemiological research questions. Reuse of individual health-related data faces several problems: Either a unique personal identifier, like a social security number, is not available, or non-unique person identifiable information, like names, is privacy protected and cannot be accessed. A solution to protect privacy in probabilistic record linkages is to encrypt this sensitive information. Unfortunately, encrypted hash codes of two names differ completely if the plain names differ by only a single character, so standard encryption methods cannot be applied. To overcome these challenges, we developed the Privacy Preserving Probabilistic Record Linkage (P3RL) method. METHODS In this Privacy Preserving Probabilistic Record Linkage method we apply a three-party protocol, with two sites collecting individual data and an independent trusted linkage center as the third partner. Our method consists of three main steps: pre-processing, encryption and probabilistic record linkage. Data pre-processing and encryption are done at the sites by local personnel. To guarantee similar quality and format of variables and an identical encryption procedure at each site, the linkage center generates semi-automated pre-processing and encryption templates. To retrieve information (i.e. data structure) for the creation of templates without ever accessing plain person identifiable information, we introduced a novel method of data masking. Sensitive string variables are encrypted using Bloom filters, which enables calculation of similarity coefficients. For date variables, we developed special encryption procedures to handle the most common date errors. The linkage center performs probabilistic record linkage with encrypted person identifiable information and plain non-sensitive variables.
RESULTS In this paper we describe step by step how to link existing health-related data using encryption methods to preserve privacy of persons in the study. CONCLUSION Privacy Preserving Probabilistic Record linkage expands record linkage facilities in settings where a unique identifier is unavailable and/or regulations restrict access to the non-unique person identifiable information needed to link existing health-related data sets. Automated pre-processing and encryption fully protect sensitive information ensuring participant confidentiality. This method is suitable not just for epidemiological research but also for any setting with similar challenges.
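The Bloom-filter encoding and similarity comparison at the heart of the method can be sketched as follows (toy filter size and hash count, not the parameters used in P3RL):

```python
import hashlib

def bloom(name, size=100, k=4):
    """Encode a name's character bigrams into a Bloom filter (bit set).
    Similar names share bigrams, so their filters share bits even though
    the plain text is never exchanged. Toy parameters."""
    bits = set()
    padded = f"_{name.lower()}_"
    for i in range(len(padded) - 1):
        bigram = padded[i:i + 2]
        for j in range(k):                      # k independent hash slots
            h = hashlib.sha1(f"{j}:{bigram}".encode()).hexdigest()
            bits.add(int(h, 16) % size)
    return bits

def dice(a, b):
    """Dice similarity of two Bloom filters approximates the similarity
    of the underlying plain-text names."""
    return 2 * len(a & b) / (len(a) + len(b))

assert dice(bloom("meier"), bloom("meier")) == 1.0
# A one-character spelling variant stays far more similar than an
# unrelated name, which is what makes probabilistic linkage possible.
assert dice(bloom("meier"), bloom("meyer")) > dice(bloom("meier"), bloom("smith"))
```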

Relevance: 30.00%

Abstract:

PURPOSE To investigate whether the effects of hybrid iterative reconstruction (HIR) on coronary artery calcium (CAC) measurements using the Agatston score lead to changes in assignment of patients to cardiovascular risk groups compared to filtered back projection (FBP). MATERIALS AND METHODS 68 patients (mean age 61.5 years; 48 male; 20 female) underwent prospectively ECG-gated, non-enhanced, cardiac 256-MSCT for coronary calcium scoring. Scanning parameters were as follows: tube voltage, 120 kV; mean tube current-time product, 63.67 mAs (50 - 150 mAs); collimation, 2 × 128 × 0.625 mm. Images were reconstructed with FBP and with HIR at all levels (L1 to L7). Two independent readers measured Agatston scores of all reconstructions and assigned patients to cardiovascular risk groups. Scores of HIR and FBP reconstructions were correlated (Spearman). Interobserver agreement and variability were assessed with κ-statistics and Bland-Altman plots. RESULTS Agatston scores of HIR reconstructions were closely correlated with FBP reconstructions (L1, R = 0.9996; L2, R = 0.9995; L3, R = 0.9991; L4, R = 0.986; L5, R = 0.9986; L6, R = 0.9987; and L7, R = 0.9986). In comparison to FBP, HIR reduced Agatston scores to between 97 % (L1) and 87.4 % (L7) of the FBP values. Using HIR iterations L1 - L3, all patients were assigned to identical risk groups as after FBP reconstruction. In 5.4 % of patients the risk group after HIR with the maximum iteration level was different from the group after FBP reconstruction. CONCLUSION There was an excellent correlation of Agatston scores after HIR and FBP, with identical risk group assignment at levels 1 - 3 for all patients. Hence it appears that the application of HIR in routine calcium scoring does not entail any disadvantages. Future studies are needed to demonstrate whether HIR is a reliable method for reducing radiation dose in coronary calcium scoring.
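Risk-group assignment from an Agatston score uses fixed cut-offs, so a score reduction only changes the group when the FBP score sits just above a boundary. A sketch using one common category scheme (cut-offs vary between guidelines; the example scores are invented):

```python
def risk_group(agatston):
    """One common Agatston-score category scheme (0 / 1-99 / 100-399 /
    >=400); exact cut-offs differ between guidelines."""
    if agatston == 0:
        return "none"
    if agatston < 100:
        return "mild"
    if agatston < 400:
        return "moderate"
    return "severe"

# A ~3-13% score reduction from HIR only changes the risk group when the
# FBP score sits just above a cut-off.
assert risk_group(350) == risk_group(350 * 0.97)     # stays 'moderate'
assert risk_group(405) != risk_group(405 * 0.874)    # crosses the 400 cut-off
```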

Relevance: 30.00%

Abstract:

In the present article, we report on the semi-quantitative proteome analysis and related changes in protein expression of the MCF-7 breast cancer cell line following treatment with doxorubicin, using the precursor acquisition independent from ion count (PAcIFIC) mass spectrometry method. PAcIFIC represents a cost-effective and easy-to-use proteomics approach, enabling deep proteome sequencing with minimal sample handling. The acquired proteomic data sets were searched for regulated Reactome pathways and Gene Ontology annotation terms using a new algorithm (SetRank). Using this approach, we identified pathways with significant changes (p ≤ 0.05), such as chromatin organization, DNA binding, embryo development, condensed chromosome, sequence-specific DNA binding, response to oxidative stress and response to toxin, as well as others. These sets of pathways are already well-described as being susceptible to chemotherapeutic drugs. Additionally, we found pathways related to neuron development, such as central nervous system neuron differentiation, neuron projection membrane and SNAP receptor activity. These latter pathways might indicate biological mechanisms at the molecular level causing the known side-effect of doxorubicin chemotherapy characterized as cognitive impairment, also called 'chemo brain'. Mass spectrometry data are available via ProteomeXchange with identifier PXD002998.