996 results for Pathway Semantics Algorithm (PSA)


Relevance:

30.00%

Publisher:

Abstract:

The objective of this study was to evaluate a methodology for establishing the hemolytic activity of the alternative complement pathway as an indicator of innate immunity in the Brazilian fish pacu (Piaractus mesopotamicus), and to verify the influence of β-glucan as an immunostimulant. Fish were fed diets containing 0, 0.1, and 1% β-glucan for seven days and were then inoculated with Aeromonas hydrophila. Seven days after the challenge, they were bled for serum extraction. The methodology consisted of a kinetic assay that calculates the time required for the serum complement proteins to promote 50% lysis of a rabbit red blood cell suspension. The method, originally developed in mammals, was successfully applied to pacu and showed that the hemolytic activity of the complement system proteins (alternative pathway) increased after the pathogen challenge but was not influenced by the β-glucan treatment.
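
As an illustration of this kinetic readout (with made-up lysis values, not the study's data), the 50%-lysis time can be obtained by interpolating the measured lysis curve:

```python
import numpy as np

# A minimal sketch: estimate the time at which serum complement has lysed
# 50% of the rabbit red-blood-cell suspension, by linear interpolation on
# a (hypothetical) lysis-vs-time curve.
time_min = np.array([0, 2, 4, 6, 8, 10, 12])       # sampling times [min]
lysis_pct = np.array([0, 5, 18, 38, 62, 81, 92])   # hypothetical % lysis

t50 = np.interp(50.0, lysis_pct, time_min)         # requires monotonic lysis
print(t50)   # ~7.0 min; a shorter T50 means higher alternative-pathway activity
```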

Relevance:

30.00%

Publisher:

Abstract:

Thanks to advanced technologies and social networks that allow data to be widely shared across the Internet, there is an explosion of pervasive multimedia data, generating high demand for multimedia services and applications in various areas so that people can easily access and manage multimedia data. To meet these demands, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval, with a focus on rare event detection in videos. The proposed framework is able to explore hidden semantic feature groups in multimedia data and incorporate temporal semantics, especially for video event detection. First, a hierarchical semantic data representation is presented to alleviate the semantic gap issue, and the Hidden Coherent Feature Group (HCFG) analysis method is proposed to capture the correlation between features and separate the original feature set into semantic groups, seamlessly integrating multimedia data in multiple modalities. Next, an Importance Factor based Temporal Multiple Correspondence Analysis (IF-TMCA) approach is presented for effective event detection. Specifically, the HCFG algorithm is integrated with the Hierarchical Information Gain Analysis (HIGA) method to generate the Importance Factor (IF) for producing the initial detection results. The TMCA algorithm is then proposed to efficiently incorporate temporal semantics for re-ranking and improving the final performance. Finally, a sampling-based ensemble learning mechanism is applied to further accommodate imbalanced datasets. In addition to the multimedia semantic representation and class imbalance problems, lack of organization is another critical issue for multimedia big data analysis. In this framework, an affinity propagation-based summarization method is also proposed to transform the unorganized data into a better structure with clean and well-organized information. The whole framework has been thoroughly evaluated across multiple domains, such as soccer goal event detection and disaster information management.
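
As an illustration of the sampling-based ensemble step (the framework's actual classifiers and features are not specified here, so logistic regression is a stand-in), one balanced-undersampling ensemble for rare-event detection might look like:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# A hedged sketch: train one model per balanced undersample of the majority
# class and average the predicted probabilities, a common way to handle
# heavily imbalanced event-detection datasets.
def balanced_ensemble(X, y, n_models=10, seed=0):
    rng = np.random.default_rng(seed)
    pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]  # rare events: y == 1
    models = []
    for _ in range(n_models):
        sub = rng.choice(neg, size=len(pos), replace=False)  # undersample majority
        idx = np.concatenate([pos, sub])
        models.append(LogisticRegression(max_iter=1000).fit(X[idx], y[idx]))
    # Return a scorer that averages the ensemble's event probabilities.
    return lambda Xnew: np.mean([m.predict_proba(Xnew)[:, 1] for m in models], axis=0)
```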

Relevance:

30.00%

Publisher:

Abstract:

The Hybrid Monte Carlo algorithm is adapted to the simulation of a system of classical degrees of freedom coupled to non-self-interacting lattice fermions. Diagonalization of the Hamiltonian matrix is avoided by introducing a path-integral formulation of the problem in (d + 1)-dimensional Euclidean space–time. A perfect-action formulation allows us to work in continuum Euclidean time, without the need for a Trotter–Suzuki extrapolation. To demonstrate the feasibility of the method we study the Double Exchange Model in three dimensions. The complexity of the algorithm grows only with the system volume, allowing lattices as large as 16³ to be simulated on a personal computer. We conclude that the second-order paramagnetic–ferromagnetic phase transition of Double Exchange materials close to half-filling belongs to the universality class of the three-dimensional classical Heisenberg model.
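
For readers unfamiliar with the method, a generic Hybrid Monte Carlo update for classical fields looks like the following toy sketch (this illustrates the standard algorithm only, not the paper's perfect-action, path-integral formulation):

```python
import numpy as np

# One generic HMC step: refresh fictitious momenta, evolve (phi, p) with a
# leapfrog integrator under Hamiltonian H = p^2/2 + S(phi), then apply a
# Metropolis accept/reject to correct for integration errors.
def hmc_step(phi, S, grad_S, eps=0.1, n_leap=20, rng=np.random.default_rng()):
    p = rng.standard_normal(phi.shape)               # momentum refresh
    H0 = 0.5 * np.sum(p**2) + S(phi)
    phi_new = phi.copy()
    p_new = p - 0.5 * eps * grad_S(phi_new)          # initial half step
    for _ in range(n_leap):
        phi_new += eps * p_new                       # full position step
        p_new -= eps * grad_S(phi_new)               # full momentum step
    p_new += 0.5 * eps * grad_S(phi_new)             # undo the extra half step
    H1 = 0.5 * np.sum(p_new**2) + S(phi_new)
    if rng.random() < np.exp(min(0.0, H0 - H1)):     # Metropolis test
        return phi_new
    return phi
```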

Relevance:

30.00%

Publisher:

Abstract:

Objective: The objective of this study was to develop a clinical nomogram to predict gallium-68 prostate-specific membrane antigen positron emission tomography/computed tomography (68Ga-PSMA-11 PET/CT) positivity in different clinical settings of PSA failure. Materials and methods: Seven hundred three (n = 703) prostate cancer (PCa) patients with confirmed PSA failure after radical therapy were enrolled. Patients were stratified according to different clinical settings (first-time biochemical recurrence [BCR]: group 1; BCR after salvage therapy: group 2; biochemical persistence after radical prostatectomy [BCP]: group 3; advanced-stage PCa before second-line systemic therapies: group 4). First, we assessed the 68Ga-PSMA-11 PET/CT positivity rate. Second, multivariable logistic regression analyses were used to determine predictors of a positive scan. Third, regression-based coefficients were used to develop a nomogram predicting a positive 68Ga-PSMA-11 PET/CT result, and 200 bootstrap resamples were used for internal validation. Fourth, receiver operating characteristic (ROC) analysis was used to identify the most informative nomogram-derived cut-off. Decision curve analysis (DCA) was implemented to quantify the nomogram's clinical benefit. Results: The overall 68Ga-PSMA-11 PET/CT positivity rate was 51.2%: 40.3% in group 1, 54% in group 2, 60.5% in group 3, and 86.9% in group 4 (p < 0.001). At multivariable analysis, ISUP grade, PSA, PSA doubling time, and clinical setting were independent predictors of a positive scan (all p ≤ 0.04). A nomogram based on the covariates included in the multivariable model demonstrated a bootstrap-corrected accuracy of 82%. The best nomogram-derived cut-off value was 40%. In DCA, the nomogram revealed a clinical net benefit of > 10%. Conclusions: This novel nomogram proved accurate in predicting a positive scan, with values ≥ 40% providing the most informative cut-off for counselling patients toward 68Ga-PSMA-11 PET/CT. This tool may serve as a guide for clinicians in the best use of PSMA-based PET imaging.
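
As a rough illustration of this modelling recipe (with placeholder data, not the study's cohort or its actual coefficients), a multivariable logistic model can be internally validated with 200 bootstrap resamples as follows:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# A hedged sketch of bootstrap-corrected validation: the optimism of the
# apparent performance is estimated on resamples and subtracted out.
rng = np.random.default_rng(0)
X = rng.normal(size=(703, 4))        # stand-ins for ISUP grade, PSA, PSA-DT, setting
y = rng.integers(0, 2, size=703)     # fake PET-positivity labels

model = LogisticRegression(max_iter=1000).fit(X, y)
apparent = roc_auc_score(y, model.predict_proba(X)[:, 1])

optimism = []
for _ in range(200):                                  # 200 bootstrap resamples
    idx = rng.integers(0, len(y), len(y))
    m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    optimism.append(roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
                    - roc_auc_score(y, m.predict_proba(X)[:, 1]))
corrected = apparent - np.mean(optimism)              # bootstrap-corrected AUC
```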

Relevance:

30.00%

Publisher:

Abstract:

The Hippo pathway is a well-known master regulator of cell growth and proliferation. Many studies have shed light on the centrality of Hippo functions, as this signalling pathway is able to respond to different stimuli and translate them into distinct transcriptional outputs. It is therefore clearly implicated in a number of important processes whose alteration has consequences for the correct specification of the single cell, as well as of the whole tissue. Even though the core of the signalling pathway has been extensively characterized, it remains unclear which "co-workers" permit the Hippo pathway to respond to so many different stimuli and act as a coordinator of the growth/differentiation balance. Taking advantage of the Drosophila model, in which most of the discoveries about this signalling pathway have been made, this thesis aims to add new knowledge about the molecular mechanisms of the Hippo pathway in different contexts, from development to disease. In the first part I studied the dynamics of the Hippo core kinase Warts during development of the pupal eye. I identified a critical time point at which the expression and localization of Warts change suddenly, suggesting the intervention of upstream regulators that modulate its activity in an extremely narrow time window. The second goal was to investigate the role of the Hippo pathway in the neurodegenerative Gaucher disease. I have produced preliminary results demonstrating a growth deficit associated with a massive reduction of some Yki targets, supporting a hyper-Hippo condition underlying this neuropathic syndrome. Finally, I evaluated the transcription factor Orthodenticle as a co-factor of Yorkie in driving tissue overgrowth, and my findings support a model of interaction between these two molecules based on Yki conformational changes. Altogether, my results lay the foundation for new studies on the molecular mechanisms ruling Hippo pathway activity.

Relevance:

30.00%

Publisher:

Abstract:

Driving simulators are highly technological tools that enable research in various fields, such as psychology, medicine, and engineering. However, for the data obtained from simulations to be comparable with their real-world counterparts, the fidelity of the driving simulator's components must be high. This work concerns the improvement of the motion rendering system of the two-degree-of-freedom (2DOF) SIMU-LACET Driving Simulator, built and developed at the LEPSIS laboratory of IFSTTAR (the French Institute of Science and Technology for Transport, Development and Networks), specifically at its Paris – Marne-la-Vallée site. We decided to redesign the software part of the motion cueing system, acting on two main elements: the scale factor applied to the dynamic inputs coming from the vehicle model, and the Motion Cueing Algorithms (MCA), for both degrees of freedom. We therefore modified the existing model implemented in MATLAB-Simulink, specifically the motion cueing block for surge (longitudinal translation) and yaw. Regarding the scale factor, a methodology was introduced to create a nonlinear scale factor in exponential form, so as to improve the rendering of smaller inputs while respecting the physical limits of the motion platform. As for the MCA, several transfer functions of the classical algorithm were evaluated. The final choice of the MCA, and the validation of the motion cueing in general, were carried out through two experiments and the judgment of the participating subjects. Moreover, based on the results of the first experiment, we investigated the influence of the gear-shifting strategy on the driver's perception of motion.
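
To make the exponential scale-factor idea concrete, here is a minimal Python sketch under assumed parameters (the function shape and the values of `a_max`, `s_min`, `s_max`, and `a_ref` are illustrative, not the thesis' actual tuning):

```python
import numpy as np

def exponential_scale_factor(a, a_max=3.0, s_min=0.3, s_max=0.9, a_ref=2.0):
    """Nonlinear scale factor in exponential form (illustrative parameters).

    The gain decays exponentially with input amplitude, so small cues keep
    most of their strength while large ones are attenuated enough to stay
    within the platform limit `a_max` (enforced by a final clip).
    """
    a = np.asarray(a, dtype=float)
    gain = s_min + (s_max - s_min) * np.exp(-np.abs(a) / a_ref)
    return np.clip(a * gain, -a_max, a_max)

# Example: a 0.5 m/s^2 cue keeps ~77% of its amplitude, a 6 m/s^2 cue ~33%.
print(exponential_scale_factor([0.5, 6.0]))
```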

Relevance:

30.00%

Publisher:

Abstract:

This thesis focuses on finding the optimum block cutting dimensions in terms of environmental and economic factors, using a 3D algorithm, for a limestone quarry in Foggia, Italy. The main environmental concerns of quarrying operations are energy consumption, material waste, and pollution; the main economic concerns are block recovery, selling prices, and production costs. Fractures adversely affect the block recovery ratio, but with a fracture model block production can be optimized. In this research, the waste volume produced by quarrying was minimised to increase the recovery ratio and ensure economic benefits. SlabCutOpt, a software developed at DICAM (University of Bologna) for block cutting optimization, tests different cutting angles on the x-y-z planes to offer alternative cutting schemes; it evaluates several block sizes and outputs the optimal result for each entry. Using SlabCutOpt, ten different block dimensions were analysed, and the results indicated the maximum number of non-intersecting blocks for each dimension. Block dimension number 1 (1 m × 1 m × 1 m) had the highest recovery ratio, 43%, and the highest total Relative Money Value (RMV), 22829. Dimension number 1 also had the lowest waste volume, 3953.25 m³, for the total bench. For cutting the total bench volume of 6932.25 m³, the diamond wire cutter had the lowest dust emission value, 24 m³, for the 2 m × 2 m × 2 m block. When compared with the Eco-Label standards, block dimensions with surface areas lower than 15 m² were found to meet the label's natural-resource waste criterion, as the threshold requires a minimum recovery of 25% [1]. Given the relativity of production costs, together with the Eco-Label threshold, the research recommends selecting blocks with a surface area between 6 m² and 14 m².
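
As a sanity check on the reported figures (the block count below is back-calculated from the abstract's volumes and is therefore hypothetical), the recovery ratio and waste volume relate as follows:

```python
# A minimal sketch of the recovery metrics compared across candidate
# block dimensions.
def recovery_ratio(n_blocks, block_dims, bench_volume):
    """Fraction of the bench volume recovered as intact blocks."""
    w, d, h = block_dims
    recovered = n_blocks * w * d * h
    return recovered / bench_volume

bench = 6932.25  # total bench volume in m^3 (from the abstract)
# Dimension 1 (1 m x 1 m x 1 m): a 43% recovery implies ~2979 blocks.
print(recovery_ratio(2979, (1.0, 1.0, 1.0), bench))   # ~0.43
print(bench - 2979)                                   # waste volume, ~3953 m^3
```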

Relevance:

30.00%

Publisher:

Abstract:

This thesis deals with the analysis and management of emergency healthcare processes through the use of advanced analytics and optimization approaches. Emergency processes are among the most complex in healthcare, due to their non-elective nature and high variability. The thesis is divided into two topics. The first concerns the core of emergency healthcare processes, the emergency department (ED). In the second chapter, we describe the ED used as the case study: a real case with data derived from a large ED located in northern Italy. In the next two chapters, we introduce two tools for supporting ED activities. The first is a new type of analytics model; its aim is to overcome traditional methods of analyzing ED activities by means of an algorithm that analyses the ED pathway (organized as an event log) as a whole. The second tool is a decision-support system that integrates a deep neural network for the prediction of patient pathways with an online simulator to evaluate the evolution of the ED over time; its purpose is to provide a set of solutions to prevent and resolve ED overcrowding. The second part of the thesis focuses on the COVID-19 pandemic emergency. In the fifth chapter, we describe a tool used by the Bologna local health authority in the first phase of the pandemic; it analyzes a patient's clinical pathway and automatically assigns the patient a state, which physicians used to route patients to the correct clinical pathway. The last chapter is dedicated to the description of a MIP model used to organize the COVID-19 vaccination campaign in the city of Bologna, Italy.
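
As a hedged illustration of the state-assignment tool described for the COVID-19 pathway (the event and state names below are hypothetical, not the actual clinical rules used in Bologna):

```python
# A toy sketch of rule-based state assignment from a patient's event log;
# the returned state would then drive routing to a clinical pathway.
def assign_state(events):
    """Map a patient's ordered clinical events to a routing state."""
    events = set(events)
    if "icu_admission" in events:
        return "critical"
    if "positive_swab" in events and "hospital_admission" in events:
        return "hospitalized"
    if "positive_swab" in events:
        return "home_isolation"
    return "under_evaluation"

print(assign_state(["positive_swab", "hospital_admission"]))  # hospitalized
```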

Relevance:

30.00%

Publisher:

Abstract:

Water Distribution Networks (WDNs) play a vitally important role in communities, ensuring well-being and supporting economic growth and productivity. The need for greater investment requires design choices that will impact the efficiency of management in the coming decades. This thesis proposes an algorithmic approach to address two related problems: (i) identify the fundamental asset of large WDNs in terms of main infrastructure; (ii) sectorize large WDNs into isolated sectors while respecting the minimum service to be guaranteed to users. Two methodologies were developed to meet these objectives and were subsequently integrated into an overall process that optimizes the sectorized configuration of a WDN while treating problems (i) and (ii) within a single global vision. With regard to problem (i), the methodology introduces the concept of a primary network, answering with a dual approach: connecting the main nodes of the WDN in terms of hydraulic infrastructure (reservoirs, tanks, pumping stations) and identifying hypothetical paths with minimal energy losses. The primary network thus identified can be used as an initial basis for designing the sectors. The sectorization problem (ii) was addressed with optimization techniques, through the development of a new dedicated Tabu Search algorithm able to deal with real WDN case studies. For this reason, three new large WDN models were developed in order to test the capabilities of the algorithm on different, complex real cases. The methodology also automatically identifies the deficient parts of the primary network and dynamically includes new edges to support a sectorized configuration of the WDN. Applying the overall algorithm to the new real case studies, and to others from the literature, yielded applicable solutions even in specific complex situations.
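
For concreteness, a generic Tabu Search skeleton of the kind used for the sectorization problem might look as follows (this is the textbook metaheuristic, not the thesis' dedicated algorithm; `neighbours` and `cost` are problem-specific callables):

```python
# A generic Tabu Search: repeatedly move to the best non-tabu neighbour,
# keeping recently applied moves forbidden for `tabu_tenure` iterations so
# the search can escape local optima.
def tabu_search(initial, neighbours, cost, tabu_tenure=10, iters=500):
    best = current = initial
    tabu = []  # recently applied moves
    for _ in range(iters):
        # Aspiration criterion: a tabu move is allowed if it beats the best.
        candidates = [(move, sol) for move, sol in neighbours(current)
                      if move not in tabu or cost(sol) < cost(best)]
        if not candidates:
            break
        move, current = min(candidates, key=lambda ms: cost(ms[1]))
        tabu.append(move)
        if len(tabu) > tabu_tenure:
            tabu.pop(0)
        if cost(current) < cost(best):
            best = current
    return best
```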

Relevance:

30.00%

Publisher:

Abstract:

Ground deformation provides valuable insights into subsurface processes, with patterns reflecting the characteristics of the source at depth. At active volcanic sites, displacements can be observed during unrest phases; a correct interpretation is therefore essential to assess the hazard potential. Inverse modeling is employed to obtain quantitative estimates of the parameters describing the source. However, despite the robustness of the available approaches, realistic imaging of these reservoirs is still challenging. While analytical models return quick but simplistic results, assuming an isotropic and elastic crust, more sophisticated numerical models, which account for the effects of topographic loads, crust inelasticity, and structural discontinuities, require much higher computational effort, and the information about crust rheology they need may be challenging to infer. All these approaches are based on a-priori constraints on the source shape, which influence the reliability of the solution. In this thesis, we present a new approach aimed at overcoming these limitations, modeling sources free of a-priori shape constraints with the advantages of FEM simulations but through a cost-efficient procedure. The source is represented as an assembly of elementary units, consisting of cubic elements of a regular FE mesh, each loaded with a unitary stress tensor. The surface response due to each of the six stress-tensor components is computed and linearly combined to obtain the total displacement field; in this way, the source can assume potentially any shape. Our tests prove the equivalence between the deformation field due to our assembly and that of a corresponding cavity with uniform boundary pressure. The ability to simulate pressurized cavities in a continuum domain permits surface responses to be pre-computed, avoiding remeshing. A Bayesian trans-dimensional inversion algorithm implementing this strategy is developed: 3D Voronoi cells are used to sample the model domain, selecting the elementary units that contribute to the source solution and those that remain inactive as part of the crust.
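
A minimal sketch of the superposition step (array names and sizes are illustrative): the total displacement field is a linear combination of the pre-computed unit responses over the active elementary units:

```python
import numpy as np

# The surface displacement of the assembled source is a weighted sum of the
# pre-computed FEM responses of each cubic unit to the six unit
# stress-tensor components.
n_units, n_components, n_obs = 4, 6, 100                       # hypothetical sizes
unit_responses = np.random.rand(n_units, n_components, n_obs)  # pre-computed responses
stress_coeffs = np.random.rand(n_units, n_components)          # stress loading per unit
active = np.array([True, True, False, True])                   # units kept by the Voronoi sampler

# Total displacement field: sum over active units and stress components.
total_displacement = np.einsum("uc,uco->o",
                               stress_coeffs[active],
                               unit_responses[active])
print(total_displacement.shape)  # (100,)
```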

Relevance:

30.00%

Publisher:

Abstract:

The importance of bees for life on Earth, and the risks they face because of human activity, are by now a matter of fact. The anthropocentric conception of nature, and the breeding of these small insects purely for production, have long damaged their habitat and interfered with their biological cycles. Beekeeping, born as a mutualistic relationship in which humans offered bees a shelter and bees in return provided nourishment, has turned into a harmful dependency and a subjugation of these insects to the artificial, far-from-natural rhythms of rapid, serial production aimed at profit. Clear evidence of this condition is the bees' shelter, the hive. Have we ever asked ourselves why hives have this shape? Is it the one bees prefer, or the one that makes construction, management, and production processes more practical and faster? In nature, bees colonize cavities such as hollow tree trunks, shapes far from, if not diametrically opposed to, those in which we see them live in apiaries. From this perspective, design and new technologies, placed at the service of Nature, lead to a meeting point between human needs and those of other living beings, in this case bees. The concepts of Additive Manufacturing and Computational Design allow production processes similar to natural evolutionary ones, and for this reason they find an ideal application in projects whose aim is to move away from an overly artificial vision and return to the perfection and harmony of the laws of Nature.

Relevance:

30.00%

Publisher:

Abstract:

Driving simulators emulate a real vehicle drive in a virtual environment. One of the most challenging problems in this field is to create a simulated drive that is as real as possible, deceiving the driver's senses into believing they are in a real vehicle. This thesis first provides an overview of the Stuttgart driving simulator with a description of the overall system, followed by a theoretical presentation of the commonly used motion cueing algorithms. The second and predominant part of the work presents the implementation of the classical and optimal washout algorithms in a Simulink environment. The project aims to create a new optimal washout algorithm and compare the results with those of the classical washout. The classical washout algorithm, already implemented in the Stuttgart driving simulator, is the most widely used in simulator motion control. It is based on a sequence of filters in which each parameter has a clear physical meaning and a unique assignment to a single degree of freedom. However, the effects on human perception are not exploited, and each parameter must be tuned online by an engineer in the control room, depending on the driver's feeling. To overcome this problem and also take the driver's sensations into account, the optimal washout motion cueing algorithm was implemented. This optimal-control-based algorithm treats motion cueing as a tracking problem, forcing the accelerations perceived in the simulator to track those that would have been perceived in a real vehicle, by minimizing the perception error within the constraints of the motion platform. The last chapter presents a comparison between the two algorithms, based on the drivers' feelings after the test drive. First, an off-line test with a step signal as input acceleration was implemented to verify the behaviour of the simulator. Second, the algorithms were executed in the simulator during test drives on several tracks.
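
As a minimal illustration of the classical washout's core (the cutoff frequency and filter order are illustrative, not the Stuttgart simulator's tuning), a high-pass filter extracts the reproducible onset of an acceleration cue while letting the platform wash back to neutral:

```python
import numpy as np
from scipy import signal

# High-pass filtering a step acceleration: the platform reproduces the
# transient onset, then the command decays back toward zero (the "washout").
fs = 100.0                                             # sampling rate [Hz]
b, a = signal.butter(2, 0.5, btype="highpass", fs=fs)  # 2nd-order HP, 0.5 Hz cutoff

t = np.arange(0, 10, 1 / fs)
accel = np.where(t > 1.0, 2.0, 0.0)            # step of 2 m/s^2 at t = 1 s
platform_accel = signal.lfilter(b, a, accel)   # washed-out cue sent to the platform
```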

Relevance:

30.00%

Publisher:

Abstract:

With the increase in load demand across various sectors, protection and safety of the network are key factors that have to be taken into consideration for the electric grid and distribution network. A Phasor Measurement Unit (PMU) is an intelligent electronic device that collects data in the form of real-time synchrophasors with a precise time tag using GPS (Global Positioning System) and transfers the data to the grid command centre for monitoring and assessment. The measurements made by a PMU have to be very precise to protect relays and measuring equipment, according to IEEE 60255-118-1 (2018). Since a physical PMU is a very expensive device on which to research and develop new functionalities, there is a need for an alternative to work with. Hence, many open-source virtual libraries are available that replicate the exact functions of a PMU in a virtual (software) environment, allowing research on multiple objectives while producing minimal error when verified. In this thesis, I carried out performance and compliance verification of a virtual PMU developed in MATLAB using an I-DFT (Interpolated Discrete Fourier Transform) C-class algorithm. A test environment was developed in MATLAB, and the virtually developed PMU was tested in both steady and dynamic states to verify compliance with the latest standard (IEEE 60255-118-1).
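
As a toy illustration of the interpolated-DFT idea (this is Grandke's two-point method for a Hann window, sketched in Python, not the thesis' MATLAB implementation):

```python
import numpy as np

# IpDFT sketch: refine a coarse DFT peak by interpolating the two largest
# bins, recovering an off-nominal tone frequency between bin centres.
fs, n = 5000.0, 500                        # sampling rate [Hz], window length
t = np.arange(n) / fs
x = np.cos(2 * np.pi * 50.3 * t)           # off-nominal 50.3 Hz test tone

X = np.abs(np.fft.rfft(x * np.hanning(n)))
k = int(np.argmax(X))                      # coarse peak bin
side = 1 if X[k + 1] >= X[k - 1] else -1   # pick the larger neighbour
alpha = X[k + side] / X[k]
delta = side * (2 * alpha - 1) / (1 + alpha)   # fractional-bin correction (Hann)
f_est = (k + delta) * fs / n
print(f_est)                               # close to 50.3 Hz
```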

Relevance:

30.00%

Publisher:

Abstract:

Computer efficiency has a lower bound dictated by Landauer's principle: destroying any piece of information has an energy cost, due to the dissipation of the bits that made it up. The only way to circumvent Landauer's principle is through reversibility. This computing technique allows a program to be executed without destroying information and therefore without dissipating bits. Many algorithms today have a large energy impact on the world, and one of them is SHA256, the algorithm used in the Bitcoin blockchain. This thesis aims to analyze the energy and memory consumption of an ideal reversible SHA256 implementation.
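
A small worked example of Landauer's bound (the physical constants are standard; the per-hash bit count below is purely illustrative):

```python
from math import log

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2).
k_B = 1.380649e-23      # Boltzmann constant [J/K]
T = 300.0               # room temperature [K]
e_bit = k_B * T * log(2)
print(e_bit)            # ~2.87e-21 J per erased bit

# If a non-reversible SHA256 evaluation erased, say, 10**5 bits, its
# Landauer floor would be:
print(1e5 * e_bit)      # ~2.87e-16 J per hash; a reversible version avoids this
```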

Relevance:

30.00%

Publisher:

Abstract:

A substantial upgrade of the LHC is expected in the coming years, which foresees increasing the integrated luminosity by a factor of 10 with respect to the current one. This parameter is proportional to the number of collisions per unit time. Consequently, the computational resources needed at all levels of reconstruction will grow considerably. For this reason, the CMS collaboration began some years ago to explore the possibilities offered by heterogeneous computing, that is, the practice of distributing computation between CPUs and other dedicated accelerators, such as graphics cards (GPUs). One of the difficulties of this approach is the need to write, validate, and maintain different code for every device on which it must run. This thesis presents the possibility of using SYCL to translate event-reconstruction code so that it is executable and efficient on different devices without substantial modification. SYCL is an abstraction layer for heterogeneous computing that complies with the ISO C++ standard. This study focuses on porting CLUE, a clustering algorithm for calorimetric energy deposits, using oneAPI, the SYCL implementation supported by Intel. Initially, the standalone version of the algorithm was translated, mainly to become familiar with SYCL and to ease performance comparison with the existing versions. In this case, performance is very similar to that of native CUDA code on the same hardware. To validate the physics, the algorithm was integrated into a reduced version of the framework used by CMS for reconstruction. The physics results are identical to those of the other implementations while, in terms of computational performance, in some cases SYCL produces faster code than other abstraction layers adopted by CMS, making it an interesting option for the future of heterogeneous computing in high-energy physics.
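
As a toy illustration of the density-based idea behind CLUE (the real algorithm is a C++/SYCL/CUDA implementation and considerably more sophisticated than this Python sketch; all parameters here are hypothetical):

```python
import numpy as np

# Toy CLUE-like clustering: compute a local density per hit, find each hit's
# nearest higher-density neighbour, promote isolated high-density hits to
# seeds, and let the remaining hits inherit their neighbour's cluster.
def toy_clue(points, dc=1.0, rho_min=2.0, delta_min=2.0):
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    rho = (d < dc).sum(axis=1).astype(float)      # local density within dc
    delta = np.full(len(points), np.inf)          # distance to nearest denser hit
    parent = np.full(len(points), -1)
    for i in range(len(points)):
        denser = np.where(rho > rho[i])[0]
        if denser.size:
            parent[i] = denser[np.argmin(d[i, denser])]
            delta[i] = d[i, parent[i]]
    labels = np.full(len(points), -1)
    seeds = np.where((rho >= rho_min) & (delta >= delta_min))[0]
    labels[seeds] = np.arange(seeds.size)         # each seed starts a cluster
    order = np.argsort(-rho)                      # assign followers, densest first
    for i in order:
        if labels[i] == -1 and parent[i] != -1:
            labels[i] = labels[parent[i]]         # inherit the parent's cluster
    return labels
```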