975 results for fixed-time AI
Abstract:
Recognizing the problem of access to medicines as a problem of global justice, the dissertation, on the one hand, focuses on the study of human rights and the right to health from a legal-philosophical perspective and, on the other, sets out to analyze the international patent regime, both by examining the interests actually at stake and by studying the economic structure of the patent itself. The aim is to look at these interests from a new perspective, positing a hierarchy of values that is complete and consistent with the objectives formally stated by legal scholarship, case law and international law. The research project ultimately seeks to propose new legal solutions to the problem of access to medicines. The dissertation therefore carries out a critical study of Thomas Pogge's proposal for solving the access-to-medicines problem, i.e. the Health Impact Fund (HIF), a proposal of a political and legal nature underpinned by philosophical arguments. It is a proposal that radically calls into question, in concrete terms as well, the dogma of the monopoly granted through the patent as a reward for the R&D costs borne by patent holders, and that instead places the emphasis on the actual impact of each individual invention on global health. After analyzing its most relevant aspects in depth, the alternative and reform proposals for the intellectual property system aimed at improving access to medicines are critically reviewed; in this regard, a transitional reform of patent law is proposed, the so-called Trading Time for Space (TTS), which provides for a temporal extension of patent exclusivity (Time) in exchange for the patent holder selling the drug at an affordable price in developing countries (Space).
Abstract:
This study set out to analyze the impact of using a translation memory (TM) and of post-editing (PE) raw machine-translation output on the perceived level of difficulty and on the time needed to obtain a high-quality final text. The experiment involved six native Italian-speaking students of the Master's degree programme in Specialized Translation at the University of Bologna (Forlì campus). The participants were divided into three pairs, each of which was assigned an excerpt from a press release in English. In each pair, one participant was asked to translate the text into Italian using the TM within SDL Trados Studio 2011. The other participant was asked to carry out full PE into Italian of the raw output produced by Google Translate. In cases where the TM or the output contained no (correct) translations, participants were allowed to consult the Internet. Using Think-aloud Protocols (TAPs), they were asked to think aloud while carrying out the tasks. It was thus possible to identify the translation problems encountered and the cases in which the TM and the raw output provided correct solutions; it was also possible to observe the translation strategies employed, and participants were then asked to rate their difficulty in retrospective interviews. The time taken by each participant was also measured. The data on perceived difficulty and on time taken were related to the number of correct solutions provided by the TM and by the raw output, respectively. Using the TM was found to save more time and, unlike PE, to reduce perceived difficulty. This study aims to help future professional translators choose technological tools that allow them to save time and resources.
Abstract:
In the domain of safety-critical embedded systems, the design process for applications is highly complex. For a given hardware architecture, electronic control units can be upgraded so that all existing processes and signals execute on time. The timing requirements are strict and must be met in every periodic recurrence of the processes, since guaranteeing concurrent execution is of paramount importance. Existing approaches can compute design alternatives quickly, but they do not guarantee that the cost of the necessary hardware changes is minimal. We present an approach that computes cost-minimal solutions to the problem which satisfy all timing constraints. Our algorithm uses linear programming with column generation, embedded in a tree structure, to provide lower and upper bounds during the optimization process. By decomposing the master problem, the complex constraints that guarantee periodic execution are shifted into independent subproblems formulated as integer linear programs. Both the analyses of process execution and the methods for signal transmission are examined, and linearized representations are given. Furthermore, we present a new formulation for fixed-priority execution that additionally computes worst-case process response times, which are needed in scenarios where timing constraints are imposed on subsets of processes and signals. We demonstrate the applicability of our methods by analyzing instances containing process structures from real applications. Our results show that lower bounds can be computed quickly to prove the optimality of heuristic solutions. When we deliver optimal solutions with response times, our new formulation compares favorably with other approaches in terms of runtime. The best results are achieved with a hybrid approach that combines heuristic start solutions, preprocessing, and a heuristic computation phase followed by a short exact one.
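In its classical form, the worst-case response-time computation mentioned above is a fixed-point recurrence over higher-priority interference. The following Python sketch shows that standard recurrence (Joseph-Pandya response-time analysis) under the usual assumptions of periodic tasks with deadlines equal to periods; the task set is an invented example, and the thesis's ILP formulation computes these times within the optimization rather than by this iteration.

```python
import math

def worst_case_response_time(tasks, i):
    """tasks: list of (C, T) pairs sorted by descending priority (deadline = period).
    Returns the WCRT of task i, or None if its response time exceeds its period."""
    C_i, T_i = tasks[i]
    R = C_i                       # start the fixed-point iteration at C_i
    while True:
        # interference from every higher-priority task released during [0, R)
        interference = sum(math.ceil(R / T_j) * C_j for C_j, T_j in tasks[:i])
        R_next = C_i + interference
        if R_next > T_i:
            return None           # deadline (= period) would be missed
        if R_next == R:
            return R              # fixed point reached: worst-case response time
        R = R_next

# Hypothetical task set: (execution time, period), highest priority first.
tasks = [(1, 4), (2, 6), (3, 12)]
print([worst_case_response_time(tasks, i) for i in range(len(tasks))])  # [1, 3, 10]
```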
Abstract:
The spread of cloud services has pushed the world of IDEs in this direction as well. Recently, IDEs have been moving from desktop environments to Web environments. This is decisive for collaboration-related aspects, because it makes it possible to exploit all the advantages of the cloud to equip these systems with chat, social network integration, shared-editing tools and many other collaborative features. These IDEs are called browser-based because the services they offer are accessible over the Web through a browser. There are several kinds, with very different characteristics. Some are simple platforms on which code can be tested or tutorials can be followed to learn new programming languages; others are full development environments equipped with the most common features of a desktop IDE, in addition to Web-specific ones. A study of these new-generation development environments revealed that few of them have a complete collaboration system and that not all of them exploit the new technologies the Web makes available. For example, some have collaborative editors but do not offer a chat service to collaborators; others provide chat and support for simultaneous code writing, but lack display-sharing facilities. After analyzing the strengths and weaknesses of the collaboration provided by the tools considered, I decided to implement collaborative features within a browser-based IDE called InDe RT, developed by Pro Gamma SpA.
Abstract:
A description of the work carried out on the zero-dimensional, mean-value modeling of a powertrain. The final purpose of the model is HIL (hardware-in-the-loop) operation.
Abstract:
This work attempts to provide a method for calibrating numerical models used in dynamic spectral analyses. Through a series of nonlinear time-history analyses, the horizontal relative displacements arising at the frictional beam-to-column connection were obtained for a single-storey precast structure subjected to the horizontal and vertical components of an earthquake. With an iterative procedure over several spectral analyses, equivalent stiffnesses were calibrated that reproduce, with good approximation, the same results as the time-history analyses. These stiffnesses were then reported in graphical form. To reproduce the horizontal relative displacements with a dynamic spectral analysis, the beams can therefore be connected to the columns with elastic elements of stiffness Kcoll. The stiffness values provided by this study hold for a wide range of single-storey precast buildings (natural period 0.20 s < T < 2.00 s) and three different levels of seismic intensity; moreover, it is possible to account for plastic hinging at the base of the columns and to choose between two positions relative to the fault rupture (Near Fault System or Far Fault System). The reduction of the resulting friction force (due to the variation in vertical acceleration induced by the earthquake) was taken into account using a model in which an inverted-pendulum isolator, suitably calibrated to behave as a simple friction bearing, is placed between beam and column. With the equivalent linear models, good results can be obtained in relatively short times: approximate assessments of loss of support and of intervention priorities in a given seismic zone thus become feasible.
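The iterative calibration loop can be pictured as follows. This is a minimal illustrative sketch only: run_spectral_analysis stands in for the actual response-spectrum FE solver, and the proportional update rule, tolerance and numbers are assumptions, not the study's exact procedure.

```python
def calibrate_k_coll(target_disp, k_initial, run_spectral_analysis,
                     tol=0.01, max_iter=50):
    """Adjust the elastic connector stiffness K_coll until the spectral analysis
    reproduces the beam-column relative displacement found by the nonlinear
    time-history analyses (target_disp)."""
    k = k_initial
    for _ in range(max_iter):
        disp = run_spectral_analysis(k)          # relative displacement for this K
        if abs(disp - target_disp) / target_disp < tol:
            return k                             # converged: equivalent stiffness
        # proportional update: a stiffer connector gives a smaller displacement
        k *= disp / target_disp
    raise RuntimeError("K_coll did not converge within max_iter iterations")

# Toy stand-in solver: displacement inversely proportional to stiffness (F / K).
def toy_solver(k):
    return 2.5e4 / k                             # hypothetical, demonstration only

print(calibrate_k_coll(target_disp=0.05, k_initial=1.0e5,
                       run_spectral_analysis=toy_solver))   # -> 5.0e5
```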
Abstract:
Objectives: To evaluate the biological and technical complication rates of fixed dental prostheses (FDPs) with end abutments or cantilever extensions supported by teeth (FDP-tt/cFDP-tt), by implants (FDP-ii/cFDP-ii) or by tooth-implant combinations (FDP-ti/cFDP-ti) in patients treated for chronic periodontitis. Material and methods: From a cohort of 392 patients treated between 1978 and 2002 by graduate students, 199 were re-examined in 2005. Of these, 84 patients had received ceramo-metal FDPs (six groups). Results: At the re-evaluation, the mean age of the patients was 62 years (36.2–83.4). One hundred and seventy-five FDPs were seated (82 FDP-tt, 9 FDP-ii, 20 FDP-ti, 39 cFDP-tt, 15 cFDP-ii, 10 cFDP-ti). The mean observation time was 11.3 years; 21 FDPs were lost, and 46 technical and 50 biological complications occurred. The chances of survival were very high for the three groups of FDPs with end abutments (risk of failure 2.8%, 0% and 5.6%). The probability of remaining free of complications and/or failure at 10 years was 70.3%, 88.9% and 74.7% for FDPs with end abutments, but only 25–49.8% for FDPs with extensions. Conclusions: In patients treated for chronic periodontitis and provided with ceramo-metal FDPs, high survival rates can be expected, especially for FDPs with end abutments. The incidence rates of negative events of any kind increased drastically in the three groups with extension cFDPs (tt, ii, ti). Strategic decisions in the choice of a particular FDP design and of teeth/implants as abutments appear to influence the risks of complications to be expected with fixed reconstructions. If possible, extensions on tooth abutments should be avoided, or used only after a cautious clinical evaluation of all options.
Abstract:
Background Synchronization programs have become standard in the dairy industry in many countries. In Switzerland, these programs are not routinely used for groups of cows, but predominantly as a therapy for individual problem cows. The objective of this study was to compare the effect of a CIDR-Select Synch and a 12-d CIDR protocol on the pregnancy rate in healthy, multiparous dairy cows on Swiss dairy farms. Methods Cows (N = 508) were randomly assigned to CIDR-Select Synch (N = 262) or 12-d CIDR (N = 246) protocols. Cows in the CIDR-Select Synch group received a CIDR and 2.5 ml of buserelin i.m. on d 0. On d 7, the CIDR insert was removed and 5 ml of dinoprost was administered i.m. Cows in the 12-d CIDR group received the CIDR on d 0, and it was removed on d 12 (the routine CIDR protocol in Swiss dairies). On d 0, a milk sample for progesterone analysis was taken. Cows were inseminated upon observed estrus. Pregnancy was determined 35 or more days after artificial insemination. As a first step, the two groups were compared as to indication for treatment, breed, stud book, stall, pasture and farmer's business using chi-square tests or Fisher's exact test. Furthermore, the groups were compared as to age, DIM, number of AIs, number of cows per farm and yearly milk yield per cow using nonparametric ANOVA. A multiple logistic model was used to relate the success of the protocols to all of the available factors, in particular treatment (CIDR-Select Synch/12-d CIDR), milk progesterone value, age, DIM, previous treatment of the uterus, previous gynecological treatment and number of preceding inseminations. Results The pregnancy rate was higher in cows following the CIDR-Select Synch protocol than the 12-d CIDR protocol (50.4% vs. 22.4%; P < 0.0001). Conclusion The CIDR-Select Synch protocol may be highly recommended for multiparous dairy cows. The reduced time span of the progesterone insert decreased the number of days open and improved the pregnancy rate compared to the 12-d CIDR protocol, and the cows did not have to be handled more often.
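As a rough illustration of the multiple logistic model described, a sketch using statsmodels is given below; the DataFrame layout and column names (pregnant, treatment, progesterone, age, dim, prior_uterine_tx, prior_gyn_tx, n_prior_ai) are hypothetical placeholders, not the study's actual variables.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_pregnancy_model(df: pd.DataFrame):
    """Relate pregnancy (0/1) to protocol and cow-level covariates,
    mirroring the factor list quoted in the abstract."""
    model = smf.logit(
        "pregnant ~ C(treatment) + progesterone + age + dim"
        " + prior_uterine_tx + prior_gyn_tx + n_prior_ai",
        data=df,
    ).fit()
    return model   # exponentiating the coefficients yields odds ratios

# usage (with a suitably prepared df): fit_pregnancy_model(df).summary()
```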
Abstract:
The present study aims to investigate the implications of web-based delivery of identical learning content for time efficiency and students' performance, as compared to conventional textbook resources.
Abstract:
We developed a real-time PCR which allowed the highly sensitive detection of Naegleria fowleri in histological brain tissue sections from experimentally infected mice. This genus-specific small-subunit (18S) rRNA gene-based PCR can complement conventional (immuno-) histology for the diagnosis of primary amoebic meningoencephalitis in paraffin-embedded brain necropsy specimens that had been fixed in formalin buffered with phosphate-buffered saline.
Abstract:
Searching for the neural correlates of visuospatial processing using functional magnetic resonance imaging (fMRI) is usually done in an event-related framework of cognitive subtraction, applying a paradigm comprising visuospatial cognitive components and a corresponding control task. Besides methodological caveats of the cognitive subtraction approach, the standard general linear model with fixed hemodynamic response predictors bears the risk of being underspecified. It does not take into account the variability of the blood oxygen level-dependent signal response due to variable task demand and performance on the level of each single trial. This underspecification may result in reduced sensitivity regarding the identification of task-related brain regions. In a rapid event-related fMRI study, we used an extended general linear model including single-trial reaction-time-dependent hemodynamic response predictors for the analysis of an angle discrimination task. In addition to the already known regions in superior and inferior parietal lobule, mapping the reaction-time-dependent hemodynamic response predictor revealed a more specific network including task demand-dependent regions not being detectable using the cognitive subtraction method, such as bilateral caudate nucleus and insula, right inferior frontal gyrus and left precentral gyrus.
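The extension amounts to augmenting the design matrix with a predictor whose trial-by-trial amplitude follows the measured reaction time. A minimal numpy sketch of such a design matrix, with invented onsets, reaction times and a canonical two-gamma HRF (not the study's actual design), might look like this:

```python
import numpy as np
from scipy.stats import gamma

TR, n_scans = 2.0, 200
t = np.arange(0, 32, TR)
hrf = gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)    # canonical two-gamma shape
hrf /= hrf.max()

onsets = np.array([10, 50, 90, 130, 170])          # trial onsets (scan indices)
rts = np.array([0.9, 1.4, 1.1, 2.0, 1.2])          # per-trial reaction times (s)

fixed = np.zeros(n_scans)
modulated = np.zeros(n_scans)
fixed[onsets] = 1.0                                 # constant-amplitude events
modulated[onsets] = rts - rts.mean()                # RT-dependent amplitudes

X = np.column_stack([
    np.convolve(fixed, hrf)[:n_scans],              # standard fixed predictor
    np.convolve(modulated, hrf)[:n_scans],          # single-trial RT predictor
    np.ones(n_scans),                               # intercept
])
# Fitting y = X @ beta voxel-wise then maps the RT-dependent beta (column 1).
```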
Abstract:
Markov chain Monte Carlo is a method of producing a correlated sample in order to estimate features of a complicated target distribution via simple ergodic averages. A fundamental question in MCMC applications is when should the sampling stop? That is, when are the ergodic averages good estimates of the desired quantities? We consider a method that stops the MCMC sampling the first time the width of a confidence interval based on the ergodic averages is less than a user-specified value. Hence calculating Monte Carlo standard errors is a critical step in assessing the output of the simulation. In particular, we consider the regenerative simulation and batch means methods of estimating the variance of the asymptotic normal distribution. We describe sufficient conditions for the strong consistency and asymptotic normality of both methods and investigate their finite sample properties in a variety of examples.
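A minimal sketch of the fixed-width stopping rule with a batch means variance estimate is given below; the target distribution, proposal scale, batch count and tolerances are illustrative choices, not prescriptions from the paper.

```python
import numpy as np

def batch_means_se(x, n_batches=None):
    """Monte Carlo standard error of the sample mean via non-overlapping batch means."""
    n = len(x)
    if n_batches is None:
        n_batches = int(np.sqrt(n))                # common choice: ~sqrt(n) batches
    b = n // n_batches                             # batch size
    means = x[: n_batches * b].reshape(n_batches, b).mean(axis=1)
    return means.std(ddof=1) / np.sqrt(n_batches)

def run_until_fixed_width(step, x0, eps=0.02, z=1.96, check_every=10_000):
    """Extend the chain until the confidence-interval half-width z * MCSE < eps."""
    chain = [x0]
    while True:
        for _ in range(check_every):
            chain.append(step(chain[-1]))
        x = np.asarray(chain)
        if z * batch_means_se(x) < eps:            # fixed-width stopping criterion
            return x.mean(), len(x)

# Toy example: random-walk Metropolis chain targeting a standard normal.
rng = np.random.default_rng(0)
def rw_step(x):
    prop = x + rng.normal(scale=2.0)               # symmetric proposal
    return prop if np.log(rng.uniform()) < (x**2 - prop**2) / 2 else x

est, n = run_until_fixed_width(rw_step, 0.0)
print(est, n)   # estimate of E[X] = 0 and the chain length the rule required
```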
Abstract:
The number of record-breaking events expected to occur in a strictly stationary time-series depends only on the number of values in the time-series, regardless of distribution. This holds whether the events are record-breaking highs or lows and whether we count from past to present or present to past. However, these symmetries are broken in distinct ways by trends in the mean and variance. We define indices that capture this information and use them to detect weak trends from multiple time-series. Here, we use these methods to answer the following questions: (1) Is there a variability trend among globally distributed surface temperature time-series? We find a significant decreasing variability over the past century for the Global Historical Climatology Network (GHCN). This corresponds to about a 10% change in the standard deviation of inter-annual monthly mean temperature distributions. (2) How are record-breaking high and low surface temperatures in the United States affected by time period? We investigate the United States Historical Climatology Network (USHCN) and find that the ratio of record-breaking highs to lows in 2006 increases as the time-series extend further into the past. When we consider the ratio as it evolves with respect to a fixed start year, we find it is strongly correlated with the ensemble mean. We also compare the ratios for USHCN and GHCN (minus USHCN stations). We find the ratios grow monotonically in the GHCN data set, but not in the USHCN data set. (3) Do we detect either mean or variance trends in annual precipitation within the United States? We find that the total annual and monthly precipitation in the United States (USHCN) has increased over the past century. Evidence for a trend in variance is inconclusive.
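For a concrete sense of the baseline symmetry, the sketch below counts record highs and lows in synthetic series: in an i.i.d. (stationary) series of length n the expected number of records is the harmonic number H_n regardless of distribution, and a trend in the mean inflates the high/low ratio. The data and trend size are invented.

```python
import numpy as np

def count_records(x, kind="high"):
    """Number of record-breaking events scanning the series left to right."""
    best = x[0]
    count = 1                                     # the first value is a record
    for v in x[1:]:
        if (kind == "high" and v > best) or (kind == "low" and v < best):
            best, count = v, count + 1
    return count

rng = np.random.default_rng(1)
n = 100
stationary = rng.normal(size=n)
warming = rng.normal(size=n) + 0.02 * np.arange(n)    # linear trend in the mean

H_n = np.sum(1.0 / np.arange(1, n + 1))               # expected records ~= 5.19
for label, series in [("stationary", stationary), ("trend", warming)]:
    highs = count_records(series, "high")
    lows = count_records(series, "low")
    print(label, highs, lows, highs / lows, round(H_n, 2))
# A mean trend inflates the high/low ratio relative to the symmetric case.
```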
Abstract:
This thesis develops high-performance real-time signal processing modules for direction of arrival (DOA) estimation in localization systems. It proposes highly parallel algorithms for performing subspace decomposition and polynomial rooting, which are traditionally implemented using sequential algorithms. The proposed algorithms address the emerging need for real-time localization in a wide range of applications. As the antenna array size increases, so does the complexity of the signal processing algorithms, making it increasingly difficult to satisfy real-time constraints. This thesis addresses real-time implementation by proposing parallel algorithms that maintain a considerable improvement over traditional ones, especially for systems with a larger number of antenna array elements. Singular value decomposition (SVD) and polynomial rooting are two computationally complex steps and act as the bottleneck to achieving real-time performance. The proposed algorithms are suitable for implementation on field-programmable gate arrays (FPGAs), single-instruction multiple-data (SIMD) hardware or application-specific integrated circuits (ASICs), which offer large numbers of processing elements that can be exploited for parallel processing. The designs proposed in this thesis are modular, easily expandable and easy to implement. First, this thesis proposes a fast-converging SVD algorithm. The proposed method reduces the number of iterations needed to converge to the correct singular values, thus moving closer to real-time performance. A general algorithm and a modular system design are provided, making it easy for designers to replicate and extend the design to larger matrix sizes. Moreover, the method is highly parallel, which can be exploited on the hardware platforms mentioned earlier. A fixed-point implementation of the proposed SVD algorithm is presented. The FPGA design is pipelined to the maximum extent to increase the maximum achievable frequency of operation. The system was developed with the objective of achieving high throughput, and various modern cores available in FPGAs were used to maximize performance; these modules are described in detail. Finally, a parallel polynomial rooting technique based on Newton's method, applicable exclusively to root-MUSIC polynomials, is proposed. Unique characteristics of the root-MUSIC polynomial's complex dynamics were exploited to derive this rooting method. The technique exhibits parallelism and converges to the desired roots within a fixed number of iterations, making it suitable for rooting polynomials of large degree. We believe this is the first time that the complex dynamics of the root-MUSIC polynomial have been analyzed to propose an algorithm. In all, the thesis addresses two major bottlenecks of a direction of arrival estimation system by providing simple, high-throughput, parallel algorithms.
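As a rough illustration of the parallel Newton idea only (not the thesis's root-MUSIC-specific derivation, which exploits structure such as conjugate-reciprocal root pairs), a vectorized Newton iteration run from many starting points at once might look like the following sketch; the polynomial and seeds are arbitrary examples.

```python
import numpy as np

def newton_roots(coeffs, starts, iters=30):
    """Run Newton's method from many starting points at once (vectorized,
    mirroring the per-processing-element parallelism described above).
    coeffs: highest-degree-first polynomial coefficients."""
    p = np.polynomial.Polynomial(coeffs[::-1])    # Polynomial wants lowest-first
    dp = p.deriv()
    z = np.asarray(starts, dtype=complex)
    for _ in range(iters):
        z = z - p(z) / dp(z)                      # one Newton step on all points
    return z

# Example: roots of z^4 - 1, seeded on a circle just outside the unit circle
# (as one might seed near the unit circle for a root-MUSIC spectrum).
seeds = 1.1 * np.exp(2j * np.pi * (np.arange(8) + 0.3) / 8)
print(np.round(newton_roots([1, 0, 0, 0, -1], seeds), 6))
```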
Abstract:
In this report, we attempt to define the capabilities of the infrared satellite remote sensor Multifunctional Transport Satellite-2 (MTSAT-2) (a geosynchronous instrument) in characterizing volcanic eruptive behavior in the highly active region of Indonesia. Sulfur dioxide data from NASA's Ozone Monitoring Instrument (OMI) (a polar-orbiting instrument) are presented here for validation of the processes interpreted using the thermal infrared datasets. Data from two case studies are analyzed specifically for eruptive products producing large thermal anomalies (i.e. lava flows, lava domes, etc.), volcanic ash and SO2 clouds: three distinctly characteristic and abundant volcanic emissions. Two primary methods for the detection of heat signatures are used and compared in this report: single-channel thermal radiance (4 µm) and the normalized thermal index (NTI) algorithm. For automated purposes, fixed thresholds must be determined for these methods. A base minimum detection limit (MDL) of 2.30E+05 W m⁻² sr⁻¹ m⁻¹ for single-channel thermal radiance and of −0.925 for NTI generates false alarm rates of 35.78% and 34.16%, respectively. A spatial comparison method, developed here specifically for use in Indonesia and used as a second detection parameter, is implemented to address the high false alarm rate. For the single-channel thermal radiance method, the spatial comparison method eliminated 100% of the false alarms while retaining every true anomaly. The NTI algorithm showed similar results, with only two false alarms remaining. No definitive difference is observed between the two thermal detection methods for automated use; however, the single-channel thermal radiance method, coupled with the SO2 mass abundance data, can be used to interpret volcanic processes, including the identification of lava dome activity at Sinabung as well as the mechanism of dome emplacement (i.e. endogenous or exogenous). Only one technique, the brightness temperature difference (BTD) method, is used for the detection of ash. Trends of ash area, water/ice area and their respective concentrations support interpretations of increased ice formation, aggregation and sedimentation processes that only a high-temporal-resolution instrument like MTSAT-2 can resolve. A conceptual model of a secondary zone of aggregation occurring in the migrating Kelut ash cloud, which decreases the distal fine-ash component and the hazards to flight paths, is presented in this report. Unfortunately, the SO2 data could not definitively reinforce the concept of a secondary zone of aggregation owing to insufficient temporal resolution. However, a detailed study of the Kelut SO2 cloud shows that the eruption had no climatic impact, given the atmospheric residence times and the e-folding rate of ~14 days for the SO2. This report exploits the complementary assets of a high-temporal-resolution and a high-spatial-resolution satellite, and it demonstrates that these two instruments can provide unparalleled observations of dynamic volcanic processes.
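A sketch of the two fixed-threshold heat-detection tests follows, assuming the usual normalized-thermal-index form NTI = (L4 − L12) / (L4 + L12) with the 4-µm and a ~12-µm channel, and using the thresholds quoted above; the radiance arrays are synthetic placeholders, not MTSAT-2 data.

```python
import numpy as np

MDL_4UM = 2.30e5        # single-channel minimum detection limit (W m^-2 sr^-1 m^-1)
NTI_THRESHOLD = -0.925  # fixed NTI threshold quoted in the abstract

def detect_anomalies(l4, l12):
    """Return boolean hot-pixel masks for both fixed-threshold methods."""
    nti = (l4 - l12) / (l4 + l12)
    return l4 > MDL_4UM, nti > NTI_THRESHOLD

# Synthetic 3x3 scene: one hot pixel embedded in background radiances.
l4 = np.full((3, 3), 1.0e5); l4[1, 1] = 5.0e5
l12 = np.full((3, 3), 6.0e6)
single_mask, nti_mask = detect_anomalies(l4, l12)
print(single_mask[1, 1], nti_mask[1, 1])   # both methods flag the hot pixel
# The spatial-comparison step would next reject isolated background exceedances.
```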