908 results for Microhardness machine


Relevance:

20.00%

Publisher:

Abstract:

This study investigated the impact of using a translation memory (TM) and of post-editing (PE) raw machine-translation output on perceived difficulty and on the time needed to produce a high-quality final text. The experiment involved six native Italian-speaking students from the Master's degree programme in Specialized Translation at the University of Bologna (Forlì campus). Participants were divided into three pairs, each of which was assigned an excerpt from an English press release. In each pair, one participant was asked to translate the text into Italian using the TM within SDL Trados Studio 2011; the other was asked to fully post-edit, into Italian, the raw output obtained from Google Translate. When the TM or the raw output contained no (correct) translations, participants could consult the Internet. Using Think-Aloud Protocols (TAPs), they were asked to verbalize their thoughts while performing the tasks. This made it possible to identify the translation problems they encountered and the cases in which the TM and the raw output provided correct solutions; it was also possible to observe the translation strategies employed, after which participants were asked to rate their difficulty in retrospective interviews. The time taken by each participant was also measured. The data on perceived difficulty and on time were related to the number of correct solutions provided by the TM and by the raw output, respectively. Using the TM was found to save more time and, unlike PE, to reduce perceived difficulty. This study aims to help future professional translators choose technological tools that allow them to save time and resources.

Relevance:

20.00%

Publisher:

Abstract:

The first part of this document gives a brief introduction to the worlds of mobile computing, cloud computing and social networks. The second part focuses on the design of an application for mobile devices using the Facebook and Parse technologies. Finally, an Android application is implemented using the techniques described above.

Relevance:

20.00%

Publisher:

Abstract:

Within CMS, a Data Analytics project has been launched and, as part of it, a specific pilot activity that aims to exploit Machine Learning techniques to predict the popularity of CMS datasets. This is a very delicate observable: predicting it would allow CMS to build smarter data placement models and to broadly optimize storage usage at all Tier levels, and it would form the basis for introducing a solid, dynamic and adaptive data management system. This thesis describes the work done with a new pilot prototype called DCAFPilot, written entirely in Python, to tackle this challenge.
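The popularity prediction task described above can be framed as supervised classification. The following is a minimal sketch of that idea; all feature names, data and labels are invented for illustration, and DCAFPilot's actual features and models may differ.

```python
# Hypothetical sketch: predict whether a dataset will be "popular" from
# simple usage features with a random forest. All feature names, data and
# labels are invented; DCAFPilot's real pipeline may differ.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.poisson(20, n),       # accesses in the previous week (invented)
    rng.poisson(5, n),        # distinct users (invented)
    rng.uniform(1, 1000, n),  # dataset size in GB (invented)
])
# Invented label: "popular" if past accesses exceed a noisy threshold
y = (X[:, 0] + rng.normal(0, 5, n) > 25).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)  # held-out classification accuracy
```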

Relevance:

20.00%

Publisher:

Abstract:

This thesis introduces and studies Big Data, paying particular attention to the NoSQL world, focusing on MongoDB, and to the Machine Learning world, focusing on PredictionIO. An application was then developed using web technologies, Node.js, node-webkit and the technologies examined above. The application uses polynomial interpolation to predict the price of a good from the history stored in MongoDB. Through PredictionIO, it analyses the behaviour of other users and recommends products to purchase. Finally, the error produced by the interpolation was analysed.
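The polynomial-interpolation step described above can be sketched as follows. The price history, polynomial degree and one-step extrapolation are illustrative assumptions; the thesis' MongoDB storage and PredictionIO recommender are not reproduced here.

```python
# Minimal sketch of price prediction via polynomial interpolation, using a
# synthetic price history in place of the MongoDB-stored one.
import numpy as np

days = np.arange(10)                       # time axis of the stored history
prices = 100 + 2.0 * days + 0.5 * days**2  # synthetic quadratic price history

coeffs = np.polyfit(days, prices, deg=2)   # least-squares polynomial fit
predicted = np.polyval(coeffs, 10)         # extrapolate the price on day 10

# Interpolation error on the known history (the thesis analyses this error)
residual = prices - np.polyval(coeffs, days)
```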

Relevance:

20.00%

Publisher:

Abstract:

AIM: The purpose of this study was to evaluate the activation of a resin-modified glass ionomer restorative material (RMGI, Vitremer, 3M ESPE, shade A3) by halogen lamp (QTH) or light-emitting diode (LED), using Knoop microhardness (KHN), at two storage times (24 hours and 6 months) and two depths (0 and 2 mm). MATERIALS AND METHODS: The specimens were randomly divided into 3 experimental groups (n=10) according to activation mode and evaluated in depth after 24 hours and after 6 months of storage. Activation was performed with QTH for 40 s (700 mW/cm²) and with LED for 40 or 20 s (1,200 mW/cm²). After 24 hours and 6 months of storage at 37 °C in relative humidity in a lightproof container, the Knoop microhardness test was performed. STATISTICS: Data were analysed by three-way ANOVA and Tukey post-hoc tests (p<0.05). RESULTS: All evaluated factors showed significant differences (p<0.05). After 24 hours there were no differences among the experimental groups. KHN at 0 mm was significantly higher than at 2 mm. After 6 months, microhardness values increased for all groups, with the LED-activated groups reaching higher values than the QTH-activated ones. CONCLUSION: Light-activation with LED positively influenced the KHN of the RMGI evaluated after 6 months.
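For reference, the Knoop microhardness values reported above come from an indentation test; a common form of the Knoop formula, with load P in kgf and long indent diagonal d in mm, is KHN = 14.229·P/d². The load and diagonal below are illustrative values, not the study's measurements.

```python
# Knoop hardness number from indentation load and long diagonal. The
# constant 14.229 is the standard Knoop indenter geometry factor; the load
# and diagonal used here are illustrative, not the study's data.
def knoop_hardness(load_kgf: float, diagonal_mm: float) -> float:
    """KHN = 14.229 * P / d^2, with P in kgf and d in mm."""
    return 14.229 * load_kgf / diagonal_mm ** 2

khn = knoop_hardness(0.05, 0.030)  # e.g. 50 gf load, 30 µm diagonal
```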

Relevance:

20.00%

Publisher:

Abstract:

Delineating brain tumor boundaries from magnetic resonance images is an essential task for the analysis of brain cancer. We propose a fully automatic method for brain tissue segmentation, which combines Support Vector Machine (SVM) classification using multispectral intensities and textures with subsequent hierarchical regularization based on Conditional Random Fields (CRF). The CRF regularization introduces spatial constraints into the powerful SVM classification, which otherwise assumes voxels to be independent of their neighbors. The approach first separates healthy and tumor tissue before subclassifying the two regions, in a novel hierarchical way, into cerebrospinal fluid, white matter and gray matter, and into necrotic, active and edema regions, respectively. The hierarchical approach adds robustness and speed by allowing different levels of regularization to be applied at different stages. The method is fast and tailored to standard clinical acquisition protocols. It was assessed on 10 multispectral patient datasets, with results outperforming previous methods in terms of segmentation detail and computation time.
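The per-voxel SVM step of the pipeline can be sketched as follows; the CRF regularization and the hierarchical subclassification are omitted, and the multispectral intensity values are invented for illustration.

```python
# Sketch of the per-voxel SVM classification step only (no CRF
# regularization, no hierarchy). Feature values are invented.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n = 400
# Invented per-voxel feature vectors (e.g. T1-, T2-, FLAIR-like intensities)
healthy = rng.normal(loc=[0.3, 0.4, 0.5], scale=0.05, size=(n, 3))
tumor = rng.normal(loc=[0.6, 0.7, 0.8], scale=0.05, size=(n, 3))
X = np.vstack([healthy, tumor])
y = np.array([0] * n + [1] * n)  # 0 = healthy, 1 = tumor

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y)
pred = clf.predict([[0.31, 0.41, 0.52], [0.61, 0.72, 0.79]])
```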

Relevance:

20.00%

Publisher:

Abstract:

The electron Monte Carlo (eMC) dose calculation algorithm available in the Eclipse treatment planning system (Varian Medical Systems) is based on the macro MC method and uses a beam model applicable to Varian linear accelerators. This leads to limitations in accuracy if eMC is applied to non-Varian machines. In this work eMC is generalized to also allow accurate dose calculations for electron beams from Elekta and Siemens accelerators. First, changes made in the previous study to use eMC for low electron beam energies of Varian accelerators are applied. Then, a generalized beam model is developed using a main electron source and a main photon source representing electrons and photons from the scattering foil, respectively, an edge source of electrons, a transmission source of photons and a line source of electrons and photons representing the particles from the scrapers or inserts and head scatter radiation. Regarding the macro MC dose calculation algorithm, the transport code of the secondary particles is improved. The macro MC dose calculations are validated with corresponding dose calculations using EGSnrc in homogeneous and inhomogeneous phantoms. The validation of the generalized eMC is carried out by comparing calculated and measured dose distributions in water for Varian, Elekta and Siemens machines for a variety of beam energies, applicator sizes and SSDs. The comparisons are performed in units of cGy per MU. Overall, a general agreement between calculated and measured dose distributions for all machine types and all combinations of parameters investigated is found to be within 2% or 2 mm. The results of the dose comparisons suggest that the generalized eMC is now suitable to calculate dose distributions for Varian, Elekta and Siemens linear accelerators with sufficient accuracy in the range of the investigated combinations of beam energies, applicator sizes and SSDs.
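A simplified illustration of the "within 2% or 2 mm" agreement criterion used in the validation: the sketch below checks 1-D depth-dose curves with a 2% dose-difference test and a crude 2 mm distance-to-agreement fallback. The curves and grid are invented, and this is an assumption-laden simplification, not the full gamma analysis commonly used in dosimetry.

```python
# Simplified 1-D "within 2% or 2 mm" check for depth-dose curves: a point
# passes if the dose difference is within 2% of the reference maximum, or
# if a reference point within 2 mm matches the evaluated dose. Crude sketch
# only; not a full gamma analysis.
import numpy as np

def passes_2pct_2mm(depth_mm, ref_dose, eval_dose):
    tol = 0.02 * ref_dose.max()                # 2% dose tolerance
    ok = np.abs(eval_dose - ref_dose) <= tol
    for i in np.where(~ok)[0]:
        near = np.abs(depth_mm - depth_mm[i]) <= 2.0   # 2 mm neighborhood
        ok[i] = np.any(np.abs(ref_dose[near] - eval_dose[i]) <= tol)
    return bool(ok.all())

depth = np.linspace(0.0, 50.0, 101)        # 0.5 mm grid (illustrative)
ref = np.exp(-0.05 * depth)                # toy exponential depth dose
shifted = np.exp(-0.05 * (depth + 1.0))    # same curve shifted by 1 mm
agrees = passes_2pct_2mm(depth, ref, shifted)  # passes via the 2 mm branch
```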

Relevance:

20.00%

Publisher:

Abstract:

Objectives: The aim of this study was to examine the effect of pre-warmed composite on microhardness and marginal adaptation. Methods: Ninety-six identical Class II cavities were prepared in extracted human molars and filled/cured in three 2 mm increments using a metal matrix. Two composites, Tetric Evo Ceram (Ivoclar Vivadent) and ELS (Saremco), were cured with an LED curing unit (Bluephase, Ivoclar Vivadent) using curing times of 20 and 40 seconds. The composite was used at room temperature or pre-warmed to 54.5 °C (Calset, AdDent). Twelve teeth were filled for every combination of composite, curing time and composite temperature. The teeth were thermocycled (1000 cycles at 5 °C and 55 °C) and then stored at 37 °C for seven days. Dye penetration (basic fuchsine 5% for 8 hours) was measured using a score scale. Knoop microhardness was determined 100, 200, 500, 1000, 1500, 2500, 3500, 4500 and 5500 µm from the occlusal surface, at distances of 150 and 1000 µm from the metal matrix. The total degree of polymerization of a composite specimen was determined by calculating the area under the hardness curve. Results: Statistical analyses showed no difference in marginal adaptation (p>0.05). Hardness values at 150 µm from the matrix were lower than those at 1000 µm. Microhardness increased at the top of each increment and decreased towards the bottom of each increment. Longer curing times resulted in harder composite samples. Multiple linear regression showed that only curing time (p<0.001) and composite material (p<0.001) had a significant association with the degree of polymerization. The degree of polymerization was not influenced by pre-warming the composite to a temperature of 54.5 °C (p=4.86). Conclusion: Polymerization time cannot be reduced by pre-warming the composite to a temperature of 54.5 °C. Marginal adaptation is not compromised by pre-warming the composite.
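The "area under the hardness curve" metric used above can be approximated on the discrete measurement depths with the trapezoidal rule; the hardness values below are invented for illustration, only the depth schedule comes from the abstract.

```python
# "Total degree of polymerization" as the area under the hardness-versus-
# depth curve, approximated with the trapezoidal rule. KHN values are
# invented; the depths match the abstract's measurement schedule.
import numpy as np

depth_um = np.array([100, 200, 500, 1000, 1500, 2500, 3500, 4500, 5500], float)
khn = np.array([60, 58, 55, 52, 50, 47, 45, 44, 43], float)  # invented KHN

# Trapezoidal rule: sum of interval widths times mean endpoint hardness
area = float(np.sum(np.diff(depth_um) * (khn[:-1] + khn[1:]) / 2.0))  # KHN·µm
```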

Relevance:

20.00%

Publisher:

Abstract:

Objective. The purpose of this study was to determine the dose profile of the Cranex Tome radiography unit and compare it with that of the Scanora machine. Study design. The radiation dose delivered by the Cranex Tome radiography unit during the cross-sectional mode was determined. Single-tooth gaps in regions 3 (16) and 30 (46) were simulated. Dosimetry was carried out with 2 phantoms, a head-and-neck phantom and a full-body phantom, loaded with 142 and 280 thermoluminescent dosimeters (TLDs), respectively; all locations corresponded to radiosensitive organs or tissues. The recorded local mean organ doses were compared with those measured in another study evaluating the Scanora machine. Results. Generally, dose values from the Cranex Tome radiography unit reached only 50% to 60% of the values measured for the Scanora machine. The effective dose was calculated as 0.061 mSv and 0.04 mSv for tooth regions 3 (16) and 30 (46), respectively. Corresponding values for the Scanora machine were 0.117 mSv and 0.084 mSv. Conclusion. Cross-sectional imaging in the molar region of the upper and the lower jaw can be performed with the Cranex Tome unit, which delivers only approximately half of the dose that the Scanora machine delivers.
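For context, an effective dose such as the 0.061 mSv reported above is a weighted sum of organ equivalent doses, E = Σ w_T·H_T, with ICRP tissue weighting factors. The sketch below uses a small subset of the ICRP 103 factors (the study may have applied an earlier ICRP recommendation) and invented organ doses; it is illustrative only, not the study's dosimetry.

```python
# Illustrative effective-dose calculation, E = sum(w_T * H_T). Weighting
# factors are a subset of ICRP 103 values; organ doses are invented.
weights = {
    "thyroid": 0.04,          # ICRP 103 tissue weighting factors (subset)
    "bone_surface": 0.01,
    "salivary_glands": 0.01,
    "brain": 0.01,
}
organ_dose_mSv = {            # invented equivalent doses, in mSv
    "thyroid": 0.50,
    "bone_surface": 0.80,
    "salivary_glands": 1.20,
    "brain": 0.30,
}
effective_dose = sum(weights[t] * organ_dose_mSv[t] for t in weights)
```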