982 results for dynamic geometry software
Abstract:
According to a traditional rationalist proposal, it is possible to attain knowledge of certain necessary truths by means of insight—an epistemic mental act that combines the 'presentational' character of perception with the a priori status usually reserved for discursive reasoning. In this dissertation, I defend the insight proposal in relation to a specific subject matter: elementary Euclidean plane geometry, as set out in Book I of Euclid's Elements. In particular, I argue that visualizations and visual experiences of diagrams allow human subjects to grasp truths of geometry by means of visual insight. In the first two chapters, I provide an initial defense of the geometrical insight proposal, drawing on a novel interpretation of Plato's Meno to motivate the view and to reply to some objections. In the remaining three chapters, I provide an account of the psychological underpinnings of geometrical insight, a task that requires considering the psychology of visual imagery alongside the details of Euclid's geometrical system. One important challenge is to explain how basic features of human visual representations can serve to ground our intuitive grasp of Euclid's postulates and other initial assumptions. A second challenge is to explain how we are able to grasp general theorems by considering diagrams that depict only special cases. I argue that both of these challenges can be met by an account that regards geometrical insight as based in visual experiences involving the combined deployment of two varieties of 'dynamic' visual imagery: one that allows the subject to visually rehearse spatial transformations of a figure's parts, and another that allows the subject to entertain alternative ways of structurally integrating the figure as a whole. It is the interplay between these two forms of dynamic imagery that enables a visual experience of a diagram, suitably animated in visual imagination, to justify belief in the propositions of Euclid’s geometry. The upshot is a novel dynamic imagery account that explains how intuitive knowledge of elementary Euclidean plane geometry can be understood as grounded in visual insight.
Abstract:
This work proposes a model to investigate the use of a cylindrical antenna in the thermal recovery of high-viscosity oil through electromagnetic radiation. The antenna has a simple geometry, an adapted dipole type, and can be modelled using Maxwell's equations. Wavelet transforms are used as basis functions and applied in conjunction with the method of moments to obtain the current distribution on the antenna. The electric field, power, and temperature distributions are carefully calculated to analyze the antenna as an electromagnetic heating source. The energy performance is analyzed on the basis of thermo-fluid dynamic simulations at field scale, through an adaptation of the Steam, Thermal, and Advanced Processes Reservoir Simulator (STARS) by the Computer Modelling Group (CMG). The proposed model and the numerical results obtained are stable and show good agreement with results reported in the specialized literature.
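To make the numerical scheme concrete, the following is a minimal Galerkin method-of-moments sketch in Python: it assembles an impedance matrix from an integral-equation kernel and solves Z I = V for the current coefficients. The kernel, wire radius, wavenumber, and delta-gap feed are illustrative assumptions; the thesis itself uses a thin-wire dipole kernel with wavelet basis functions.

```python
import numpy as np

# Minimal method-of-moments sketch: expand the unknown antenna current in
# N basis functions, assemble the impedance matrix Z from an
# integral-equation kernel, and solve Z I = V for the coefficients.

N = 64                              # number of basis functions (assumed)
L = 1.0                             # antenna length in metres (assumed)
z = np.linspace(-L / 2, L / 2, N)   # match-point coordinates

def kernel(zm, zn):
    """Illustrative stand-in for the thin-wire integral-equation kernel."""
    a = 1e-3                        # wire radius (assumed)
    R = np.sqrt((zm - zn) ** 2 + a ** 2)
    k = 2 * np.pi / 1.0             # wavenumber for unit wavelength (assumed)
    return np.exp(-1j * k * R) / (4 * np.pi * R)

# Assemble the N x N impedance matrix and the excitation vector
Z = np.array([[kernel(zm, zn) for zn in z] for zm in z])
V = np.zeros(N, dtype=complex)
V[N // 2] = 1.0                     # delta-gap feed at the antenna centre

I = np.linalg.solve(Z, V)           # current distribution coefficients
print(np.abs(I).max())
```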
Abstract:
Technologies for Big Data and Data Science are receiving increasing research interest nowadays. This paper introduces the prototype architecture of a tool aimed at solving Big Data optimization problems. Our tool combines the jMetal framework for multi-objective optimization with Apache Spark, a technology that is gaining momentum. In particular, we make use of the streaming facilities of Spark to feed an optimization problem with data from different sources. We demonstrate the use of our tool by solving a dynamic bi-objective instance of the Traveling Salesman Problem (TSP) based on near real-time traffic data from New York City, which is updated several times per minute. Our experiment shows that jMetal and Spark can be integrated to provide a software platform for dealing with dynamic multi-objective optimization problems.
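The core loop of such a tool can be illustrated without either library: an optimizer that keeps improving a tour while polling a stream for updated costs. The sketch below uses a single-objective 2-opt improver as a stand-in for the bi-objective jMetal metaheuristic, and `poll_traffic_update` is a hypothetical placeholder for the Spark Streaming source.

```python
import random

# Library-free sketch of a dynamic TSP whose cost matrix is refreshed from
# a data stream (in the paper, Spark Streaming feeding jMetal).

def tour_cost(tour, cost):
    return sum(cost[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def poll_traffic_update(n):
    """Hypothetical stream poll: returns a fresh cost matrix or None."""
    if random.random() < 0.3:   # pretend an update arrives ~30% of the time
        return [[random.random() for _ in range(n)] for _ in range(n)]
    return None

n = 20
cost = [[random.random() for _ in range(n)] for _ in range(n)]
tour = list(range(n))

for step in range(1000):
    update = poll_traffic_update(n)
    if update is not None:
        cost = update           # problem changed: evaluate against new data
    i, j = sorted(random.sample(range(n), 2))
    candidate = tour[:i] + tour[i:j][::-1] + tour[j:]   # 2-opt move
    if tour_cost(candidate, cost) < tour_cost(tour, cost):
        tour = candidate

print(tour_cost(tour, cost))
```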
Abstract:
This thesis analyzes the influence of vegetation elements on the dynamics and dispersion of pollutants in the urban street canyon. In particular, the fluid-dynamic response of bushes of different heights and of trees with varying crown porosity and trunk height was analyzed. The model studied consists of two buildings of height and width H and length 10H, separated by a street along which a source representative of vehicular traffic was modeled, flanked by two rows of vegetation elements. The simulations were carried out with ANSYS Fluent, a Computational Fluid Dynamics (CFD) software package that made it possible to model the flow dynamics and to simulate the concentrations emitted by the CO source placed along the street. A RANS model with k-epsilon closure was used, which parameterizes the second-order moments in the Navier-Stokes equations to make their resolution easier. The results are expressed in terms of velocity profiles and CO molar concentration, together with the exchange velocity, computed to quantify the exchanges between the street canyon and the outside. As for the influence of trunk height, a nonlinear trend between trunk height and the exchange velocity was found. For bush height, instead, a one-to-one relationship was observed: as the bushes grow taller, the exchange velocity decreases. Finally, varying the permeability of the tree crowns revealed a non-monotonic relationship between the exchange velocity and the parameter C_2, which was interpreted through the different behavior of the windward and leeward profiles. In conclusion, at the current stage of the research presented in this thesis, it is not yet possible to correlate the exchange velocity directly with any of the parameters analyzed.
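A common definition of the exchange velocity, and presumably the one computed from the Fluent fields here, is the roof-level turbulent pollutant flux divided by the mean in-canyon concentration. A minimal sketch of that diagnostic, with synthetic samples standing in for values extracted from the CFD solution:

```python
import numpy as np

# Sketch of the exchange-velocity diagnostic under the common definition
# u_E = <w'c'>_roof / C_canyon: turbulent pollutant flux through the canyon
# top divided by the mean in-canyon concentration. Arrays are synthetic
# stand-ins for roof-level and in-canyon samples of the CFD solution.

rng = np.random.default_rng(0)
w_roof = rng.normal(0.0, 0.5, 1000)     # vertical velocity samples [m/s]
c_roof = rng.normal(40.0, 5.0, 1000)    # CO concentration samples [ppm]
c_canyon_mean = 55.0                    # mean canyon concentration [ppm]

# Turbulent flux <w'c'> from the covariance of the fluctuations
flux = np.mean((w_roof - w_roof.mean()) * (c_roof - c_roof.mean()))
u_exchange = flux / c_canyon_mean
print(f"exchange velocity ~ {u_exchange:.4f} m/s")
```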
Abstract:
With the progress of computer technology, computers are expected to be more intelligent in their interaction with humans, presenting information according to the user's psychological and physiological characteristics. However, computer users with visual problems may encounter difficulties in perceiving icons, menus, and other graphical information displayed on the screen, limiting the efficiency of their interaction with computers. In this dissertation, a personalized and dynamic image precompensation method was developed to improve the visual performance of computer users with ocular aberrations. The precompensation was applied to the graphical targets before presenting them on the screen, aiming to counteract the visual blurring caused by the ocular aberrations of the user's eye. A complete and systematic modeling approach to describe the retinal image formation of the computer user is presented, taking advantage of modeling tools such as Zernike polynomials, wavefront aberration, the Point Spread Function, and the Modulation Transfer Function. The ocular aberration of the computer user was first measured by a wavefront aberrometer, as a reference for the precompensation model. The dynamic precompensation was generated based on the resized aberration, with the pupil diameter monitored in real time. The potential visual benefit of the dynamic precompensation method was explored through software simulation, with aberration data from a real human subject. An 'artificial eye' experiment was conducted by simulating the human eye with a high-definition camera, providing an objective evaluation of the image quality after precompensation. In addition, an empirical evaluation with 20 human participants was designed and implemented, involving image recognition tests performed under a more realistic viewing environment of computer use. The statistical analysis of the empirical experiment confirmed the effectiveness of the dynamic precompensation method, showing a significant improvement in recognition accuracy. The merit and necessity of dynamic precompensation were also substantiated by comparison with static precompensation. The visual benefit of the dynamic precompensation was further confirmed by the subjective assessments collected from the evaluation participants.
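The central precompensation step can be sketched as a regularized inverse filter: given the PSF of the aberrated eye, pre-filter the screen image so that the eye's own blur approximately cancels. The sketch below assumes the common Wiener formulation and a synthetic Gaussian PSF; the dissertation derives the actual PSF from measured wavefront aberrations and the monitored pupil diameter.

```python
import numpy as np

# Wiener inverse-filter precompensation sketch: pre-filter the image with
# a regularized inverse of the eye's PSF so the eye's blur restores it.

def wiener_precompensate(image, psf, k=0.01):
    """Pre-filter `image` with the regularized inverse of `psf`.
    `k` is a noise-regularization constant (assumed)."""
    pad = np.zeros_like(image)
    ph, pw = psf.shape
    pad[:ph, :pw] = psf
    # Roll so the kernel centre sits at (0, 0) with wraparound
    pad = np.roll(pad, (-(ph // 2), -(pw // 2)), axis=(0, 1))
    H = np.fft.fft2(pad)
    G = np.conj(H) / (np.abs(H) ** 2 + k)   # Wiener inverse filter
    return np.real(np.fft.ifft2(np.fft.fft2(image) * G))

# Synthetic Gaussian PSF standing in for the aberrated eye's PSF
y, x = np.mgrid[-15:16, -15:16]
psf = np.exp(-(x ** 2 + y ** 2) / (2 * 4.0 ** 2))
psf /= psf.sum()

image = np.zeros((128, 128))
image[40:88, 60:68] = 1.0                   # a simple bar target
pre = wiener_precompensate(image, psf)
print(pre.min(), pre.max())  # negative lobes: contrast must be rescaled for display
```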
Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and is formally verified using model checking. Second, Petri net models are automatically mined from event traces generated by scientific workflows. Third, partial-order models are automatically extracted from instrumented concurrent program executions, and potential atomicity-violation bugs are automatically verified against these partial-order models using model checking. Our formal specification and verification of Mondex contribute to the worldwide effort to develop a verified software repository. Our method for automatically mining Petri net models from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable, as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the trade-off between precision and coverage. Building on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
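The pattern-level intuition behind pairwise atomicity-violation prediction can be shown compactly: for one shared variable and one pair of threads, a local access pair of thread 1 split by a remote access of thread 2 is a potential violation when the three-access interleaving is unserializable. The sketch below checks exactly those triples; McPatom itself works on partial-order models with model checking, so this is only an illustration, and the trace format is assumed.

```python
# Simplified illustration of the access-pattern view behind
# atomicity-violation prediction. Trace format (thread, op, var) is assumed.
# The four classic unserializable t1-t2-t1 interleavings:

UNSERIALIZABLE = {("R", "W", "R"), ("W", "W", "R"),
                  ("R", "W", "W"), ("W", "R", "W")}

def potential_violations(trace, var, t1, t2):
    """Find unserializable t1/t2/t1 access triples on `var`."""
    ops = [(th, op) for th, op, v in trace if v == var and th in (t1, t2)]
    hits = []
    for i in range(len(ops) - 2):
        (a_th, a), (b_th, b), (c_th, c) = ops[i:i + 3]
        if a_th == c_th == t1 and b_th == t2 and (a, b, c) in UNSERIALIZABLE:
            hits.append((a, b, c))
    return hits

trace = [("T1", "R", "x"), ("T2", "W", "x"), ("T1", "W", "x"),
         ("T1", "R", "y"), ("T2", "R", "y"), ("T1", "W", "y")]
print(potential_violations(trace, "x", "T1", "T2"))   # [('R', 'W', 'W')]
```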
Abstract:
Monitoring user interaction activities provides the basis for creating a user model that can be used to predict user behaviour and enable user-assistant services. The BaranC framework provides components that perform UI monitoring (and collect all associated context data), build a user model, and support services that make use of the user model. In this case study, a Next-App prediction service is built to demonstrate the use of the framework and to evaluate the usefulness of such a prediction service. Next-App analyses a user's data, learns patterns, builds a model for the user, and finally predicts, based on the user model and current context, which application(s) the user is likely to want to use next. The prediction is pro-active and dynamic: it is dynamic both in responding to the current context and in responding to changes in the user model, as might occur over time as a user's habits change. Initial evaluation of Next-App indicates a high level of satisfaction with the service.
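A minimal sketch of the idea behind such a prediction service (names hypothetical, not the BaranC API): score candidate apps by how often they followed the current app in a similar context, with exponential decay so that the model tracks changing habits.

```python
from collections import defaultdict

# Hypothetical next-app predictor: context is (previous app, hour of day);
# evidence decays exponentially so old habits fade as the model evolves.

DECAY = 0.99

class NextAppModel:
    def __init__(self):
        self.counts = defaultdict(float)   # (prev_app, hour, next_app) -> weight

    def observe(self, prev_app, hour, next_app):
        for key in list(self.counts):      # decay all existing evidence
            self.counts[key] *= DECAY
        self.counts[(prev_app, hour, next_app)] += 1.0

    def predict(self, prev_app, hour, top_k=3):
        scores = {app: w for (p, h, app), w in self.counts.items()
                  if p == prev_app and h == hour}
        return sorted(scores, key=scores.get, reverse=True)[:top_k]

model = NextAppModel()
for _ in range(5):
    model.observe("mail", 9, "calendar")
model.observe("mail", 9, "browser")
print(model.predict("mail", 9))   # ['calendar', 'browser']
```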
Abstract:
In this Thesis, a series of numerical models for evaluating the seasonal performance of reversible air-to-water heat pump systems coupled to residential and non-residential buildings is presented. Exploiting the energy-saving potential of heat pumps is a hard task for designers, because their energy performance is influenced by several factors, such as the variability of the external climate, the heat pump's modulation capacity, the system control strategy, and the hydronic loop configuration. The aim of this work is to study all these aspects in detail. In the first part of this Thesis, a series of models which use a temperature-class approach to predict the seasonal performance of reversible air-source heat pumps is presented. An innovative methodology for calculating the seasonal performance of an air-to-water heat pump is proposed as an extension of the procedure reported in the European standard EN 14825. This methodology can be applied not only to air-to-water single-stage heat pumps (On-off HPs) but also to multi-stage (MSHPs) and inverter-driven units (IDHPs). In the second part, dynamic simulation is used to optimize the control systems of the heat pump and of the HVAC plant. A series of dynamic models, developed by means of TRNSYS, is presented to study the behavior of On-off HPs, MSHPs, and IDHPs. The main goal of these dynamic simulations is to show how the heat pump control strategy, and the layout of the hydronic loop used to couple the heat pump to the emitters, influence the seasonal performance of the system. A particular focus is given to modeling the energy losses linked to on-off cycling.
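The temperature-class approach can be illustrated with a toy bin calculation in the spirit of EN 14825: seasonal performance is the total heat delivered over the total electricity drawn, summed over outdoor-temperature classes. The numbers below are illustrative, not the standard's tabulated values; cycling losses of On-off units would enter through the per-bin COP.

```python
# Toy temperature-class (bin) calculation of seasonal heating performance:
# each class has a number of hours, a building load, and a heat pump COP.

bins = [
    # (outdoor T [degC], hours in class, building load [kW], COP at this T)
    (-7,  200, 8.0, 2.1),
    ( 2,  800, 5.5, 2.8),
    ( 7, 1000, 3.5, 3.4),
    (12,  600, 1.5, 4.2),
]

heat = sum(hours * load for _, hours, load, _ in bins)                 # kWh delivered
electricity = sum(hours * load / cop for _, hours, load, cop in bins)  # kWh drawn
scop = heat / electricity
print(f"SCOP ~ {scop:.2f}")
```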
Abstract:
This thesis covers two main topics related to geometry preparation for MCNP models. The first is the geometric errors that are generated when converting from CAD to CSG format, and their relationship with the phenomenon of lost particles. Software-based conversion to CSG is in fact unavoidable when building complex models such as those used to represent ITER components, and it can produce regions of the geometry that are not correctly defined. Such regions cause particles to be lost during the Monte Carlo simulation, undermining the statistical integrity of the transport solution. It is therefore very important to reduce this type of error as much as possible, and to this end the work carried out consisted in finding standardized methods to identify such errors and, finally, to estimate their size. While the first part of the thesis focuses on the problems arising from CSG modeling, the second suggests an alternative to it: the use of Unstructured Meshes (UM), an approach that underpins CFD and FEM but is novel in the context of Monte Carlo codes. In particular, UM were applied to a portion of the Upper Launcher (an ITER component) in order to validate the methodology on nuclear models of high complexity. The traditional CSG approach and the UM approach were compared in terms of required computational resources, speed, precision, and accuracy, for both global and local results. It emerges that, although some limits to the application of UM remain, owing in part to its novelty, several advantages can be attributed to this approach, including a more streamlined workflow, greater accuracy in local results and, above all, the future possibility of using the same mesh for different types of analysis (such as thermal or structural).
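The link between undefined CSG regions and lost particles can be illustrated with a toy stochastic estimate: sampling random points and counting those falling into an undefined zone gives the relative size of the defect together with its statistical error. `point_is_defined` is a hypothetical stand-in for a real geometry lookup.

```python
import random

# Toy illustration of how CSG conversion defects show up as lost particles:
# the lost fraction estimates the relative volume of the undefined zone.

def point_is_defined(p):
    x, y, z = p
    # Assume a thin undefined gap left by the CAD-to-CSG conversion
    return not (0.49 < x < 0.50)

random.seed(0)
N = 100_000
lost = sum(1 for _ in range(N)
           if not point_is_defined((random.random(), random.random(), random.random())))

frac = lost / N
err = (frac * (1 - frac) / N) ** 0.5       # binomial standard error
print(f"lost fraction = {frac:.4%} +/- {err:.4%}")
```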
Abstract:
Hepatocellular carcinoma (HCC) is the most common primary liver tumor, accounting for up to 85% of cases. It is one of the most frequent cancers worldwide and is known for its high lethality, especially at an advanced stage. Early diagnosis through ultrasound surveillance is necessary to improve the survival of at-risk patients. Ultrasound contrast agents improve the diagnostic sensitivity and specificity of conventional ultrasound. Contrast-enhanced ultrasound (CEUS) is therefore considered a valid technique for the diagnosis of HCC worldwide, thanks to its excellent specificity despite a suboptimal sensitivity. The contrast-enhancement patterns of focal liver lesions led a team of experts to develop the Liver Imaging Reporting and Data System (LI-RADS), with the aim of standardizing data collection and the reporting of imaging techniques for the diagnosis of HCC. CEUS is an operator-dependent technique, and diagnostic discordance with panoramic imaging leaves room for new techniques (Dynamic Contrast-Enhanced UltraSound, DCE-US) aimed at improving its diagnostic accuracy, and in particular its sensitivity. Software for quantifying tissue perfusion could help in clinical practice to detect wash-out that is not visible even to the eye of the most experienced operator. Our study has two aims: 1) to validate the CEUS LI-RADS system for the diagnosis of hepatocellular carcinoma in patients at high risk of HCC, using histology as the gold standard when available, or otherwise radiological imaging accepted by all guidelines (computed tomography or magnetic resonance imaging with typical features) performed within four weeks of the CEUS examination; 2) to evaluate the effectiveness of tissue-perfusion quantification software in detecting wash-out for the diagnosis of HCC on CEUS.
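The quantitative wash-out check that such perfusion software performs can be sketched as a comparison of time-intensity curves: flag wash-out when, after the arterial peak, the lesion signal drops below the surrounding parenchyma. The curves below are synthetic stand-ins for intensities extracted from regions of interest.

```python
import numpy as np

# Sketch of a DCE-US wash-out check: compare the lesion's time-intensity
# curve with the surrounding parenchyma after the arterial peak.

t = np.linspace(0, 180, 181)                      # seconds after injection
parenchyma = 1.0 * np.exp(-((t - 60) / 70) ** 2)  # reference enhancement
lesion = 1.3 * np.exp(-((t - 25) / 40) ** 2)      # hyper-enhancing, fast decay

peak = np.argmax(lesion)
washout = t[peak:][lesion[peak:] < parenchyma[peak:]]
if washout.size:
    # Late, mild wash-out after arterial hyperenhancement is the CEUS
    # LI-RADS pattern suggestive of HCC.
    print(f"wash-out detected from ~{washout[0]:.0f} s")
```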
Abstract:
The weight-transfer effect, consisting of the change in dynamic load distribution between the front and rear tractor axles, is one of the phenomena that most impairs the performance, comfort, and safety of agricultural operations. Excessive weight transfer from the front to the rear tractor axle can occur during the operation or maneuvering of implements connected to the tractor through the three-point hitch (TPH). In this respect, an optimal design of the TPH can ensure a better dynamic load distribution and ultimately improve operational performance, comfort, and safety. In this study, a computational design tool (the Optimizer) for determining a TPH geometry that minimizes the weight-transfer effect is developed. The Optimizer is based on a constrained minimization algorithm. The objective function to be minimized is related to the tractor front-to-rear axle load transfer during a simulated reference maneuver performed with a reference implement on a reference soil. Simulations are based on a 3-degrees-of-freedom (DOF) dynamic model of the tractor-TPH-implement aggregate. The inertial, elastic, and viscous parameters of the dynamic model were successfully determined through a parameter identification algorithm. The geometry determined by the Optimizer complies with the ISO-730 Standard functional requirements and other design requirements. The interaction between the soil and the implement during the simulated reference maneuver was successfully validated against experimental data. Simulation results show that the adopted reference maneuver is effective in triggering the weight-transfer effect, with the front axle load exhibiting a peak-to-peak value of 27.1 kN during the maneuver. A benchmark test was conducted starting from four geometries of a commercially available TPH. As a result, all configurations were improved by more than 10%; after 36 iterations, the Optimizer found an optimized TPH geometry that reduces the weight-transfer effect by 14.9%.
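The structure of such an optimizer can be sketched with a standard constrained minimizer: search over a few geometry parameters, with the objective returning the load-transfer measure produced by the maneuver simulation (here replaced by a cheap surrogate) and bounds standing in for the ISO-730 functional requirements. All numbers are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Conceptual sketch of the Optimizer's structure: in the study, the
# objective evaluates a 3-DOF dynamic simulation of the reference maneuver;
# here a cheap quadratic surrogate stands in for that simulation.

def load_transfer(x):
    """Surrogate for the simulated peak-to-peak front-axle load [kN]."""
    lower_link, upper_link, mast = x
    return 27.1 * (1.0 + 0.5 * (lower_link - 0.9) ** 2
                       + 0.3 * (upper_link - 0.6) ** 2
                       + 0.2 * (mast - 0.5) ** 2)

x0 = np.array([1.0, 0.7, 0.4])                 # initial geometry [m] (invented)
bounds = [(0.8, 1.2), (0.5, 0.9), (0.3, 0.7)]  # stand-in for ISO-730 ranges
res = minimize(load_transfer, x0, bounds=bounds, method="L-BFGS-B")
print(res.x, f"{load_transfer(x0) - res.fun:.2f} kN reduction")
```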
Abstract:
Current data indicate that the size of high-density lipoprotein (HDL) may be considered an important marker of cardiovascular disease risk. We established reference values for mean HDL size and volume in an asymptomatic, representative Brazilian population sample (n = 590), and their associations with metabolic parameters by gender. Size and volume were determined in HDL isolated from plasma by polyethylene glycol precipitation of apoB-containing lipoproteins and measured using the dynamic light scattering (DLS) technique. Although the gender and age distributions agreed with other studies, the mean HDL size reference value was slightly lower than in some other populations. Both HDL size and volume were influenced by gender and varied according to age. HDL size was associated with age and HDL-C (total population); inversely with non-white ethnicity and CETP (females); and with HDL-C and PLTP mass (males). HDL volume, on the other hand, was determined only by HDL-C (total population and both genders) and by PLTP mass (males). Reference values for mean HDL size and volume using the DLS technique were established in an asymptomatic, representative Brazilian population sample, together with their related metabolic factors. HDL-C was a major determinant of HDL size and volume, which were differently modulated in females and males.
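DLS sizing rests on the Stokes-Einstein relation, d_H = k_B T / (3 π η D): the measured diffusion coefficient D yields the hydrodynamic diameter. A worked example with illustrative values for an HDL-sized particle in aqueous buffer:

```python
import math

# Stokes-Einstein sizing as used in DLS: hydrodynamic diameter from the
# measured translational diffusion coefficient.

K_B = 1.380649e-23      # Boltzmann constant [J/K]
T = 298.15              # temperature [K] (25 degC, assumed)
ETA = 0.89e-3           # viscosity of water at 25 degC [Pa s]

def hydrodynamic_diameter(D):
    """Hydrodynamic diameter [m] from diffusion coefficient [m^2/s]."""
    return K_B * T / (3 * math.pi * ETA * D)

D_hdl = 5.0e-11         # illustrative diffusion coefficient [m^2/s]
print(f"d_H ~ {hydrodynamic_diameter(D_hdl) * 1e9:.1f} nm")   # ~10 nm, HDL-sized
```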
Abstract:
This article aimed to compare the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. The incisor, canine, premolar, first molar, and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with the i-CAT Next Generation scanner. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D®, and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated to assess reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two evaluation periods; a one-way analysis of variance with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the largest with XoranCat (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.
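The statistical comparison can be reproduced in outline: a one-way ANOVA across the software packages and the gold standard, followed by Dunnett's post-hoc test against the gold-standard control (scipy.stats.dunnett requires SciPy >= 1.11). The data below are synthetic, seeded with the mean offsets reported above.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for the linear measurements: each software package is
# simulated as the gold standard plus its reported mean offset plus noise.

rng = np.random.default_rng(1)
gold = rng.normal(20.0, 2.0, 40)                 # gold-standard lengths [mm]
ondemand = gold + rng.normal(-0.11, 0.3, 40)
kdis = gold + rng.normal(-0.14, 0.3, 40)
xoran = gold + rng.normal(+0.25, 0.3, 40)

f, p = stats.f_oneway(gold, ondemand, kdis, xoran)
print(f"ANOVA p = {p:.3f}")

dunnett = stats.dunnett(ondemand, kdis, xoran, control=gold)
print(dunnett.pvalue)    # one p-value per software package vs the gold standard
```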
Abstract:
The objective of this study is to investigate the dynamics between fiscal policy, measured by public debt, and monetary policy, measured by a central bank reaction function. Changes in monetary policy due to deviations from its targets always generate fiscal impacts. We examine two policy reaction functions: the first related to inflation targets and the second related to economic growth targets. We find that the condition for a stable equilibrium is more restrictive in the first case than in the second. We then apply our simulation model to Brazil and the United Kingdom and find that the equilibrium is unstable in the Brazilian case but stable in the UK case.
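The kind of stability analysis described can be illustrated by linearizing the joint debt/inflation dynamics around the steady state and testing whether the transition matrix has spectral radius below one. The matrix entries below are invented placeholders, not the article's estimated reaction functions.

```python
import numpy as np

# Illustrative stability check for linearized debt/inflation dynamics:
# the equilibrium is stable when the spectral radius of the transition
# matrix is below one. All coefficients are invented for illustration.

def transition_matrix(phi, r_star, growth):
    # debt_{t+1} = (1 + r_star - growth) * debt_t + phi * gap_t
    # gap_{t+1}  = 0.8 * gap_t - 0.1 * debt_t     (assumed reaction function)
    return np.array([[1 + r_star - growth, phi],
                     [-0.1, 0.8]])

rules = {"inflation-target rule": (0.5, 0.06, 0.02),
         "growth-target rule":    (0.2, 0.03, 0.02)}

for name, params in rules.items():
    A = transition_matrix(*params)
    rho = max(abs(np.linalg.eigvals(A)))
    print(f"{name}: spectral radius = {rho:.3f} -> "
          f"{'stable' if rho < 1 else 'unstable'} equilibrium")
```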
Abstract:
Cancer is a multistep process that begins with the transformation of normal epithelial cells and continues with tumor growth, stromal invasion, and metastasis. The remodeling of the peritumoral environment is decisive for the onset of tumor invasiveness. This event depends on epithelial-stromal interactions, degradation of extracellular matrix components, and reorganization of fibrillar components. Our research group has studied, in a newly proposed rodent model, the participation of cellular and molecular components of the prostate microenvironment that contribute to cancer progression. Our group adopted the gerbil Meriones unguiculatus as an alternative experimental model for prostate cancer studies. This model has shown significant responses to hormonal treatments and to the development of spontaneous and induced neoplasias. The data obtained indicate reorganization of type I collagen fibers and reticular fibers, synthesis of new components such as tenascin and proteoglycans, degradation of basement membrane components and elastic fibers, and increased expression of metalloproteinases. Fibroblasts bordering the region apparently participate in the stromal reaction. The roles of each of these events, as well as some signaling molecules, participants in neoplastic progression, and factors that promote genetic reprogramming during the epithelial-stromal transition, are also discussed.