910 results for Software of dynamic geometry
Abstract:
Distributed Software Development (DSD) is a development strategy that meets globalization needs concerned with increasing productivity and reducing costs. However, temporal distance, geographical dispersion, and socio-cultural differences have introduced new challenges and, especially, added new requirements related to the communication, coordination, and control of projects. Among these new demands is the need for a software process that provides adequate support to distributed software development. This paper presents an integrated development and test approach that considers the peculiarities of distributed teams. The purpose of the approach is to support DSD by providing better project visibility, improving communication between the development and test teams, and minimizing ambiguity and the difficulty of understanding artifacts and activities. The integrated approach was conceived on four pillars: (i) identifying the DSD peculiarities concerning development and test processes; (ii) defining the elements needed to compose an integrated development and test approach that supports distributed teams; (iii) describing and specifying the workflows, artifacts, and roles of the approach; and (iv) representing the approach appropriately so that it can be communicated and understood effectively.
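As a loose illustration only (all class and instance names below are hypothetical, not taken from the paper), the approach's building blocks of workflows, artifacts, and roles could be modeled as simple data structures:

```python
# Hypothetical sketch: the approach's core elements (workflows, artifacts,
# roles) modeled as plain data structures. Names are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Artifact:
    name: str
    description: str

@dataclass
class Role:
    name: str
    site: str  # the distributed site the role belongs to

@dataclass
class Activity:
    name: str
    responsible: Role
    inputs: List[Artifact] = field(default_factory=list)
    outputs: List[Artifact] = field(default_factory=list)

@dataclass
class Workflow:
    name: str
    activities: List[Activity] = field(default_factory=list)

# Example: a development-site activity handing a build artifact to a test site.
dev = Role("Developer", site="Site A")
build = Artifact("Release build", "Versioned binary delivered to the test team")
wf = Workflow("Integrated dev/test", [Activity("Deliver build", dev, outputs=[build])])
print(wf)
```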
Abstract:
We undertook a geometric morphometric analysis of wing venation to assess this character's ability to distinguish populations of Anopheles darlingi Root and to test the hypothesis that populations from coastal areas of the Brazilian Atlantic Forest differ from those of the interior Atlantic Forest, the Cerrado, and the regions south and north of the Amazon River. Results suggest that populations from the coastal and interior Atlantic Forest are more similar to each other than to any of the other regional populations. Notably, the Cerrado population was more similar to that from north of the Amazon River than to that collected south of the river, thus showing no correlation with geographical distance. We hypothesize that environmental and ecological factors may affect wing evolution in An. darlingi. Although it is premature to associate environmental and ecological determinants with wing features and the evolution of the species, investigations in this field are promising. (C) 2012 Elsevier B.V. All rights reserved.
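Geometric morphometric studies of wing venation typically work on landmark coordinates that are compared after size and orientation are removed. As a minimal, generic sketch (not the authors' actual pipeline; landmark values are toy data), two basic operations of that kind look like this in Python:

```python
# Illustrative sketch of two basic geometric morphometric operations often used
# for wing-venation landmarks: centroid size and ordinary Procrustes alignment
# of one landmark configuration onto another.
import numpy as np

def centroid_size(landmarks: np.ndarray) -> float:
    """Square root of the summed squared distances of landmarks to their centroid."""
    centered = landmarks - landmarks.mean(axis=0)
    return float(np.sqrt((centered ** 2).sum()))

def procrustes_align(reference: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Translate, scale, and rotate `target` to best match `reference`."""
    ref = reference - reference.mean(axis=0)
    tgt = target - target.mean(axis=0)
    ref /= np.linalg.norm(ref)
    tgt /= np.linalg.norm(tgt)
    # Optimal rotation via singular value decomposition.
    u, _, vt = np.linalg.svd(tgt.T @ ref)
    return tgt @ (u @ vt)

# Toy example with 4 two-dimensional wing landmarks per specimen.
wing_a = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.5], [1.5, 1.0]])
wing_b = wing_a * 1.2 + 0.3  # scaled and translated copy of the same shape
print(centroid_size(wing_a), centroid_size(wing_b))
print(np.allclose(procrustes_align(wing_a, wing_b), procrustes_align(wing_a, wing_a)))
```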
Abstract:
The influence of test-method factors (notch shape, square or angular, and pre-cracking method, by tapping on or pressing a razor blade) on the results of the plane-strain fracture toughness test according to standard ASTM D5045, using SENB specimens made of a commercial PMMA resin, was investigated. Results were analyzed quantitatively, by comparing the obtained K-IC values, and qualitatively, by observing their effect on the Moire fringes observed using photoelasticity. At the 95% significance level, the K-IC values are affected by the pre-cracking method, with the most conservative value obtained when natural pre-cracks were introduced by tapping on a razor blade (K-IC = 1.15 +/- 0.11 MPa·m^0.5). This correlates with a perturbation in the stress field close to the pre-crack tip, observed in the photoelasticity test sample, when the pre-crack was introduced by pressing the razor blade. Surprisingly, notch geometry only slightly affects the results. (C) 2012 Elsevier Ltd. All rights reserved.
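ASTM D5045 computes the conditional fracture toughness K_Q of an SENB specimen from the peak load, the specimen thickness B, width W, crack length a, and a geometry factor f(a/W). The sketch below applies that standard formula; the specimen dimensions and load are illustrative values chosen so the result lands near the K-IC reported above, and are not data from the paper:

```python
# Hedged sketch of the SENB plane-strain fracture toughness calculation of
# ASTM D5045 (span-to-width ratio S/W = 4). Inputs are illustrative values only.
import math

def f_geometry(x: float) -> float:
    """Geometry factor f(a/W) for the single-edge-notch bend specimen."""
    return (6.0 * math.sqrt(x)
            * (1.99 - x * (1.0 - x) * (2.15 - 3.93 * x + 2.7 * x ** 2))
            / ((1.0 + 2.0 * x) * (1.0 - x) ** 1.5))

def k_q(peak_load_n: float, thickness_m: float, width_m: float, crack_m: float) -> float:
    """Conditional fracture toughness K_Q in Pa*m^0.5."""
    x = crack_m / width_m
    return peak_load_n / (thickness_m * math.sqrt(width_m)) * f_geometry(x)

# Illustrative PMMA-like SENB specimen: B = 6 mm, W = 12 mm, a = 6 mm, P = 70 N.
print(f"K_Q = {k_q(70.0, 0.006, 0.012, 0.006) / 1e6:.2f} MPa*m^0.5")  # ~1.13
```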
Abstract:
PURPOSE: To apply the educational software Fuzzy Kitten with undergraduate Brazilian nursing students. METHODS: This software, based on fuzzy logic, generates performance scores that evaluate the ability to identify the defining characteristics/risk factors present in clinical cases, relate them to nursing diagnoses, and determine the diagnoses either freely or using a decision support model. FINDINGS: Student performance differed according to the year of the course. The time taken to perform the activity was not significantly related to performance. The students' scores on the diagnoses indicated by the model were higher (p = .01). CONCLUSIONS: The software was able to evaluate the diagnostic accuracy of students. IMPLICATIONS: The software enables an objective evaluation of diagnostic accuracy.
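The scoring mechanism of Fuzzy Kitten is not detailed in the abstract; purely as an illustration of a fuzzy-style score (all cue names, relevance degrees, and the penalty rule are assumptions), a performance score over the defining characteristics selected by a student might be computed like this:

```python
# Illustrative sketch (not the actual Fuzzy Kitten algorithm): each defining
# characteristic in a clinical case has a relevance degree in [0, 1], and the
# score aggregates the degrees of the characteristics the student selected,
# penalizing selections that are not present in the case.
def fuzzy_score(case_relevance: dict[str, float], selected: set[str]) -> float:
    """Return a score in [0, 1] from relevance degrees and the student's picks."""
    if not case_relevance:
        return 0.0
    hits = sum(case_relevance.get(c, 0.0) for c in selected)
    total = sum(case_relevance.values())
    false_picks = sum(1 for c in selected if c not in case_relevance)
    penalty = false_picks / (len(selected) or 1)
    return max(0.0, hits / total - penalty)

# Toy clinical case: two strongly relevant cues, one weakly relevant cue.
case = {"dyspnea": 1.0, "orthopnea": 0.8, "fatigue": 0.4}
print(fuzzy_score(case, {"dyspnea", "orthopnea"}))          # good selection
print(fuzzy_score(case, {"dyspnea", "unrelated finding"}))  # penalized selection
```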
Abstract:
The Seed Vigor Imaging System (SVIS®) software has been successfully used to evaluate seed physiological potential through automated analysis of scanned seedlings. In this research, the efficiency of this system was compared to that of other accepted tests for assessing cucumber (Cucumis sativus L.) seed vigor in distinct seed lots of the Supremo and Safira cultivars. Seeds were subjected to germination, traditional and saturated-salt accelerated aging, seedling emergence, seedling length, and SVIS analyses (determination of vigor indices, seedling growth uniformity, and lengths of the primary root, hypocotyl, and whole seedling). It was also determined whether the definition of the seedling growth/uniformity ratio affects the sensitivity of the SVIS®. Results showed that SVIS analyses provided consistent identification of seed lot performance and produced information comparable to that from recommended seed vigor tests, thus demonstrating suitable sensitivity for a rapid and objective evaluation of the physiological potential of cucumber seeds. Analyses of four-day-old cucumber seedlings using the SVIS® are more accurate, and the growth/uniformity ratio does not affect the precision of the results.
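SVIS-style vigor indices combine a seedling growth component with a growth uniformity component, and the growth/uniformity ratio mentioned above weights the two. The sketch below is only a schematic illustration of that combination (the 70/30 default weight, the index formulas, and the seedling lengths are assumptions, not the SVIS® algorithm):

```python
# Minimal sketch of how an SVIS-style vigor index can combine seedling growth
# and growth uniformity. Weights and formulas are illustrative assumptions.
import statistics

def growth_index(lengths_mm: list[float], max_expected_mm: float) -> float:
    """Average seedling length scaled to 0-1000 against an expected maximum."""
    return 1000.0 * statistics.mean(lengths_mm) / max_expected_mm

def uniformity_index(lengths_mm: list[float]) -> float:
    """Higher when seedlings within the lot have similar lengths (0-1000)."""
    mean = statistics.mean(lengths_mm)
    spread = statistics.pstdev(lengths_mm) / mean if mean else 1.0
    return 1000.0 * max(0.0, 1.0 - spread)

def vigor_index(lengths_mm: list[float], max_expected_mm: float,
                growth_weight: float = 0.7) -> float:
    g = growth_index(lengths_mm, max_expected_mm)
    u = uniformity_index(lengths_mm)
    return growth_weight * g + (1.0 - growth_weight) * u

lot_a = [92.0, 88.0, 95.0, 90.0]   # vigorous, uniform lot (lengths in mm)
lot_b = [60.0, 30.0, 85.0, 40.0]   # weaker, uneven lot
print(vigor_index(lot_a, 100.0), vigor_index(lot_b, 100.0))
```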
Abstract:
Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data are available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools, and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
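As a rough illustration of the connector idea described above (all class names, ontology terms, and the transformation rule are hypothetical, not taken from the paper), a software connector can wrap access to a data source and apply transformation rules so that exchanged records are expressed against a shared reference ontology:

```python
# Hypothetical sketch of a software connector: it applies transformation rules
# to exchanged records and keeps only fields known to the reference ontology.
import math
from typing import Any, Callable, Dict

TransformationRule = Callable[[Dict[str, Any]], Dict[str, Any]]

class Connector:
    def __init__(self, ontology_terms: set[str]):
        self.ontology_terms = ontology_terms
        self.rules: list[TransformationRule] = []

    def add_rule(self, rule: TransformationRule) -> None:
        self.rules.append(rule)

    def exchange(self, record: Dict[str, Any]) -> Dict[str, Any]:
        """Apply all transformation rules, then keep only ontology-known keys."""
        for rule in self.rules:
            record = rule(record)
        return {k: v for k, v in record.items() if k in self.ontology_terms}

# Rule: rename a tool-specific field to the reference ontology term and
# convert its raw intensity value to the log2 scale expected downstream.
def map_expression_field(rec: Dict[str, Any]) -> Dict[str, Any]:
    out = dict(rec)
    out["expression_level"] = math.log2(out.pop("raw_intensity"))
    return out

connector = Connector({"gene_id", "expression_level"})
connector.add_rule(map_expression_field)
print(connector.exchange({"gene_id": "BRCA1", "raw_intensity": 1024.0, "scanner": "X"}))
```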
Abstract:
Photodynamic therapy (PDT) is a treatment modality that has advanced rapidly in recent years. It causes tissue and vascular damage through the interaction of a photosensitizing agent (PS), light of a proper wavelength, and molecular oxygen. Evaluation of vessel damage usually relies on histopathology, and results are often qualitative or at best semi-quantitative, based on a subjective scoring system. The aim of this study was to evaluate, using CD31 immunohistochemistry and image analysis software, the vascular damage after PDT in a well-established rodent model of chemically induced mammary tumor. Fourteen Sprague-Dawley rats received a single dose of 7,12-dimethylbenz(a)anthracene (80 mg/kg by gavage). Treatment efficacy was evaluated by comparing the vascular density of tumors after treatment with Photogem® as a PS, given intraperitoneally, followed by interstitial fiber-optic illumination from a diode laser at 200 mW/cm and a light dose of 100 J/cm directed at the tumor (7 animals), with a control group (6 animals, no PDT). The animals were euthanized 30 hours after illumination, the mammary tumors were removed, and samples from each lesion were formalin-fixed. Immunostained blood vessels were quantified with Image Pro-Plus version 7.0. The control group had an average of 3368.6 ± 4027.1 pixels per image and the treated group had an average of 779 ± 1242.6 pixels per image (P < 0.01), indicating that PDT caused a significant decrease in the vascular density of mammary tumors. The combination of CD31 immunohistochemistry, selection of representative areas by a trained pathologist, and quantification of staining using the Image Pro-Plus version 7.0 system was a practical and robust methodology for vessel damage evaluation, which could probably be used to assess other antiangiogenic treatments.
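As a generic illustration of the kind of quantification performed (this is an assumed thresholding rule on synthetic data, not the Image Pro-Plus workflow used in the study), counting immunostained pixels in an RGB field can be sketched as follows:

```python
# Illustrative sketch of counting immunostained pixels in an RGB image:
# brown DAB-like staining is approximated by a simple color threshold and
# the stained-pixel count per image is reported.
import numpy as np

def stained_pixel_count(rgb: np.ndarray) -> int:
    """Count pixels whose red channel clearly dominates blue (brown-ish stain)."""
    r = rgb[..., 0].astype(float)
    b = rgb[..., 2].astype(float)
    mask = (r > 120) & (r - b > 40)
    return int(mask.sum())

# Toy 100x100 field: a 20x20 'vessel' patch of brown-ish pixels on a pale background.
image = np.empty((100, 100, 3), dtype=np.uint8)
image[:] = [230, 225, 235]          # pale background
image[40:60, 40:60] = [150, 90, 60]  # stained patch
print(stained_pixel_count(image))    # -> 400 pixels for this synthetic field
```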
Abstract:
The challenge of achieving a more efficient electrical grid requires the massive introduction of renewable energy into the grid, thereby reducing CO2 emissions. To this end, it is proposed not only to control production, as has been done until now, but also to control demand. Accordingly, this research evaluates the use of Model-Driven Engineering to manage the complexity of modeling electrical grids, Business Intelligence to analyze the large amount of simulation data, and Collective Intelligence to optimize the distribution of energy among the millions of devices located on the demand side.
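As a loose sketch of the demand-side optimization idea (not the algorithm developed in the thesis; the update rule, step size, and device capacities are assumptions), a decentralized, collective-intelligence-style scheme can let each flexible device nudge its own consumption toward its share of the available renewable supply:

```python
# Hedged sketch: each flexible device repeatedly adjusts its own consumption in
# proportion to its capacity, reacting to a shared supply/demand imbalance
# signal, so the aggregate demand converges toward the renewable supply.
def allocate_demand(flexible_max: list[float], supply: float,
                    iterations: int = 50, step: float = 0.5) -> list[float]:
    demand = [m / 2 for m in flexible_max]      # each device starts at half load
    total_capacity = sum(flexible_max)
    for _ in range(iterations):
        imbalance = supply - sum(demand)        # signal broadcast to all devices
        for i, cap in enumerate(flexible_max):
            # Local reaction, proportional to the device's own flexibility.
            demand[i] = min(cap, max(0.0, demand[i] + step * imbalance * cap / total_capacity))
    return demand

devices = [2.0, 1.0, 3.0]   # kW of flexibility per device (illustrative)
result = allocate_demand(devices, supply=4.2)
print(result, sum(result))  # converges toward the 4.2 kW of available supply
```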
Abstract:
ALICE, an experiment at CERN using the LHC, is specialized in analyzing lead-ion collisions. ALICE will study the properties of quark-gluon plasma, a state of matter where quarks and gluons, under conditions of very high temperature and density, are no longer confined inside hadrons. Such a state of matter probably existed just after the Big Bang, before particles such as protons and neutrons were formed. The SDD detector, one of the ALICE subdetectors, is part of the ITS, which is composed of six cylindrical layers with the innermost one attached to the beam pipe. The ITS tracks and identifies particles near the interaction point, and it also aligns the tracks of the particles detected by the outer detectors. The two middle ITS layers contain all 260 SDD detectors. A multichannel readout board, called CARLOSrx, simultaneously receives the data coming from 12 SDD detectors. In total, 24 CARLOSrx boards are needed to read the data coming from all the SDD modules (detector plus front-end electronics). CARLOSrx packs the data coming from the front-end electronics through optical link connections, stores them in a large data FIFO, and then sends them to the DAQ system. Each CARLOSrx is composed of two boards: CARLOSrx data, which reads the data coming from the SDD detectors and configures the FEE, and CARLOSrx clock, which sends the clock signal to all the FEE. This thesis contains a description of the hardware design and firmware features of both the CARLOSrx data and CARLOSrx clock boards, which handle the whole SDD readout chain. A description of the software tools necessary to test and configure the front-end electronics is presented at the end of the thesis.
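As a purely schematic software model of the data flow described above (the framing format, field sizes, and event contents are invented for illustration and are not the real CARLOSrx firmware), packing events from 12 front-end links into a FIFO and draining them toward the DAQ can be sketched as follows:

```python
# Schematic model of the described data flow: events from 12 SDD front-end
# links are packed with a channel header, buffered in a FIFO, and drained
# toward the (simulated) DAQ. Framing is invented for illustration.
from collections import deque

NUM_LINKS = 12

def pack(channel: int, payload: bytes) -> bytes:
    """Prefix the payload with a 1-byte channel id and a 2-byte length field."""
    return bytes([channel]) + len(payload).to_bytes(2, "big") + payload

fifo: deque[bytes] = deque()

# Producer side: push one small event per link into the FIFO.
for link in range(NUM_LINKS):
    fifo.append(pack(link, bytes([link] * 4)))

# Consumer side: drain the FIFO toward the DAQ.
while fifo:
    frame = fifo.popleft()
    channel, length = frame[0], int.from_bytes(frame[1:3], "big")
    print(f"DAQ received {length} bytes from link {channel}")
```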
Abstract:
The concept of "sustainability" refers to the development of human systems with the smallest possible impact on the environmental system. Works that fit well into the surrounding environmental context, and practices that respect resources in such a way as to allow long-term growth and development without impacting the environment, are indispensable in a modern society. The past, present, and future advances that have made asphalt mixtures environmentally sustainable materials are particularly important, given the large quantity of asphalt used annually in Europe and the United States. Bitumen and asphalt mixture producers are developing innovative techniques to reduce the environmental impact without compromising the final mechanical performance. A warm mix asphalt (WMA) develops the same mechanical characteristics while requiring a lower production temperature than a traditional hot mix asphalt (HMA). Lowering the production temperature reduces harmful emissions; this improves working conditions and is oriented toward sustainable development. The main objective of this thesis is to demonstrate the twofold value, both in terms of eco-compatibility and in terms of mechanical performance, of these warm mix asphalts. In particular, this thesis studied a warm-mix SMA (PGGWMA). The use of low-environmental-impact materials is the first step toward an eco-compatible design, but it can only be the starting point. The eco-compatible approach must also be extended to design methods and to the laboratory characterization of materials, because only in this way is it possible to extract the maximum potential from the materials used. An appropriate characterization of the asphalt mixture is fundamental and necessary for a realistic prediction of road pavement performance. The volumetric (mix design) and mechanical (permanent deformation and fatigue behavior) characterization of an asphalt mixture is therefore an important phase. Moreover, in order to use materials correctly, an advanced and efficient design method, such as a Mechanistic-Empirical (ME) approach, must be used. A Mechanistic-Empirical design procedure consists of a structural model capable of predicting the stress and strain states within the pavement under traffic loading and as a function of weather conditions, and of empirical models, calibrated on material behavior, that link the structural response to the pavement performance. In 1996, in order to effectively exploit the benefits of continuing advances in the field of road pavements, an extensive research project was launched in California aimed at developing Mechanistic-Empirical design methods for road pavements. The final result was the first version of the CalME software, which offers the user three different analysis and design approaches: an Empirical approach, a classical Mechanistic-Empirical approach, and an Incremental-Recursive Mechanistic-Empirical approach.
This thesis focuses on the Incremental-Recursive procedure of the CalME software, which is based on damage models for fatigue and for the accumulation of shear deformation, on which surface cracking and permanent deformation in the pavement respectively depend. This procedure works in successive time increments and, using the results of each time increment recursively as input for the next one, predicts the condition of a road pavement in terms of the complex modulus of the different layers, fatigue-related surface cracking, permanent deformation, and surface roughness. The Incremental-Recursive procedure of CalME was used to verify the mechanical properties of the PGGWMA and the mutual relations, in terms of fatigue damage and permanent deformation, between the surface layer and the pavement structure under fixed environmental and traffic conditions. The asphalt mixture studied (PGGWMA) was used in a road pavement as a 60 mm thick surface layer. The pavement performance was compared with that of the same pavement in which other types of asphalt mixture were used as the surface layer. The three asphalt mixtures used as terms of comparison were: an unmodified dense-graded warm mix asphalt (DGWMA), a crumb-rubber-modified open-graded asphalt mixture (GGRAC), and an unmodified dense-graded hot mix asphalt (DGAC). Chapter I introduces the problem of the eco-compatible design of road pavements. Low-environmental-impact materials such as warm mix asphalts and crumb-rubber-modified asphalt mixtures are described in detail. The importance of the laboratory characterization of materials and the value of a rational pavement design method are also discussed. Chapter II describes the different design approaches available in CalME, and in particular explains the Incremental-Recursive procedure. Chapter III studies the volumetric and mechanical properties of the PGGWMA. Fatigue and permanent deformation tests, performed respectively with the four-point bending fatigue machine and the Simple Shear Test device, were carried out on asphalt mixture specimens, and the test results are summarized. From these laboratory data, the parameters of the Master Curve, fatigue damage, and shear deformation accumulation models used in the Incremental-Recursive procedure of CalME were estimated. Finally, Chapter IV presents the results of simulations of road pavements with different surface layers. For each pavement, the total surface cracking, the total permanent deformation, the fatigue damage, and the deformation depth in each of the bound layers were analyzed.
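As a minimal sketch of the incremental-recursive idea described above (the damage law, coefficients, and traffic values below are placeholders, not the actual CalME models), the state at the end of one time increment is fed back as the input of the next, while accumulated fatigue damage reduces the asphalt layer modulus:

```python
# Minimal sketch of an incremental-recursive simulation loop: the pavement
# state at the end of one time increment becomes the input of the next one,
# while fatigue damage accumulates and the layer modulus is reduced.
# All numbers and relations are illustrative placeholders.
def simulate(increments: int, traffic_per_increment: float,
             initial_modulus_mpa: float = 6000.0) -> list[dict]:
    state = {"modulus": initial_modulus_mpa, "damage": 0.0}
    history = []
    for _ in range(increments):
        # Response of this increment grows as the layer softens (placeholder law).
        strain = traffic_per_increment / state["modulus"]
        # Damage increment computed from this increment's response.
        state["damage"] = min(1.0, state["damage"] + 0.5 * strain)
        # Recursive step: the damaged modulus becomes the next increment's input.
        state["modulus"] = initial_modulus_mpa * (1.0 - 0.5 * state["damage"])
        history.append(dict(state))
    return history

for step in simulate(increments=5, traffic_per_increment=1000.0)[-2:]:
    print(f"modulus = {step['modulus']:.0f} MPa, damage = {step['damage']:.3f}")
```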
Abstract:
In this investigation I look at patents and software agents as a way to study the broader relation between law and science (the latter term broadly understood as encompassing science and technology). The overall premise framing the entire discussion, my basic thesis, is that this relation between law and science cannot be understood without taking into account a number of intervening factors, the identification of which makes it necessary to approach the question from the standpoint of fields and disciplines other than law and science themselves.
Abstract:
In recent years, locating people and objects and communicating with them in real time has become a common part of everyday life. Nowadays, the state of the art of location systems for indoor environments has no dominant technology, unlike location systems for outdoor environments, where GPS is dominant. In fact, each indoor location technology presents a set of features that does not allow its use across all application scenarios, but, owing to its characteristics, it can coexist well with other similar technologies without being dominant or more widely adopted than the other indoor location systems. In this context, the European project SELECT studies the opportunity of collecting all these different features in an innovative system that can be used in a large number of application scenarios. The goal of this project is to realize a wireless system in which a network of fixed readers is able to query one or more tags attached to the objects to be located. The SELECT consortium is composed of European institutions and companies, including Datalogic S.p.A. and CNIT, which deal with the software and firmware development of the baseband receiving section of the readers, whose function is to acquire and process the information received from generic tagged objects. Since the SELECT project has highly innovative content, one of the key stages of the system design is the debug phase. This work aims to study and develop tools and techniques for debugging the firmware of the baseband receiving section of the readers.
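As a hypothetical illustration of one common firmware-debug technique for a receive chain (the processing step, function names, and test vector below are assumptions, not part of the SELECT tools), identical test vectors can be fed to the firmware under test and to a reference software model, and the outputs compared sample by sample:

```python
# Hypothetical debug sketch: compare the output of the firmware under test
# (stubbed here) against a golden software model for the same input vector
# and report the first mismatching sample.
def reference_model(samples: list[int]) -> list[int]:
    """Golden model: a simple moving average over pairs of samples."""
    return [(a + b) // 2 for a, b in zip(samples, samples[1:])]

def firmware_under_test(samples: list[int]) -> list[int]:
    """Stub standing in for data read back from the real receiving section."""
    out = [(a + b) // 2 for a, b in zip(samples, samples[1:])]
    out[3] += 1  # injected fault so the comparison below finds something
    return out

vector = [10, 12, 9, 14, 11, 13, 8, 15]
expected = reference_model(vector)
observed = firmware_under_test(vector)
mismatches = [(i, e, o) for i, (e, o) in enumerate(zip(expected, observed)) if e != o]
print("first mismatch (index, expected, observed):", mismatches[0] if mismatches else "none")
```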