989 results for Free Software Movement
Abstract:
VariScan is a software package for the analysis of DNA sequence polymorphisms at the whole-genome scale. Among other features, the software: (1) can conduct many population genetic analyses; (2) incorporates a multiresolution wavelet transform-based method for capturing relevant information from DNA polymorphism data; and (3) facilitates visualization of the results in the most commonly used genome browsers.
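The multiresolution idea can be illustrated with a toy Haar-style decomposition. This is not VariScan's actual algorithm, only a minimal sketch of how a windowed polymorphism statistic (here hypothetical nucleotide-diversity values) can be viewed at progressively coarser scales:

```python
import numpy as np

def haar_decompose(signal, levels):
    """Repeatedly average adjacent windows (Haar approximation),
    keeping the smoothed signal at each coarser scale."""
    approximations = []
    current = np.asarray(signal, dtype=float)
    for _ in range(levels):
        if len(current) % 2:            # pad odd-length signals
            current = np.append(current, current[-1])
        pairs = current.reshape(-1, 2)
        current = pairs.mean(axis=1)    # coarser-scale approximation
        approximations.append(current)
    return approximations

# Hypothetical nucleotide diversity (pi) in consecutive windows along a region
pi = [0.01, 0.012, 0.011, 0.009, 0.002, 0.001, 0.002, 0.003]
scales = haar_decompose(pi, levels=2)
print([len(a) for a in scales])  # -> [4, 2]
```

At the coarsest scale the low-diversity right half of the region stands out clearly from the left half, which is the kind of scale-dependent pattern a wavelet view is meant to expose.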
Abstract:
This thesis investigates the effect of software architecture design attributes on the design and implementation time of a mobile service application based on a client-server architecture. The study builds on a real-life project whose qualitative analysis revealed that couplings between architectural components significantly affect the project workload. The main goal of the thesis was to quantitatively examine the validity of this observation. To reach this goal, a set of software architecture design metrics was devised to describe the architecture of the system's subsystems, and two models estimating workload (the sum of a component's design, implementation, and testing times) were created using these metrics, one linear and one nonlinear. The coefficients of the models were fitted by optimizing their values with a nonlinear global optimization method, the differential evolution algorithm, so that the model outputs best matched the measured workload, using both the full attribute set and subsets of it (leaving one attribute out in turn). When the couplings between architectural components were left out of the models, the difference between the measured and estimated workloads (expressed as an error) grew in one case by 367%, meaning that the resulting model matched the implementation times poorly on the given data. This was the largest error observed among all the left-out attributes. Based on this result it was concluded that the implementation times of the system depend strongly on the number of couplings, and that the number of couplings was therefore most likely the single most important factor affecting workload in the architecture design of the studied system.
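The fitting step described above, tuning the coefficients of a linear effort model with differential evolution, can be sketched as follows. The attribute values, effort figures, bounds, and model form are all hypothetical; the thesis's actual metrics and data are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: per-component architecture attributes
# (e.g. couplings, interfaces, operations) and measured effort in hours.
X = np.array([[3., 2., 4.], [8., 1., 2.], [5., 5., 5.], [1., 1., 1.], [9., 4., 3.]])
effort = np.array([16., 21., 30., 6., 33.])

def sse(w):
    """Sum of squared errors of the linear effort model X @ w."""
    return float(np.sum((X @ w - effort) ** 2))

def differential_evolution(f, bounds, pop_size=30, gens=200, F=0.7, CR=0.9):
    """Minimal textbook DE/rand/1/bin loop: mutate, crossover, greedy select."""
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    costs = np.array([f(p) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)   # differential mutation
            cross = rng.random(dim) < CR                # binomial crossover mask
            trial = np.where(cross, mutant, pop[i])
            ft = f(trial)
            if ft < costs[i]:                           # greedy selection
                pop[i], costs[i] = trial, ft
    best = int(np.argmin(costs))
    return pop[best], costs[best]

weights, err = differential_evolution(sse, bounds=[(0, 10)] * 3)
```

Leave-one-out experiments like those in the thesis would simply drop a column of `X`, refit, and compare the resulting `err` against the full-attribute baseline.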
Abstract:
Free induction decay (FID) navigators have been found to qualitatively detect rigid-body head movements, yet it is unknown to what extent they can provide quantitative motion estimates. Here, we acquired FID navigators at different sampling rates and simultaneously measured head movements using a highly accurate optical motion tracking system. This strategy allowed us to estimate the accuracy and precision of FID navigators for quantifying rigid-body head movements. Five subjects were scanned with a 32-channel head coil array on a clinical 3T MR scanner during several resting and guided head movement periods. For each subject we trained a linear regression model based on FID navigator and optical motion tracking signals. FID-based motion model accuracy and precision were evaluated using cross-validation. FID-based prediction of rigid-body head motion achieved mean translational and rotational errors of 0.14±0.21 mm and 0.08±0.13°, respectively. Robust model training with sub-millimeter and sub-degree accuracy could be achieved using 100 data points with motion magnitudes of ±2 mm and ±1° for translation and rotation. The obtained linear models appeared to be subject-specific, as inter-subject application of a "universal" FID-based motion model resulted in poor prediction accuracy. The results show that substantial rigid-body motion information is encoded in FID navigator signal time courses. Although the applied method currently requires the simultaneous acquisition of FID signals and optical tracking data, the findings suggest that multi-channel FID navigators have the potential to complement existing tracking technologies for accurate rigid-body motion detection and correction in MRI.
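The per-subject modelling step reduces to ordinary least squares mapping multi-channel navigator signals to tracker-measured motion parameters. The sketch below uses synthetic stand-in data (random 32-channel signals, an assumed linear ground truth, and arbitrary noise levels), not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: 32-channel FID navigator magnitudes and one
# translation parameter (mm) as measured by an optical tracker.
n_samples, n_channels = 200, 32
fid = rng.normal(size=(n_samples, n_channels))
true_coef = rng.normal(size=n_channels)                 # assumed linear encoding
translation = fid @ true_coef + rng.normal(scale=0.05, size=n_samples)

# Train on the first 100 points (the training size the study reports
# as sufficient), then evaluate on held-out points.
train, test = slice(0, 100), slice(100, None)
A = np.hstack([fid, np.ones((n_samples, 1))])           # add an intercept column
coef, *_ = np.linalg.lstsq(A[train], translation[train], rcond=None)

pred = A[test] @ coef
rmse = np.sqrt(np.mean((pred - translation[test]) ** 2))
```

In practice one such model is fitted per motion parameter (three translations, three rotations) and per subject, which is consistent with the abstract's finding that a "universal" cross-subject model performs poorly.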
Abstract:
Though free radicals are among the scientific subjects most frequently explored in the mass media, the topic is absent from many introductory biochemistry courses, especially those in which the students do not have a strong chemistry background. To overcome this contradictory situation we have developed a software application treating this topic in a very simple way. The software is divided into four sections: (1) definition and description of free radicals, (2) production pathways, (3) mechanisms of action, and (4) enzymatic and non-enzymatic protection. The instructional capacity of the software has been evaluated both qualitatively and quantitatively through its application in undergraduate courses. The software is available on the Internet at http://www.unicamp.br/ib/bioquimica/ensino.
Abstract:
In the context of the evidence-based practices movement, the emphasis on computing effect sizes and combining them via meta-analysis does not preclude the demonstration of functional relations. For the latter aim, we propose augmenting visual analysis to add consistency to decisions about the existence of a functional relation, without losing sight of the need for a methodological evaluation of which stimuli and reinforcement or punishment are used to control the behavior. Four options for quantification are reviewed, illustrated, and tested with simulated data. These quantifications involve comparing the projected baseline with the actual treatment measurements, on the basis of either parametric or nonparametric statistics. The simulated data used to test the quantifications include nine data patterns, varying in the presence and type of effect, and comprising ABAB and multiple-baseline designs. Although none of the techniques is completely flawless in detecting a functional relation only when it is present and not when it is absent, an option based on projecting a split-middle trend and treating data variability as in exploratory data analysis proves to be the best performer for most data patterns. We suggest that information on whether a functional relation has been demonstrated should be included in meta-analyses. It is also possible to weight studies by the inverse of the data variability measure used in the quantification assessing the functional relation. We offer easy-to-use code for open-source software implementing some of the quantifications.
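The split-middle projection the abstract highlights is a standard technique: fit a trend line through the (median time, median value) points of the two baseline halves, extend it into the treatment phase, and compare treatment data against the projection. A minimal sketch with invented AB-design data (not the authors' simulated patterns or published code):

```python
import numpy as np

def split_middle_projection(baseline, n_treatment):
    """Project the baseline trend into the treatment phase using the
    split-middle method: a line through the (median x, median y)
    points of each half of the baseline."""
    y = np.asarray(baseline, dtype=float)
    x = np.arange(len(y), dtype=float)
    half = len(y) // 2
    x1, y1 = np.median(x[:half]), np.median(y[:half])
    x2, y2 = np.median(x[len(y) - half:]), np.median(y[len(y) - half:])
    slope = (y2 - y1) / (x2 - x1)
    intercept = y1 - slope * x1
    x_treat = np.arange(len(y), len(y) + n_treatment)
    return intercept + slope * x_treat

baseline = [4, 5, 4, 6, 5, 6]        # hypothetical baseline observations
treatment = [9, 10, 11, 12]          # hypothetical treatment observations
projected = split_middle_projection(baseline, len(treatment))
exceed = int(np.sum(np.array(treatment) > projected))  # points above projection
```

Here all four treatment points exceed the projected baseline trend, the sort of evidence the visual-analysis augmentation would then weigh against data variability.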
Abstract:
The Department of Electronics and Telecommunications at the University of Vic has designed a set of trainer boards for educational purposes. For students to be able to use these boards as a study tool, an inexpensive and convenient programming system is needed. Most programmers, in this case, do not meet these requirements. The goal of this project is to design a programming system that uses serial communication and does not require specific hardware or software. In this way we obtain a stand-alone board and a free programmer that is quick to set up and simple to use. The programming system designed is divided into three blocks. First, a program we call the "programmer" is responsible for transferring program code from the computer to the microcontroller on the trainer board. Second, a program called the "bootloader", located on the microcontroller, receives this program code and stores it at the corresponding program memory addresses. As a third block, a communication protocol and an error-control system are implemented to ensure correct communication between the "programmer" and the "bootloader". The objectives of this project have been met and, according to the tests performed, the programming system works correctly.
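The error-control block typically amounts to framing each chunk of program code with its length, target address, and a checksum that the bootloader verifies before writing to flash. The abstract does not specify the project's actual frame format, so the sketch below assumes a simple Intel HEX-style record (two's-complement checksum); names and layout are illustrative only:

```python
def make_frame(address, payload):
    """Build one programmer->bootloader record:
    [length][addr_hi][addr_lo][payload...][checksum], where the
    checksum is the two's complement of the sum of the other bytes
    (the same scheme Intel HEX records use). Assumed format."""
    body = [len(payload), (address >> 8) & 0xFF, address & 0xFF, *payload]
    checksum = (-sum(body)) & 0xFF
    return bytes(body + [checksum])

def verify_frame(frame):
    """A frame is intact when all its bytes, checksum included,
    sum to zero modulo 256."""
    return sum(frame) & 0xFF == 0

# Three hypothetical code bytes destined for program memory at 0x0800
frame = make_frame(0x0800, [0x12, 0x34, 0x56])
```

On a checksum failure the bootloader would request retransmission over the serial link rather than programming corrupted bytes.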
Abstract:
The objective of this work was to develop a free-access exploratory data analysis software application for academic use that is easy to install and can be operated without user-level programming, given the extensive use of chemometrics and its association with applications that require purchased licenses or routines. The developed software, called Chemostat, employs Hierarchical Cluster Analysis (HCA), Principal Component Analysis (PCA), and interval Principal Component Analysis (iPCA), as well as correction methods, data transformations, and outlier detection. Data can be imported from the clipboard, text files, ASCII files, or FT-IR Perkin-Elmer ".sp" files. It generates a variety of charts and tables for analyzing the results, which can be exported in several formats. The main features of the software were tested using mid-infrared and near-infrared spectra of vegetable oils and digital images obtained from different types of commercial diesel. To validate the software, the same data sets were analyzed using Matlab©, and the results of the two applications matched in various combinations. In addition to the desktop version, the reuse of algorithms allowed an online version to be provided that offers a unique experience on the web. Both applications are available in English.
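The PCA at the core of such chemometrics tools is conventionally computed as an SVD of the mean-centered data matrix: scores map the samples, loadings interpret the variables. A minimal sketch with hypothetical spectra (not Chemostat's implementation or data):

```python
import numpy as np

def pca(data, n_components=2):
    """PCA via SVD of the mean-centered data matrix."""
    X = np.asarray(data, dtype=float)
    Xc = X - X.mean(axis=0)                      # column-wise mean centering
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]   # sample coordinates
    loadings = Vt[:n_components].T                    # variable contributions
    explained = s**2 / np.sum(s**2)                   # variance fraction per PC
    return scores, loadings, explained[:n_components]

# Hypothetical spectra: 6 samples x 5 wavelengths, two underlying groups
spectra = np.array([
    [1.0, 2.0, 3.0, 2.0, 1.0],
    [1.1, 2.1, 3.1, 2.1, 1.1],
    [0.9, 1.9, 2.9, 1.9, 0.9],
    [3.0, 2.0, 1.0, 2.0, 3.0],
    [3.1, 2.1, 1.1, 2.1, 3.1],
    [2.9, 1.9, 0.9, 1.9, 2.9],
])
scores, loadings, explained = pca(spectra)
```

The first component separates the two sample groups and captures nearly all the variance, which is the kind of structure a scores plot in Chemostat or Matlab would reveal at a glance; iPCA simply repeats this on contiguous wavelength intervals.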
Abstract:
Today's software industry faces increasingly complicated challenges in a world where software is nearly ubiquitous in our daily lives. Consumers want products that are reliable, innovative, and rich in functionality, yet also affordable. The challenge for us in the IT industry is to create more complex, innovative solutions at a lower cost. This is one reason why process improvement as a research area has not diminished in importance. IT professionals ask themselves: "How do we keep our promises to our customers while minimizing our risk and increasing our quality and productivity?" Within the field of process improvement there are different approaches. Traditional software process improvement methods such as CMMI and SPICE focus on the quality and risk aspects of the improvement process. More lightweight approaches, such as agile methods and Lean methods, focus on keeping promises and improving productivity by minimizing waste in the development process. The research presented in this thesis was carried out with one specific goal in mind: to improve the cost-effectiveness of working methods without compromising quality. That challenge was attacked from three angles. First, working methods are improved by introducing agile methods. Second, quality is maintained by using product-level measurement. Third, knowledge dissemination within large companies is improved through methods that put collaboration at the centre. The agile movement emerged during the 1990s as a reaction to the unrealistic demands that the previously dominant waterfall method placed on the IT industry. Software development is a creative process and differs from other industries in that most of the daily work consists of creating something new that has not existed before.
Every software developer must be an expert in their field and spends a large part of the working day devising solutions to problems they have never solved before. Although this has been a well-known fact for decades, many software projects are still managed as if they were factory production lines. One of the goals of the agile movement is to highlight precisely this discrepancy between the innermost nature of software development and the way software projects are managed. Agile methods have proven to work well in the contexts they were created for, i.e., small, co-located teams working in close collaboration with a committed customer. In other contexts, and especially in large, geographically distributed companies, introducing agile methods is more challenging. We have approached this challenge by introducing agile methods through pilot projects. This has two clear advantages. First, knowledge about the methods and their interaction with the context in question can be gathered incrementally, which makes it easier to develop and adapt the methods to the specific demands of that context. Second, resistance to change can be overcome more easily by introducing cultural changes gently and by giving the target group direct first-hand contact with the new methods. Relevant product metrics can help software development teams improve their working methods. For teams working with agile and Lean methods, a good set of metrics can be decisive when prioritizing the backlog. Our focus has been on supporting agile and Lean teams with internal product metrics for decision support concerning refactoring, i.e., continuous quality improvement of a program's code and design.
Deciding to refactor can be difficult, especially for agile and Lean teams, since they are expected to justify their priorities in terms of business value. We propose a way to measure the design quality of systems developed with the model-driven paradigm, and we construct a way to integrate this metric into agile and Lean working methods. An important part of any process improvement initiative is spreading knowledge about the new software process. This applies regardless of the kind of process being introduced, whether plan-driven or agile. We suggest that collaborative methods for creating and evolving the process are a good way to support knowledge dissemination, and we give an overview of the process authoring tools on the market with that suggestion in mind.
Abstract:
Open source and open source software development have been interesting phenomena during the past decade. Traditional business models do not apply to open source, where the actual product is free. It is nevertheless possible to do business with open source, even successfully, but the question is: how? The aim of this study is to find the key factors for successfully building a business on commercial open source software development. This is achieved by identifying the factors that influence open source projects, finding the relations between those factors, and determining why some factors explain success better than others. The literature review first covers the background of open innovation, open source, and open source software; it then examines business models, critical success factors, and success measures. Based on the existing literature, a framework was created containing categorized success factors that influence software projects in general as well as open source software projects. The main categories of success factors in software business are community management, technology management, project management, and market management. To find out which of the factors from the literature are the most critical, empirical research was conducted through unstructured personal interviews. The main finding from the interviews is that the critical success factors in open source software business do not differ from those in traditional software business, or indeed from those in any other business. Some factors in the framework emerged from the interviews as key factors: establishing and communicating hierarchy (community management), localization (technology management), good license know-how and IPR management (project management), and effective market management (market management).
The interviewees also named critical success factors that do not appear in the framework: low price, a good product, and good business model development.
Abstract:
The interaction mean free path between neutrons and TRISO particles is simulated using scripts written in MATLAB, in order to address the growing error that appears with increasing packing factor in the reactor physics code Serpent. Neutron movement is tracked both in unbounded and in bounded space. Depending on the program, the track is computed linearly and directly from the position vectors of the neutrons and the surface equations of all the fuel particles; by dividing the space into multiple subspaces, each containing a fraction of the total number of particles, and selecting the particles from those subspaces the neutron passes through; or by selecting the particles that lie within an infinite cylinder formed around the neutron's axis of movement. The mean free path estimate from the current analytical model used by Serpent, based on an exponential distribution, serves as the reference result. The results from the implicit model in Serpent imply too long a mean free path at high packing factors. The results obtained support this observation: at a packing factor of 17%, the simulated mean free path is approximately 2.46% shorter than that of the reference model. This is further supported by the packing factor experienced by the neutron, whose simulation yielded a value of 17.29%. It was also observed that neutrons leaving the surfaces of the fuel particles, in contrast to those starting inside the moderator, do not follow the exponential distribution. The current model, as it stands, is therefore not valid for determining the free path lengths of the neutrons.
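The reference model treats the distance to the next collision as exponentially distributed, which is straightforward to sample by inverse-CDF transformation. The sketch below uses an illustrative mean free path, not the thesis's TRISO geometry, and shows the quantity against which a geometry-resolved simulation's 2.46% deviation would be measured:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative numbers only: an assumed mean free path in cm.
mean_free_path = 0.12
n_neutrons = 200_000

# Inverse-CDF sampling of the exponential model: s = -lambda * ln(U),
# with U ~ Uniform(0, 1). This is the analytical reference behaviour.
u = rng.random(n_neutrons)
paths = -mean_free_path * np.log(u)

sample_mean = paths.mean()
# Relative deviation of the sampled mean from the model's parameter;
# a geometry-resolved simulation reporting a 2.46% shorter mean free
# path corresponds to a deviation of about -0.0246 on this scale.
deviation = (sample_mean - mean_free_path) / mean_free_path
```

The thesis's surface-starting neutrons break exactly this assumption: their first-flight distances are governed by the local particle geometry, so they no longer follow the single-parameter exponential sampled here.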
Abstract:
Objective: Overuse injuries in violinists are a problem that has primarily been analyzed through questionnaires. Simultaneous 3D motion analysis and EMG measurement of muscle activity has been suggested as a quantitative technique for exploring this problem, by identifying movement patterns and muscular demands that may predispose violinists to overuse injuries. This multi-disciplinary analysis technique has so far seen limited use in the music world. The purpose of this study was to use it to characterize the demands of a violin bowing task. Subjects: Twelve injury-free violinists volunteered for the study. The subjects were assigned to a novice or expert group based on playing experience, as determined by questionnaire. Design and Settings: Muscle activity and movement patterns were assessed while violinists played five bowing cycles (one bowing cycle = one down-bow + one up-bow) on each string (G, D, A, E), at a pulse of 4 beats per bow and 100 beats per minute. Measurements: An upper extremity model created using coordinate data from markers placed on the right acromion process, lateral epicondyle of the humerus, and ulnar styloid was used to determine minimum and maximum joint angles, ranges of motion (ROM), and angular velocities at the shoulder and elbow of the bowing arm. Muscle activity in the right anterior deltoid, biceps brachii, and triceps brachii was assessed during maximal voluntary contractions (MVC) and during the playing task. Data were analysed for significant differences across the strings and between experience groups. Results: Elbow flexion/extension ROM was similar across strings for both groups. Shoulder flexion/extension ROM increased across the strings and was larger for the experts. Angular velocity changes mirrored changes in ROM. Deltoid was the most active of the muscles assessed (20% MVC) and displayed a pattern of constant activation to maintain shoulder abduction. Biceps and triceps were less active (4-12% MVC) and showed a more periodic 'on and off' pattern.
Novices' muscle activity was higher in all cases. Experts' muscle activity showed a consistent pattern across strings, whereas the novices' was more irregular. The agonist-antagonist roles of biceps and triceps during the bowing motion were clearly defined in the expert group, but not as apparent in the novice group. Conclusions: Bowing movement appears to be controlled by the shoulder rather than the elbow, as shoulder ROM changed across strings while elbow ROM remained the same. Shoulder injuries are probably due to repetition, as the muscle activity required for the movement is small. Experts require less muscle activity to perform the movement, possibly owing to more efficient muscle activation patterns developed through practice. This quantitative multidisciplinary approach to analysing violinists' movements can contribute to a fuller understanding of both playing demands and injury mechanisms.
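Joint angles in a three-marker upper-extremity model like the one described are computed from the vectors between markers. The sketch below assumes hypothetical marker coordinates for a single frame; ROM then follows by taking max minus min over all frames of a bowing cycle:

```python
import numpy as np

def joint_angle(proximal, joint, distal):
    """Angle (degrees) at `joint` between the two limb segments, from
    3-D marker coordinates (e.g. acromion, lateral epicondyle, ulnar
    styloid for the elbow of the bowing arm)."""
    u = np.asarray(proximal, dtype=float) - np.asarray(joint, dtype=float)
    v = np.asarray(distal, dtype=float) - np.asarray(joint, dtype=float)
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

# Hypothetical marker positions (metres) at one frame of a down-bow
acromion = [0.0, 0.0, 1.4]
epicondyle = [0.30, 0.0, 1.4]
styloid = [0.30, 0.25, 1.4]
elbow_angle = joint_angle(acromion, epicondyle, styloid)
# ROM over a cycle = max(angle) - min(angle) across all sampled frames;
# angular velocity is the frame-to-frame derivative of the angle.
```

The same function with the shoulder as the vertex marker yields the shoulder angle whose ROM, per the results above, varied across strings while elbow ROM did not.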
Abstract:
The main objective of this doctoral thesis in composition is the conceptualization and creation of a work for electric guitar and electronics situated at a stylistic meeting point between free jazz, glitch electronic music, and drone metal. This text primarily aims to explain the creative process behind this work, entitled feedback. Following the introductory chapter, which describes the origin and objectives of this composition project, the first chapter is devoted to a reflection on musical improvisation. Starting from a general description of this practice and its origins, five manifestations of improvisation that influenced the work are described. The second chapter presents the hyperguitar developed for the piece, along with a description of its hardware and software components. The third chapter describes the creative context, the role of improvisation in the creative process, and the technical details of the four works composed in the lead-up to feedback. The fourth chapter is dedicated to the piece feedback, inspired by the film and novel Fight Club and stylistically steeped in rock culture, free jazz, and glitch electroacoustic music. The conclusion presents the new objectives that this research has generated.
Abstract:
Stereotactic body radiation therapy (SBRT) is a technique commonly used to treat lung tumours when surgery is not possible or is refused by the patient. One complication of this method arises from tumour movement caused by breathing. In this context, respiratory-gated radiotherapy (RGRT) can be beneficial. However, RGRT increases treatment time because the beam is active for a smaller proportion of the time. This problem can be offset by using a flattening-filter-free (FFF) photon beam, which has a higher dose rate. This thesis examines the feasibility of combining RGRT with an FFF beam on a Synergy S accelerator (Elekta, Stockholm, Sweden), using a pneumatic belt, the Bellows Belt (Philips, Amsterdam, Netherlands), as the respiratory-signal tracking device. A Synergy S was modified to deliver a 6 MV FFF beam. Dose profiles and depth-dose curves were measured in a water tank for different field sizes. These measurements were used to create a model of the 6 MV FFF beam in Philips' Pinnacle3 treatment planning system, and were compared with the model using gamma analysis with a 2%, 2 mm criterion. Subsequently, five SBRT plans with volumetric modulated arc therapy (VMAT) were created with the Synergy S 6 MV model, with and without the flattening filter. Dosimetric parameters were compared between the filtered and unfiltered plans to assess the quality of the FFF plans. The results show that it is possible to create clinically acceptable SBRT VMAT plans with the 6 MV FFF beam of the Synergy S (the Radiation Therapy Oncology Group 0618 criteria are met).
In addition, a physical RGRT interface was developed to fulfil two functions: reading the digital signal from the Bellows Belt pneumatic belt and sending a binary irradiation command to the linac. The linac beam is switched on and off through an electromechanical relay. The interface comprises a homemade printed circuit board working in tandem with a Raspberry Pi. RGRT software was developed to run on the Raspberry Pi. It displays the digital signal from the Bellows Belt and lets the user choose the upper and lower limits of the gating window, such that the beam is active when the belt signal lies between these limits and inactive when it is outside them. The software thus sends an irradiation command to the linac automatically as a function of the amplitude of the respiratory signal. Finally, comparing the delivery of a standard non-gated treatment with the filter against a standard non-gated plan without the filter shows that treatment time in FFF mode is reduced on average by 54.1% per arc. Similarly, comparing a standard non-gated treatment with the filter against a gated plan (75% gating window) without the filter shows that gated treatment time in FFF mode is reduced on average by 27.3% per arc. However, it was not possible to deliver gated treatments with a window below 75%, as the linac does not tolerate a high rate of beam interruptions.
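The gating decision itself is a simple threshold test on each belt sample. The sketch below shows that logic with assumed window limits and invented signal values; the real system performs this on a Raspberry Pi and drives the linac through an electromechanical relay rather than returning booleans:

```python
def beam_on(signal, lower, upper):
    """Return True when the respiratory signal lies inside the gating
    window, i.e. when the beam should be active."""
    return lower <= signal <= upper

# Hypothetical digitized Bellows Belt samples over part of a breathing cycle
samples = [0.10, 0.35, 0.60, 0.82, 0.95, 0.80, 0.55, 0.30]
gate = [beam_on(s, lower=0.30, upper=0.70) for s in samples]

# Fraction of samples with the beam on: the duty cycle that makes gated
# delivery slower, and that the FFF beam's higher dose rate compensates.
duty_cycle = sum(gate) / len(gate)
```

Narrowing the window lowers the duty cycle and raises the number of beam on/off transitions per breath, which is exactly the interruption rate that prevented windows below 75% on this linac.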
Abstract:
Software systems are progressively being deployed in many facets of human life, and the failure of such systems has a varied impact on their customers. The fundamental aspect supporting a software system is a focus on quality. Reliability describes the ability of a system to function in a specified environment for a specified period of time, and is used to measure quality objectively. Evaluating the reliability of a computing system involves computing both hardware and software reliability. Most earlier works focused on software reliability with no consideration of hardware parts, or vice versa. However, a complete estimate of the reliability of a computing system requires these two elements to be considered together, and thus demands a combined approach. The present work focuses on this and presents a model for evaluating the reliability of a computing system. The method involves identifying failure data for hardware and software components and building a model on that basis to predict reliability. To develop such a model, focus is given to systems based on Open Source Software, since there is an increasing trend towards its use and only a few studies have been reported on modelling and measuring the reliability of such products. The present work includes a thorough study of the role of Free and Open Source Software, an evaluation of reliability growth models, and an attempt to present an integrated model for predicting the reliability of a computational system. The developed model has been compared with existing models and its usefulness is discussed.
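One classic reliability growth model of the kind such evaluations cover is Goel-Okumoto, where the expected cumulative failure count is m(t) = a(1 - e^(-bt)). The sketch below fits it to hypothetical weekly failure counts with a crude grid search, a stand-in for the maximum-likelihood fitting used in practice, and is not the integrated model the thesis develops:

```python
import math

def goel_okumoto(t, a, b):
    """Expected cumulative failures at time t under the Goel-Okumoto
    NHPP model: m(t) = a * (1 - exp(-b * t)), with a the total expected
    fault count and b the per-fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

# Hypothetical cumulative failure counts at weekly test intervals
times = [1, 2, 3, 4, 5, 6, 7, 8]
observed = [18, 31, 41, 48, 53, 57, 60, 62]

# Grid search for (a, b) minimising squared error over assumed ranges.
best = min(
    ((a, b) for a in range(50, 101) for b in [i / 100 for i in range(5, 60)]),
    key=lambda p: sum((goel_okumoto(t, *p) - o) ** 2
                      for t, o in zip(times, observed)),
)
a_hat, b_hat = best
residual_faults = a_hat - goel_okumoto(times[-1], a_hat, b_hat)
```

A combined hardware-software model would fit one such curve (or a hardware failure-rate model) per component class and compose the results, which is the gap the thesis addresses.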
Abstract:
The projects and products commonly called Free and Open Source Software related to geomatics are undergoing rapid evolution and updating. The "traditional" projects (map services, spatial databases, desktop clients) are being joined by a broad set of components such as publishing services, lightweight clients, geoprocessing services, mobility, frameworks, and new standards such as GeoRSS, tiled WMS, and WPS. This article takes a brief pause to analyse the current landscape of the free software world, categorizing the projects and products that exist today in order to assess each of them, analysing their current situation, their trajectory, their future evolution, and the interrelations within the free GIS software ecosystem. The article analyses the situation and the available catalogue of projects/products covering spatial data servers, OGC servers, lightweight map publishing and clients, desktop applications, SDI clients, development libraries, client- and server-side catalogue tools, etc. It presents the ecosystem of projects, organizations, and people collaborating on the main products, the interrelations among them, and their known future plans. The expected result is to give the reader a big picture that allows them to position their needs judiciously within the current landscape of GIS solutions based on free software.