999 results for 080204 Mathematical Software
Abstract:
The continuously growing number of mobile phone users and the development of the Internet into a general source of information and entertainment have created a need for a service connecting a mobile workstation to computer networks. GPRS is a new technology that offers faster, more efficient, and more economical access to packet data networks, such as the Internet and intranets, than existing mobile networks (e.g. NMT and GSM). The goal of this work was to implement, for a workstation environment, the communication drivers needed in testing the GPRS packet control unit (Packet Control Unit, PCU). Real mobile networks are too expensive, and do not provide sufficient log output, to be used in GPRS testing during the early stages of software development. For this reason, PCU software testing is carried out in a more flexible and more easily controlled environment that does not impose hard real-time requirements. The new operating environment and connection media required a new implementation of the communication drivers, the parts of the software that handle the connections between the PCU and the other units of the GPRS network. The result of this work was the workstation versions of the necessary communication drivers. The work examines different data transfer methods and protocols from the viewpoints of the requirements of the software under test, the implemented driver, and testing. It presents the interface implemented by each driver and the degree of implementation, i.e., which functions were implemented and which were left out. The structure and operation of the drivers are explained to the extent that is relevant to the operation of the software.
Abstract:
In this paper we show how a nonlinear preprocessing of highly noisy speech signals, based on morphological filters, improves the performance of the robust algorithm for pitch tracking (RAPT). This result is obtained with a very simple morphological filter; more sophisticated ones could improve the results further. Mathematical morphology is widely used in image processing and has a wide range of applications. Almost all of its formulations derived in the two-dimensional framework are easily reformulated for the one-dimensional context.
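The paper applies a simple 1-D morphological filter to the speech signal before pitch tracking. As a minimal sketch of this kind of preprocessing (using SciPy's grey-scale morphology; the opening-then-closing combination and the structuring-element length are illustrative assumptions, not the authors' exact filter):

```python
# A minimal sketch (not the authors' implementation) of 1-D morphological
# smoothing applied to a noisy speech-like signal before pitch tracking.
import numpy as np
from scipy.ndimage import grey_closing, grey_opening

def morphological_smooth(signal: np.ndarray, size: int = 9) -> np.ndarray:
    """Grey-scale opening followed by closing: removes narrow positive
    spikes, then narrow negative pits, leaving the envelope intact."""
    return grey_closing(grey_opening(signal, size=size), size=size)

# Toy usage: a 100 Hz tone at 8 kHz sampling, buried in strong noise.
fs = 8000
t = np.arange(fs) / fs
noisy = np.sin(2 * np.pi * 100 * t) + 0.8 * np.random.randn(fs)
smoothed = morphological_smooth(noisy)
```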
Abstract:
The goal of this thesis is to examine which method of entering a foreign market would best suit the case company. All common international market entry methods are presented, along with their advantages and disadvantages. After assessing the commissioning company's resources, expectations, and requirements, it is concluded that a cooperative market entry is the most viable option. The company best suited to the purpose is then selected from a pre-chosen group of candidate companies, and its cooperation compatibility with the case company is tested. The compatibility between the companies is assessed by analyzing them through interviews and the theories presented in the thesis. The compatibility is found to be good, covering 71 percent of the analyzed items. Twenty-nine percent of the items are found to be ones where the mutual understanding between the companies does not meet the commissioning company's minimum requirements. These items will be used as a basis for planning further negotiations to launch the cooperation.
Abstract:
For more than a decade, researchers have been aware of the increased pace of small-firm internationalization and the greater effect of these rapidly growing small businesses on the wealth, international trade, and job-creation opportunities of countries. Due to the small size of the home market, Finnish companies have generally been considered highly interested in internationalization. One particular domain in which rapid internationalization has been considered feasible is the global software business, with its knowledge-intensive nature and high growth potential. However, over time the failure rate of small entrepreneurial firms has remained especially high in high-technology markets. One of the reasons for this seems to lie in the fact that these companies are often formed by people with a strong technological background but limited competences in other areas. Further, research on the marketing capabilities of rapidly internationalizing high-tech firms has been scarce thus far. In addition, while there is much research on the first years of operations of rapidly internationalizing companies, it is not well known what becomes of them later on. Therefore, there is a need for more investigation into the managerial mindset, competences and decision-making in these small companies, especially from the perspective of how they acquire and exploit market knowledge, and enhance their networking capabilities in order to promote international expansion. The present study focuses on market orientation in small software firms that internationalize their operations rapidly in global software markets. It builds on qualitative data to illustrate how these companies develop their market-oriented product-market strategies during the process of increasing international commitment. It also shows how they manage their network relationships in order to be able to offer better customer service and to thrive in the fierce global competition. The study was conducted in the empirical context of Finnish small software companies, and the main data consists of interviews with top managers in these businesses. The interviews were designed to cover a minimum period of five years of the company's international operations, thus offering a retrospective in-depth perspective on market orientation, internationalization and partnerships in the given context. One particular focus is on less successfully internationalized software companies and the challenges they face when approaching international markets. This study makes a significant contribution to the literature on market orientation for several reasons. First, building on data from the software industry, it clarifies the existing theory in the context of rapid internationalization and network relationships. Secondly, it provides a good body of evidence on market orientation in both successfully and less successfully internationalized companies, and identifies the key related differences between the two company groups. Thirdly, it highlights the importance of inter-firm networks in the rapid internationalization of small software firms, providing companies with important market knowledge and, in some cases, management challenges. Fourthly, this investigation clarifies market orientation in the context of different software-product strategies, thus combining the perspectives of market orientation in both manufacturing and services.
In sum, the results of the study are significant for both small software firms and public-policy makers since they shed light on the market-oriented managerial mindset and the market-information gathering and sharing processes that are needed in successful rapid internationalization.
Abstract:
BACKGROUND: Lung clearance index (LCI), a marker of ventilation inhomogeneity, is elevated early in children with cystic fibrosis (CF). However, in infants with CF, LCI values are found to be normal, although structural lung abnormalities are often detectable. We hypothesized that this discrepancy is due to inadequate algorithms in the available software package. AIM: Our aim was to challenge the validity of these software algorithms. METHODS: We compared multiple breath washout (MBW) results from the current software algorithms (automatic modus) with those from refined algorithms (manual modus) in 17 asymptomatic infants with CF and 24 matched healthy term-born infants. The main difference between the two analysis methods lies in the calculation of the molar mass differences that the system uses to define the completion of the measurement. RESULTS: In infants with CF, the refined manual modus revealed clearly elevated LCI above 9 in 8 out of 35 measurements (23%), all of which showed LCI values below 8.3 using the automatic modus (paired t-test comparing the means, P < 0.001). Healthy infants showed normal LCI values with both analysis methods (n = 47, paired t-test, P = 0.79). The most relevant reason for falsely normal LCI values in infants with CF using the automatic modus was that the end of test was incorrectly recognized too early during the washout. CONCLUSION: We recommend the use of the manual modus for the analysis of MBW outcomes in infants in order to obtain more accurate results. This will allow appropriate use of infant lung function results for clinical and scientific purposes. Pediatr Pulmonol. 2015; 50:970-977. © 2015 Wiley Periodicals, Inc.
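The comparison above rests on paired t-tests between the two analysis modes applied to the same measurements. A minimal sketch of such a paired comparison, using hypothetical LCI values rather than the study's data:

```python
# Minimal sketch of the paired comparison reported above: LCI values obtained
# with the automatic and the refined manual analysis of the same measurements.
# The values below are hypothetical, not the study data.
import numpy as np
from scipy.stats import ttest_rel

lci_automatic = np.array([7.1, 7.8, 8.0, 7.5, 8.2])  # hypothetical
lci_manual = np.array([7.3, 9.4, 9.1, 7.6, 9.8])     # hypothetical

t_stat, p_value = ttest_rel(lci_manual, lci_automatic)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```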
Abstract:
This thesis studies the evaluation of software development practices through error analysis. The work presents the software development process, software testing, software errors, error classification, and software process improvement methods. The practical part of the work presents results from the error analysis of one software process and gives improvement ideas for the project. It was noticed that the classification of the error data in the project was inadequate, which made it impossible to use the error data effectively. With the error analysis we were able to show that there were deficiencies in the design and analysis phases, the implementation phase, and the testing phase. The work gives ideas for improving both the error classification and the software development practices.
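As an illustration only (the fields and categories are assumptions, not the project's actual classification scheme), a structured error record along these lines would make the classified data usable for the kind of phase-level analysis described:

```python
# Illustrative only: a structured error record of the kind that would make
# classified error data analysable per phase. Fields and categories are
# assumptions, not the project's actual classification scheme.
from dataclasses import dataclass
from enum import Enum

class Phase(Enum):
    DESIGN = "design"
    ANALYSIS = "analysis"
    IMPLEMENTATION = "implementation"
    TESTING = "testing"

@dataclass
class ErrorRecord:
    error_id: str
    phase_introduced: Phase  # where the error was made
    phase_detected: Phase    # where it was found
    severity: int            # e.g. 1 (cosmetic) to 5 (critical)
    description: str

# With records like this, one can count directly how many errors introduced
# in design escaped all the way to the testing phase.
```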
Abstract:
A study on changing, in as many respects as possible, the current working platform of a newspaper, in this case El Periódico de Catalunya, and migrating to free software. Some of the most important points are not losing productivity and ensuring that the users' adaptation to the proposed new systems is as smooth and simple as possible.
Abstract:
The advent of multiparametric MRI has made it possible to change the way in which prostate biopsy is done, allowing biopsies to be directed to suspicious lesions rather than taken randomly. The subject of this review is a computer-assisted strategy, MRI/US fusion software-based targeted biopsy, and its performance compared with the other sampling methods. Different devices, with different methods of registering MR images to live TRUS, are currently in use to enable software-based targeted biopsy. The main clinical indications for MRI/US fusion software-based targeted biopsy are re-biopsy in men with persistent suspicion of prostate cancer after a first negative standard biopsy, and the follow-up of patients under active surveillance. Some studies have compared MRI/US fusion software-based targeted biopsy with standard biopsy. In men at risk with an MRI-suspicious lesion, targeted biopsy consistently detects more men with clinically significant disease than standard biopsy; some studies have also shown decreased detection of insignificant disease. Only two studies have directly compared MRI/US fusion software-based targeted biopsy with MRI/US fusion visual targeted biopsy, and the diagnostic ability seems to favor the software approach. To date, no study comparing software-based targeted biopsy against in-bore MRI biopsy is available. The new software-based targeted approach seems to have the characteristics to be added to the standard pathway for achieving accurate risk stratification. Once reproducibility and cost-effectiveness have been verified, the actual issue will be to determine whether MRI/TRUS fusion software-based targeted biopsy represents an add-on test or a replacement for standard TRUS biopsy.
Abstract:
Agile software development methods are currently in vogue, and many software development organizations have already implemented, or are planning to implement, agile methods. The objective of this thesis is to define how agile software development methods can be implemented in a small organization. The agile methods covered in this thesis are Scrum and XP. The key practices of both methods are analysed and compared to the waterfall method. The thesis also defines an implementation strategy and the actions by which agile methods are implemented in a small organization. In practice, the organization must prepare well, and all the needed metrics must be defined before the implementation starts. Three sample projects in which agile methods were implemented are introduced. Experiences from these projects were encouraging, although the sample set of projects was too small to yield trustworthy results.
Abstract:
OBJECTIVE: To present software that allows a detailed analysis of swallowing dynamics. MATERIALS AND METHODS: Ten individuals who had suffered a stroke participated in this study, six of them male, with a mean age of 57.6 years. Videofluoroscopy of swallowing was performed, and the images were digitized on a microcomputer, with subsequent analysis of the pharyngeal transit time of swallowing by means of a chronometer and of the software. RESULTS: The mean pharyngeal transit time of swallowing differed between the two methods used (chronometer and software). CONCLUSION: This software is an instrument for analyzing the time and velocity parameters of swallowing, providing a better understanding of swallowing dynamics, with benefits both for the clinical management of patients with dysphagia and for scientific research purposes.
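The measurement at the core of the software is a transit time derived from the digitized image sequence. A minimal sketch, assuming the time is computed from hypothetical frame indices and the acquisition frame rate:

```python
# Minimal sketch of the time measurement underlying the software: pharyngeal
# transit time derived from frame indices in the digitized videofluoroscopy.
# The frame numbers and frame rate below are hypothetical.
def pharyngeal_transit_time(start_frame: int, end_frame: int, fps: float) -> float:
    """Seconds elapsed between the bolus entering and leaving the pharynx,
    given the acquisition frame rate."""
    return (end_frame - start_frame) / fps

# E.g. bolus enters the pharynx at frame 112 and clears it at frame 139
# in a 30 frames-per-second acquisition:
print(pharyngeal_transit_time(112, 139, fps=30.0))  # -> 0.9 s
```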
Abstract:
OBJECTIVE: A software system called QualIM® (Qualification of Medical Images) was developed for training professionals in the interpretation of digital mammography examinations, using image-manipulation tools on dedicated displays, with findings classified according to BI-RADS®. MATERIALS AND METHODS: The system, developed in Delphi 7, stores the answers given during the interpretation of mammographic images in training and compares them with previously entered "gold standard" data. The system contains computed radiography, direct radiography, and digitized images. The software converts the computed radiography and direct radiography images into TIFF format, preserving the original spatial and contrast resolutions. Trainees manipulate the image enhancement using software tools (zoom, inversion, digital rulers, and others). Depending on the complexity of the case, up to eight mammographic views, six ultrasound images, and two histopathology images are presented. RESULTS: The training started in 2007 and is currently part of the radiology residency program. The software automatically composes a report from the information entered by the professional, based on the BI-RADS categories, and compares it with the database. CONCLUSION: The QualIM software is a digital teaching tool that helps professionals to recognize visual patterns in mammographic images and to interpret mammography examinations using the BI-RADS classification.
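The abstract mentions converting CR/DR images to TIFF while preserving the original spatial and contrast resolution. A minimal sketch of such a conversion (in Python with Pillow as an assumption; the actual system was written in Delphi 7):

```python
# Minimal sketch (in Python/Pillow as an assumption; the actual system was
# written in Delphi 7) of converting a raw 16-bit pixel matrix to TIFF with
# no rescaling or window/level adjustment, so spatial and contrast
# resolution are preserved.
import numpy as np
from PIL import Image

def save_as_tiff(pixels: np.ndarray, path: str) -> None:
    """Write a 16-bit grayscale pixel matrix to TIFF unchanged."""
    Image.fromarray(pixels.astype(np.uint16)).save(path, format="TIFF")

# Toy usage with a synthetic 12-bit image stored in 16-bit words:
save_as_tiff(np.random.randint(0, 4096, (2048, 2048), dtype=np.uint16), "example.tiff")
```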
Abstract:
OBJECTIVE: To evaluate the impact on residents' training of a computational tool dedicated to assessing reading performance on conventional and digital radiological images. MATERIALS AND METHODS: The training was carried out at the Laboratory for Qualification of Medical Images (QualIM). Radiology residents performed about 1,000 readings of a total of 60 images obtained from a statistical phantom (Alvim®) containing fibers and microcalcifications of varying sizes. The residents' performance in detecting these structures was evaluated by means of statistical parameters. RESULTS: The detectability probabilities were 0.789 and 0.818 for the conventional and digital systems, respectively. The false-positive rates were 8% and 6%, and the true-positive rates 66% and 70%, respectively. The overall kappa value was 0.553 for readings on a viewbox and 0.615 for readings on a display. The area under the ROC curve was 0.716 for film reading and 0.810 for display reading. CONCLUSION: The proposed training proved to be effective and had a positive impact on the residents' performance, constituting an interesting pedagogical tool. The results suggest that a training method based on phantom readings can improve professionals' performance in interpreting mammographic images.
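The evaluation uses agreement and detectability statistics (Cohen's kappa, ROC AUC). A minimal sketch of how such metrics are computed, on hypothetical reading data:

```python
# Minimal sketch of the agreement and detectability metrics reported above
# (Cohen's kappa, ROC AUC), computed with scikit-learn on hypothetical data.
import numpy as np
from sklearn.metrics import cohen_kappa_score, roc_auc_score

truth = np.array([1, 0, 1, 1, 0, 0, 1, 0])    # gold standard: structure present?
reading = np.array([1, 0, 1, 0, 0, 0, 1, 1])  # resident's binary call
scores = np.array([0.9, 0.2, 0.8, 0.4, 0.1, 0.3, 0.7, 0.6])  # confidence ratings

print("kappa:", cohen_kappa_score(truth, reading))
print("ROC AUC:", roc_auc_score(truth, scores))
```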
Abstract:
Computed tomography (CT) is an imaging technique in which interest has been growing rapidly since it came into use in the 1970s. Today, it has become an extensively used modality because of its ability to produce accurate diagnostic images. However, even if a direct benefit to patient healthcare is attributed to CT, the dramatic increase in the number of CT examinations performed has raised concerns about the potential negative effects of ionising radiation on the population. Among those negative effects, one of the major remaining risks is the development of cancers associated with exposure to diagnostic X-ray procedures. In order to ensure that the benefit-risk ratio remains in favour of the patient, it is necessary to make sure that the delivered dose leads to the proper diagnosis without producing unnecessarily high-quality images. This optimisation scheme is already an important concern for adult patients, but it must become an even greater priority when examinations are performed on children or young adults, in particular in follow-up studies which require several CT procedures over the patient's life. Indeed, children and young adults are more sensitive to radiation due to their faster metabolism, and harmful consequences have a higher probability of occurring because of a younger patient's longer life expectancy. The recent introduction of iterative reconstruction algorithms, which were designed to substantially reduce dose, is certainly a major achievement in CT evolution, but it has also created difficulties in the quality assessment of the images produced using those algorithms. The goal of the present work was to propose a strategy to investigate the potential of iterative reconstructions to reduce dose without compromising the ability to answer the diagnostic questions. The major difficulty lies in having a clinically relevant way to estimate image quality. To ensure the choice of pertinent image quality criteria, this work was performed in continuous, close collaboration with radiologists. The work began by tackling the way to characterise image quality in musculo-skeletal examinations.
We focused, in particular, on the behaviour of image noise and spatial resolution when iterative image reconstruction was used. The analyses of these physical parameters allowed radiologists to adapt their image acquisition and reconstruction protocols while knowing what loss of image quality to expect. This work also dealt with the loss of low-contrast detectability associated with dose reduction, a major concern in abdominal investigations. Knowing that alternatives to classical Fourier-space metrics had to be used to assess image quality, we focused on the use of mathematical model observers. Our experimental parameters determined the type of model to use. Ideal model observers were applied to characterise image quality when purely objective results about signal detectability were sought, whereas anthropomorphic model observers were used in a more clinical context, when the results had to be compared with the eye of a radiologist, thus taking advantage of their incorporation of elements of the human visual system. This work confirmed that the use of model observers makes it possible to assess image quality using a task-based approach, which, in turn, establishes a bridge between medical physicists and radiologists. It also demonstrated that statistical iterative reconstructions have the potential to reduce the delivered dose without impairing the quality of the diagnosis. Among the different types of iterative reconstructions, model-based ones offer the greatest potential, since images produced using this modality can still lead to an accurate diagnosis even when acquired at very low dose. This work has also clarified the role of medical physicists in CT imaging: standard metrics remain important for assessing unit compliance with legal requirements, but the use of model observers is the way to go when optimising imaging protocols.
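The task-based approach described above rests on mathematical model observers. As a hedged illustration, here is a minimal non-prewhitening (NPW) matched-filter observer, a standard simple model observer (not necessarily one of the models used in this work), estimating a detectability index d' from simulated signal-present and signal-absent images:

```python
# Minimal sketch of a non-prewhitening (NPW) model observer: the template is
# the mean signal difference, and detectability d' is estimated from the
# template responses to signal-present and signal-absent images. All data
# below are simulated for illustration.
import numpy as np

def npw_dprime(present: np.ndarray, absent: np.ndarray) -> float:
    """present/absent: image stacks flattened to shape (n_images, n_pixels)."""
    template = present.mean(axis=0) - absent.mean(axis=0)  # NPW template
    r_p = present @ template                               # responses, signal present
    r_a = absent @ template                                # responses, signal absent
    return (r_p.mean() - r_a.mean()) / np.sqrt(0.5 * (r_p.var() + r_a.var()))

# Toy usage: 64x64 noise images, a faint Gaussian blob as the low-contrast signal.
rng = np.random.default_rng(0)
y, x = np.mgrid[:64, :64]
signal = 0.3 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 50).ravel()
absent = rng.normal(size=(200, 64 * 64))
present = rng.normal(size=(200, 64 * 64)) + signal
print("d' =", npw_dprime(present, absent))
```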