964 results for Libraries -- Computer programs
Abstract:
Background: The repertoire of statistical methods for the descriptive analysis of the burden of a disease has expanded and been implemented in statistical software packages in recent years. The purpose of this paper is to present a web-based tool, REGSTATTOOLS http://regstattools.net, intended to provide analyses of the burden of cancer or of other disease registry data. Three software applications are included in REGSTATTOOLS: SART (analysis of disease rates and their time trends), RiskDiff (analysis of percent changes in rates due to demographic factors and to the risk of developing or dying from a disease) and WAERS (relative survival analysis). Results: We show a real-data application through the assessment of the burden of tobacco-related cancer incidence in two Spanish regions in the period 1995-2004. Using SART, we show that lung cancer is the most common of these cancers, with rising incidence trends among women. We compared 2000-2004 data with those of 1995-1999 to assess percent changes in the number of cases as well as relative survival, using RiskDiff and WAERS, respectively. We show that the net increase in lung cancer cases among women was mainly attributable to an increased risk of developing lung cancer, whereas in men it was attributable to the increase in population size. Among men, lung cancer relative survival was higher in 2000-2004 than in 1995-1999, whereas among women it was similar across these time periods. Conclusions: Unlike other similar applications, REGSTATTOOLS does not require local software installation; it is simple to use, fast and easy to interpret. It is a set of web-based statistical tools intended for automated calculation of population indicators that any professional in the health or social sciences may require.
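By way of illustration, the decomposition that RiskDiff reports can be sketched in a simplified form: a single age group and hypothetical case counts and population sizes; a real analysis would use age-specific rates and also separate the effect of population ageing.

    # Minimal sketch of decomposing a change in case counts into a population-size
    # effect and a risk effect (all figures hypothetical).
    cases_old, pop_old = 1200, 2_000_000   # hypothetical 1995-1999 figures
    cases_new, pop_new = 1500, 2_200_000   # hypothetical 2000-2004 figures

    rate_old = cases_old / pop_old
    rate_new = cases_new / pop_new

    net_change = cases_new - cases_old
    due_to_population = rate_old * (pop_new - pop_old)   # more people, same risk
    due_to_risk = (rate_new - rate_old) * pop_new        # changed risk in the new population

    print(net_change, due_to_population, due_to_risk)    # the two terms sum to the net change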
Abstract:
Background: Nowadays, combining different sources of information to improve the available biological knowledge is a challenge in bioinformatics. Kernel-based methods are among the most powerful approaches for integrating heterogeneous data types. Kernel-based data integration consists of two basic steps: first, an appropriate kernel is chosen for each data set; second, the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results: We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables belonging to any of the datasets. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify the samples with higher or lower values of the variables analyzed. Conclusions: The integration of different datasets and the simultaneous representation of samples and variables together give a better understanding of the biological knowledge.
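As an illustration of the two-step kernel integration described above, the following Python sketch (using scikit-learn, with random stand-in data and equal kernel weights assumed) combines one RBF kernel per data source and feeds the summed kernel to kernel PCA; the paper's additional projection of input variables onto the plot is not reproduced here.

    import numpy as np
    from sklearn.metrics.pairwise import rbf_kernel
    from sklearn.decomposition import KernelPCA

    rng = np.random.default_rng(0)
    X_expr = rng.normal(size=(60, 200))   # stand-in for an expression-like data source
    X_clin = rng.normal(size=(60, 15))    # stand-in for a clinical-like data source

    # Step 1: one kernel per data source; step 2: combine them (equal weights assumed).
    K = 0.5 * rbf_kernel(X_expr) + 0.5 * rbf_kernel(X_clin)

    kpca = KernelPCA(n_components=2, kernel="precomputed")
    Z = kpca.fit_transform(K)             # low-dimensional representation of the samples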
Abstract:
Java™ 2 Platform, Micro Edition is one of the leading application platforms that allow third-party applications to be created for mobile phones, communicators and handheld computers. A key advantage of the Java platform is dynamic application installation: the user is not limited to pre-installed applications but can install new ones as needed. This thesis examines different methods for downloading and installing Java applications (MIDlets). It gives an overview of the most significant provisioning technologies, with the main emphasis on Over-The-Air provisioning as defined by the MIDP standard, since it is the most widely used method. Other methods covered are WAP Push and local installation over Bluetooth and the infrared link. MIDlets, like any software, are vulnerable to illegal copying. The thesis describes mechanisms by which illegal copying can be prevented, one example being the OMA™ DRM standard, and also how copy protection can be combined with the existing provisioning methods. Java applications, MIDlets, are being used for ever more varied purposes, which also calls for new provisioning methods. One such method is installation from accessory devices. The thesis describes several methods for installing applications from accessory devices; the methods discussed are based on Bluetooth technology and, with one exception, rely on the standard Bluetooth profiles File Transfer Profile, Personal Area Networking Profile and Object Push Profile. Another installation approach is forwarding applications onward to another phone. The thesis describes how the OMA DRM standard can be combined with this kind of installation and proposes two alternative methods, one based on the Bluetooth Object Push Profile or the infrared link and the other on multimedia messaging and e-mail.
Abstract:
The thesis analysed the differences in returns between a total of 73 technical-analysis strategy variants and a buy-and-hold strategy computed over the same period, using a data set consisting of the daily closing prices of the shares of 43 companies quoted on the main list of the Helsinki Stock Exchange from 1991 to 1998. The empirical tests were carried out with Pascal programs written for the thesis, which simulated daily trading according to the different technical-analysis strategies. The results showed that the technical-analysis strategies would not have reached the return level of the buy-and-hold strategy over the examined period, as only one of the strategies exceeded it. The negative correlation between the number of trades generated by each technical-analysis strategy and the profitability of the strategy was very strong: the higher the signal sensitivity, the weaker the result of the strategy. The results thus supported the weak-form market efficiency hypothesis, according to which past price information cannot be exploited for monetary gain.
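The trading simulations themselves were written in Pascal; as a rough illustration of the kind of rule they implement, the following Python sketch compares a simple long-only moving-average crossover strategy with buy-and-hold on a synthetic price series (parameters and data are hypothetical, not those of the thesis).

    import numpy as np
    import pandas as pd

    def crossover_vs_buy_and_hold(close, short=5, long=20):
        """Total return of a long-only moving-average crossover rule vs. buy-and-hold."""
        close = pd.Series(close, dtype=float)
        in_market = (close.rolling(short).mean() > close.rolling(long).mean()).astype(int)
        daily_ret = close.pct_change().fillna(0.0)
        strategy_ret = float((1 + daily_ret * in_market.shift(1).fillna(0)).prod() - 1)
        hold_ret = float(close.iloc[-1] / close.iloc[0] - 1)
        return strategy_ret, hold_ret

    prices = 10 + np.cumsum(np.random.default_rng(1).normal(0, 0.1, 500))  # synthetic closes
    print(crossover_vs_buy_and_hold(prices))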
Abstract:
This work presents new, efficient Markov chain Monte Carlo (MCMC) simulation methods for statistical analysis in various modelling applications. When using MCMC methods, the model is simulated repeatedly to explore the probability distribution describing the uncertainties in model parameters and predictions. In adaptive MCMC methods based on the Metropolis-Hastings algorithm, the proposal distribution needed by the algorithm learns from the target distribution as the simulation proceeds. Adaptive MCMC methods have been the subject of intensive research lately, as they open a way for essentially easier use of the methodology. The lack of user-friendly computer programs has been a main obstacle to wider acceptance of the methods. This work provides two new adaptive MCMC methods: DRAM and AARJ. The DRAM method has been built especially to work in high-dimensional and non-linear problems. The AARJ method is an extension to DRAM for model selection problems, where the mathematical formulation of the model is uncertain and we want to fit several different models to the same observations simultaneously. The methods were developed with the needs of modelling applications typical of environmental sciences in mind, and the development work was pursued while working on several application projects. The applications presented in this work are: a wintertime oxygen concentration model for Lake Tuusulanjärvi and adaptive control of the aerator; a nutrition model for Lake Pyhäjärvi and lake management planning; and validation of the algorithms of the GOMOS ozone remote sensing instrument on board the Envisat satellite of the European Space Agency, together with a study of the effects of aerosol model selection on the GOMOS algorithm.
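As an illustration of the adaptive idea underlying DRAM, the following Python sketch implements a plain adaptive Metropolis sampler in which the proposal covariance is learned from the chain history; the delayed-rejection stage of DRAM and the reversible-jump extension of AARJ are omitted.

    import numpy as np

    def adaptive_metropolis(log_post, x0, n_iter=5000, adapt_start=500, eps=1e-6):
        """Adaptive Metropolis: the proposal covariance adapts to the chain history."""
        x = np.asarray(x0, dtype=float)
        d = x.size
        samples = np.empty((n_iter, d))
        lp = log_post(x)
        cov = np.eye(d)
        scale = 2.4 ** 2 / d                      # common adaptive-Metropolis scaling
        for i in range(n_iter):
            if i >= adapt_start:                  # adapt only after an initial period
                cov = np.cov(samples[:i].T) + eps * np.eye(d)
            proposal = np.random.multivariate_normal(x, scale * cov)
            lp_prop = log_post(proposal)
            if np.log(np.random.rand()) < lp_prop - lp:
                x, lp = proposal, lp_prop         # accept the move
            samples[i] = x
        return samples

    # usage: sample a two-dimensional Gaussian target
    chain = adaptive_metropolis(lambda t: -0.5 * np.sum(t ** 2), x0=[5.0, -5.0])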
Abstract:
Computer programs are protected by copyright, trade-secret protection and patents. In order to succeed in the dynamic and international software market, a software company needs to patent its programs and to exploit and defend its patents. Software patents are also being granted in increasing numbers in Europe. In the software industry, product development often builds on what already exists, which leads to the overlapping of technologies typical of the field. To be able to operate in a particular market area, a company may need technology that someone else has already patented. For these reasons, and because of the similarity of software patents and the large number of patents, patent infringements occur, and they should be responded to with business relationships in mind, for example by negotiating a business agreement, by settling the conflict through mediation and, where necessary, by legal means.
Abstract:
In this paper we describe three computer programs, written in the Basic language, dealing with the Fourier transform (FFT), which have been available on the Internet at http://artemis.ffclrp.usp.br/SoftwareE.htm (in English) or http://artemis.ffclrp.usp.br/softwareP.htm (in Portuguese) since October 1998. These addresses point to the web page of our Laboratory of Organic Synthesis. The programs can be downloaded and used by anyone interested in the subject. The texts, menus and captions in the programs are written in English.
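The programs themselves are written in Basic; as a minimal illustration of what such a Fourier-transform program computes, the following Python/NumPy sketch recovers the dominant frequency of a synthetic two-tone signal.

    import numpy as np

    t = np.linspace(0, 1, 256, endpoint=False)                 # 1 s of signal, 256 samples
    signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
    spectrum = np.fft.rfft(signal)                             # discrete Fourier transform
    freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])             # corresponding frequency axis
    print(freqs[np.argmax(np.abs(spectrum))])                  # dominant component, ~10 Hz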
Abstract:
Neural networks are a set of mathematical methods and computer programs designed to simulate the information processing and knowledge acquisition of the human brain. In recent years their application in chemistry has increased significantly, owing to their particular suitability for modelling complex systems. The basic principles of two types of neural networks, multi-layer perceptrons and radial basis function networks, are introduced, as well as a pruning approach to architecture optimization. Two analytical applications based on near-infrared spectroscopy are presented: the first for the determination of nitrogen content in wheat leaves using multi-layer perceptron networks, and the second for the determination of Brix in sugar cane juice using radial basis function networks.
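As an illustration of the first type of application, the following Python sketch (scikit-learn, with random numbers standing in for the NIR spectra and the reference property) fits a small multi-layer perceptron regressor; it is not the authors' model or data.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.random((200, 100))                      # stand-in for 100-channel NIR spectra
    y = 3.0 * X[:, 10] + rng.normal(0, 0.05, 200)   # stand-in reference property (e.g. N content)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    mlp = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=0)
    mlp.fit(X_tr, y_tr)
    print("R2 on held-out spectra:", mlp.score(X_te, y_te))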
Abstract:
Molecular modeling is an important tool in drug design and is very useful for predicting biological activity from a library of compounds. A wide variety of computer programs and methods have been developed to visualize the three-dimensional geometry of drugs and calculate their physical properties. In this work, we describe a practical approach to molecular modeling as a powerful tool for studying structure-activity relationships of drugs, including some antibacterials, hormones, and cholinergic and adrenergic agents. First, the students learn how to draw 3D structures and use them to perform conformational and molecular analyses. They then compare drugs with similar pharmacological activity by superimposing one structure on top of another and evaluating their geometry and physical properties.
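A comparable exercise can be sketched with an open-source toolkit such as RDKit (not mentioned in the article); the snippet below builds a 3D conformer for a hypothetical example molecule and computes a few physical properties of the kind students would evaluate.

    from rdkit import Chem
    from rdkit.Chem import AllChem, Descriptors

    # Hypothetical example molecule (acetylsalicylic acid, given as a SMILES string)
    mol = Chem.AddHs(Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O"))
    AllChem.EmbedMolecule(mol, randomSeed=42)       # generate a 3D conformer
    AllChem.MMFFOptimizeMolecule(mol)               # relax the geometry with the MMFF force field

    print("molecular weight:", Descriptors.MolWt(mol))
    print("logP estimate:", Descriptors.MolLogP(mol))
    print("polar surface area:", Descriptors.TPSA(mol))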
Abstract:
For some years, chemistry teachers have used scientific visualization software for molecular models in computer rooms and chemistry laboratories for educational purposes. However, its application in classrooms has been limited. This article describes the integration and use of computer programs for scientific molecular visualization in a traditional classroom. We consider that improving the technical aspects of their application and use (usability) has a direct effect on students' understanding of molecular structures (including students' extrinsic motivation), among other factors. Consequently, we developed a guide for integrating molecular visualization hardware and software for use in the classroom.
Abstract:
We report a didactic experience in teaching Pearson's hard and soft acids and bases (HSAB) theory to graduate students in organic chemistry. The approach was based on teaching students how to use computer programs to calculate frontier orbitals (HOMO-LUMO). The suggested level of calculation was the semi-empirical PM3 method, which proved efficient for obtaining robust numerical results quickly and can easily be used in the classroom. We describe a practical computational exercise in which students were asked to compare the numerical data with a qualitative analysis based on valence bond theory. A comprehensive solution of this exercise is presented, aiming to support teachers in their lessons.
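The article's calculations use the semi-empirical PM3 method; as a stand-in illustration of extracting HOMO and LUMO energies from an electronic-structure calculation, the following sketch uses PySCF at the Hartree-Fock/STO-3G level on water (an assumed example, not the exercise from the article).

    from pyscf import gto, scf

    # Water as a hypothetical test molecule; HF/STO-3G stands in for the PM3 level.
    mol = gto.M(atom="O 0 0 0; H 0 -0.757 0.587; H 0 0.757 0.587", basis="sto-3g")
    mf = scf.RHF(mol).run()

    nocc = mol.nelectron // 2                 # number of doubly occupied orbitals
    homo = mf.mo_energy[nocc - 1]             # highest occupied molecular orbital energy (hartree)
    lumo = mf.mo_energy[nocc]                 # lowest unoccupied molecular orbital energy (hartree)
    print("HOMO-LUMO gap (hartree):", lumo - homo)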
Abstract:
The computer is a useful tool in the teaching of upper secondary school physics, and should not have a subordinate role in students' learning process. However, computers and computer-based tools are often not available when they could serve their purpose best in the ongoing teaching. Another problem is the fact that commercially available tools are not usable in the way the teacher wants. The aim of this thesis was to try out a novel teaching scenario in a complicated subject in physics, electrodynamics. The didactic engineering of the thesis consisted of developing a computer-based simulation and training material, implementing the tool in physics teaching and investigating its effectiveness in the learning process. The design-based research method, didactic engineering (Artigue, 1994), which is based on the theory of didactical situations (Brousseau, 1997), was used as a frame of reference for the design of this type of teaching product. In designing the simulation tool a general spreadsheet program was used. The design was based on parallel, dynamic representations of the physics behind the function of an AC series circuit in both graphical and numerical form. The tool, which was furnished with possibilities to control the representations in an interactive way, was hypothesized to activate the students and promote the effectiveness of their learning. An effect variable was constructed in order to measure the students' and teachers' conceptions of learning effectiveness. The empirical study was twofold. Twelve physics students, who attended a course in electrodynamics in an upper secondary school, participated in a class experiment with the computer-based tool implemented in three modes of didactical situations: practice, concept introduction and assessment. The main goal of the didactical situations was to have students solve problems and study the function of AC series circuits, taking responsibility for their own learning process. In the teacher study eighteen Swedish-speaking physics teachers evaluated the didactic potential of the computer-based tool and the accompanying paper-based material without using them in their physics teaching. Quantitative and qualitative data were collected using questionnaires, observations and interviews. The result of the studies showed that both the group of students and the teachers had generally positive conceptions of learning effectiveness. The students' conceptions were more positive in the practice situation than in the concept introduction situation, a setting that was more explorative. However, it turned out that the students' conceptions were also positive in the more complex assessment situation. This had not been hypothesized. A deeper analysis of data from observations and interviews showed that one of the students in each pair was more active than the other, taking more initiative and more responsibility for the student-student and student-computer interaction. These active students had strong, positive conceptions of learning effectiveness in each of the three didactical situations. The group of less active students had a weak but positive conception in the first two situations, but a negative conception in the assessment situation, thus corroborating the hypothesis ad hoc. The teacher study revealed that computers were seldom used in physics teaching and that computer programs were in short supply. The use of a computer was considered time-consuming.
As long as physics teaching with computer-based tools has to take place in special computer rooms, the use of such tools will remain limited. The affordance is enhanced when the physical dimensions as well as the performance of the computer are optimised. As a consequence, the computer then becomes a real learning tool for each pair of students, smoothly integrated into the ongoing teaching in the same space where teaching normally takes place. With more interactive support from the teacher, the computer-based parallel, dynamic representations will be efficient in promoting the learning process of the students with focus on qualitative reasoning - an often neglected part of the learning process of the students in upper secondary school physics.
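The physics behind the simulation tool, the frequency response of an AC (RLC) series circuit, can be sketched in a few lines of Python with assumed component values; the spreadsheet tool presents the same quantities as parallel graphical and numerical representations.

    import numpy as np

    # Assumed component and source values for an AC series (RLC) circuit
    R, L, C = 100.0, 0.5, 1.0e-6               # resistance (ohm), inductance (H), capacitance (F)
    U = 10.0                                    # source voltage amplitude (V)
    f = np.linspace(10.0, 2000.0, 500)          # frequency sweep (Hz)

    w = 2 * np.pi * f
    Z = R + 1j * (w * L - 1.0 / (w * C))        # complex impedance of the series circuit
    I = U / np.abs(Z)                           # current amplitude at each frequency
    phase = np.degrees(np.angle(Z))             # phase shift between voltage and current

    f_res = 1.0 / (2 * np.pi * np.sqrt(L * C))  # resonance frequency, here about 225 Hz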
Abstract:
BACKGROUND: Simulation techniques are spreading rapidly in medicine. Such resources are increasingly concentrated in Simulation Laboratories. The MSRP-USP is structuring such a laboratory and is interested in the prevalence of individual initiatives that could be centralized there. The MSRP-USP currently has five full-curriculum courses in the health sciences: Medicine, Speech Therapy, Physical Therapy, Nutrition, and Occupational Therapy, all consisting of core disciplines. GOAL: To determine the prevalence of simulation techniques in the regular courses at MSRP-USP. METHODS: Coordinators of disciplines in the various courses were interviewed using a specifically designed semi-structured questionnaire, and all the collected data were stored in a dedicated database. The disciplines were grouped according to whether they used (GI) or did not use (GII) simulation resources. RESULTS AND DISCUSSION: 256 disciplines were analyzed, of which only 18.3% used simulation techniques, varying according to course: Medicine (24.7%), Occupational Therapy (23.0%), Nutrition (15.9%), Physical Therapy (9.8%), and Speech Therapy (9.1%). Computer simulation programs predominated (42.5%) in all five courses. The resources were provided mainly by MSRP-USP (56.3%), with additional funding coming from other sources based on individual initiatives; the same pattern was observed for maintenance. There was great interest in centralizing the resources in the new Simulation Laboratory in order to facilitate maintenance, but there was concern about training and access to the material. CONCLUSIONS: 1) The MSRP-USP simulation resources show low complexity and are mainly limited to computer programs; 2) Use of simulation varies according to course and is most prevalent in Medicine; 3) Resources are scattered across several locations, and their acquisition and maintenance depend on individual initiatives rather than central coordination or curricular guidelines.
Abstract:
Formal methods provide a means of reasoning about computer programs in order to prove correctness criteria. One subtype of formal methods is based on the weakest-precondition predicate transformer semantics and uses guarded commands as the basic modelling construct. Examples of such formalisms are Action Systems and Event-B. Guarded commands can intuitively be understood as actions that may be triggered when an associated guard condition holds. Guarded commands whose guards hold are nondeterministically chosen for execution, but no further control flow is present by default. Such a modelling approach is convenient for proving correctness, and the Refinement Calculus allows for a stepwise development method. It also has a parallel interpretation facilitating the development of concurrent software, and it is suitable for describing event-driven scenarios. However, for many application areas, the execution paradigm traditionally used comprises more explicit control flow, which constitutes an obstacle to using the above-mentioned formal methods. In this thesis, we study how guarded-command-based modelling approaches can be conveniently and efficiently scheduled in different scenarios. We first focus on the modelling of trust for transactions in a social networking setting. Due to the event-based nature of the scenario, the use of guarded commands turns out to be relatively straightforward. We continue by studying the modelling of concurrent software, with particular focus on compute-intensive scenarios. We go from theoretical considerations to the feasibility of implementation by evaluating the performance and scalability of executing a case study model in parallel using automatic scheduling performed by a dedicated scheduler. Finally, we propose a more explicit and non-centralised approach in which the flow of each task is controlled by a schedule of its own. The schedules are expressed in a dedicated scheduling language, and patterns assist the developer in proving correctness of the scheduled model with respect to the original one.
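As an illustration of the execution model described above, the following Python sketch runs a set of guarded commands by repeatedly making a nondeterministic choice among the actions whose guards hold; it is a toy interpreter, not the scheduling framework proposed in the thesis.

    import random

    def run(state, actions, max_steps=20):
        """Repeatedly choose an enabled guarded command at random and execute its body."""
        for _ in range(max_steps):
            enabled = [body for guard, body in actions if guard(state)]
            if not enabled:
                break                              # no guard holds: the system terminates
            random.choice(enabled)(state)          # nondeterministic choice among enabled actions
        return state

    # usage: two actions incrementing counters while they stay below a bound
    actions = [
        (lambda s: s["x"] < 3, lambda s: s.update(x=s["x"] + 1)),
        (lambda s: s["y"] < 3, lambda s: s.update(y=s["y"] + 1)),
    ]
    print(run({"x": 0, "y": 0}, actions))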
Abstract:
The aim of the study was to examine the effect of a KIBS (knowledge-intensive business services) firm's discourses on the management of its service innovation process. The context of this work was an accounting firm, which was compared with the other KIBS firms in the study. Interviews were used as the research method and discourse analysis as the analytical approach. The results show that the accounting firm's discourses rely strongly on the power of information technology and training in developing service innovations. For innovation management this may mean a narrowing of the field of view. Incentives for innovation observed in the accounting firm included, for example, the autonomy of the work, whereas excessive time pressure hindered service innovations. The management of the service innovation process, as reflected in the accounting firm's discourses, was fairly undeveloped compared with the most innovative company in the study, which designed building services software.