960 results for Foundations Computer programs
Abstract:
Over 70% of the total costs of an end product are consequences of decisions made during the design process. A search for optimal cross-sections will often have only a marginal effect on the amount of material used if the geometry of a structure is fixed and if the cross-sectional characteristics of its elements are properly designed by conventional methods. In recent years, optimal geometry has become a central area of research in the automated design of structures. It is generally accepted that no single optimisation algorithm is suitable for all engineering design problems; an appropriate algorithm must therefore be selected individually for each optimisation situation. Modelling is the most time-consuming phase in the optimisation of steel and metal structures. In this research, the goal was to develop a method and computer program that reduce the modelling and optimisation time for structural design. The program needed an optimisation algorithm suitable for various engineering design problems. Because finite element modelling is commonly used in the design of steel and metal structures, the interaction between a finite element tool and an optimisation tool needed a practical solution. The developed method and computer programs were tested with standard optimisation tests and practical design optimisation cases. Three generations of computer programs were developed; they combine an optimisation problem modelling tool and an FE-modelling program using three alternative methods. The modelling and optimisation were demonstrated in the design of a new boom construction and in the steel structures of flat and ridge roofs. This thesis demonstrates that the modelling time, the most time-consuming part of the process, is significantly reduced, that modelling errors are reduced, and that the results are more reliable. A new selection rule for the evolution algorithm, which eliminates the need for constraint weight factors, is tested on optimisation cases of steel structures that include hundreds of constraints. The tests show that the algorithm can be used nearly as a black box, without parameter settings or constraint penalty factors.
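The abstract does not reproduce the selection rule itself; as an illustration only, the following Python sketch shows a feasibility-based pairwise comparison of the kind described, in which candidates are ranked without constraint weight or penalty factors. The function names and details are assumptions, not the thesis's implementation.

```python
# Hypothetical sketch of a penalty-free selection rule for an evolutionary
# algorithm: feasibility is compared first, then total constraint violation,
# and only feasible candidates compete on the objective value.
def total_violation(constraints):
    """Sum of the positive parts of constraint values g_i(x) <= 0."""
    return sum(max(0.0, g) for g in constraints)

def better(a, b):
    """Return the preferred candidate; each candidate is a dict with
    'objective' (to be minimised) and 'constraints' (values of g_i(x) <= 0)."""
    va = total_violation(a["constraints"])
    vb = total_violation(b["constraints"])
    if va == 0.0 and vb == 0.0:          # both feasible: compare objectives
        return a if a["objective"] <= b["objective"] else b
    if va == 0.0:                        # only a is feasible
        return a
    if vb == 0.0:                        # only b is feasible
        return b
    return a if va <= vb else b          # both infeasible: smaller violation wins
```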
Abstract:
The article presents the main characteristics and functionalities of MetaLib and SFX, software tools for managing and providing access to electronic resources. The tools are analysed by reviewing their different types of functionalities and their management and maintenance. The process is framed within the experience of the Universitat Oberta de Catalunya, a member of the CBUC, and the configuration, the challenges and the difficulties encountered during the implementation of the systems are explained.
Abstract:
The implementation of Metalib in the UOC's Library was framed within a project centred on the end user. To adapt Metalib to the users' needs and search behaviour, we carried out three tests: one with students, conducted by a consulting company; a second with teachers, carried out by the Library; and a third with teachers and staff, also carried out by the Library, after some changes had been made as a result of the first two tests. The tests, run with the software Morae, highlighted a number of difficulties in understanding the concepts related to metasearching, as well as difficulties in navigating and in managing search results. All the information gathered was used to improve navigation through the interface. The presentation describes the tests' methodology, the conclusions reached through this experience, and the decisions finally taken by the Library to improve the Metalib interface.
Abstract:
This paper presents a reflection on the need for libraries to think about how to facilitate access to the documentary sources they manage. As the number of resources available in electronic form increases, libraries need to provide a simple and usable search tool that integrates the contents of the various information management systems they give access to. To define user expectations of the search interface, some of the features users are accustomed to in their requests for information on the Internet are considered. The technologies that allow a discovery layer to be implemented as a search tool integrating the library's various information systems are then presented, followed by some examples of implementations that integrate various information sources into a single search engine, as models to consider when implementing a system of this kind. The overall purpose is to present the state of the art of some operational deployments as a starting point for any organization interested in improving the access it offers to its resources on the basis of the cases reviewed.
Abstract:
This report focuses on a comparative analysis of the different project management software tools on the market.
Abstract:
Background: The repertoire of statistical methods for the descriptive analysis of the burden of a disease has expanded and been implemented in statistical software packages in recent years. The purpose of this paper is to present a web-based tool, REGSTATTOOLS (http://regstattools.net), intended to provide analyses of the burden of cancer, or of other groups of diseases, from registry data. Three software applications are included in REGSTATTOOLS: SART (analysis of disease rates and their time trends), RiskDiff (analysis of percent changes in rates due to demographic factors and to the risk of developing or dying from a disease) and WAERS (relative survival analysis). Results: We show a real-data application through the assessment of the burden of tobacco-related cancer incidence in two Spanish regions in the period 1995-2004. Using SART, we show that lung cancer is the most common of these cancers, with rising incidence trends among women. We compared 2000-2004 data with those of 1995-1999 to assess percent changes in the number of cases, as well as relative survival, using RiskDiff and WAERS, respectively. We show that the net increase in lung cancer cases among women was mainly attributable to an increased risk of developing lung cancer, whereas in men it was attributable to the increase in population size. Among men, lung cancer relative survival was higher in 2000-2004 than in 1995-1999, whereas it was similar among women when these time periods were compared. Conclusions: Unlike other similar applications, REGSTATTOOLS does not require local software installation and is simple to use, fast and easy to interpret. It is a set of web-based statistical tools intended for the automated calculation of population indicators that any professional in the health or social sciences may require.
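As an illustration of the kind of decomposition RiskDiff reports, and not the REGSTATTOOLS code, the following Python sketch (with purely invented numbers) splits a change in case counts into a part explained by population size and a part explained by underlying risk; the real tool also accounts for changes in age structure.

```python
# Hypothetical sketch of a RiskDiff-style decomposition with invented numbers.
cases_9599, pop_9599 = 1000, 2_000_000      # illustrative counts, not real data
cases_0004, pop_0004 = 1300, 2_100_000

rate_9599 = cases_9599 / pop_9599           # crude rate, 1995-1999
rate_0004 = cases_0004 / pop_0004           # crude rate, 2000-2004

expected_0004 = rate_9599 * pop_0004        # cases expected if only the population had changed
due_to_population = expected_0004 - cases_9599
due_to_risk = cases_0004 - expected_0004

print(f"total change: {cases_0004 - cases_9599:+d} cases")
print(f"  attributable to population size: {due_to_population:+.1f}")
print(f"  attributable to change in risk:  {due_to_risk:+.1f}")
```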
Abstract:
Background: Combining different sources of information to improve the available biological knowledge is currently a challenge in bioinformatics. Kernel-based methods are among the most powerful approaches for integrating heterogeneous data types. Kernel-based data integration consists of two basic steps: first, a suitable kernel is chosen for each data set; second, the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results: We analyse the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot a representation of the input variables belonging to each dataset. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify those samples with higher or lower values of the variables analysed. Conclusions: The integration of different datasets and the simultaneous representation of samples and variables give us a better understanding of biological knowledge.
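A minimal sketch of this kind of kernel-based integration, not the authors' code: two simulated data sources each get their own kernel, the kernels are combined, and kernel PCA is run on the precomputed combination to reduce dimensionality over all sources at once.

```python
# Minimal sketch: sum two trace-normalised kernels and apply kernel PCA.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
X_expr = rng.normal(size=(50, 200))     # e.g. expression-like data (simulated)
X_clin = rng.normal(size=(50, 10))      # e.g. clinical-like data (simulated)

K1 = rbf_kernel(X_expr)                 # one kernel per data source
K2 = linear_kernel(X_clin)
K = K1 / np.trace(K1) + K2 / np.trace(K2)   # simple unweighted combination

# Kernel PCA on the precomputed combined kernel gives a low-dimensional
# representation of the samples across both data sources.
scores = KernelPCA(n_components=2, kernel="precomputed").fit_transform(K)
print(scores.shape)                     # (50, 2)
```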
Abstract:
Java™ 2 Platform, Micro Edition is one of the leading application platforms enabling third-party applications to be created for mobile phones, communicators and handheld computers. A key advantage of the Java platform is the dynamic installation of applications: the user is not limited to pre-installed applications but can install new ones as needed. This thesis deals with different methods of downloading and installing Java applications (MIDlets) and gives an overview of the most significant installation technologies. The main focus is on Over-The-Air provisioning as defined by the MIDP standard, since it is the most widely used method; other methods covered are WAP Push and local installation over Bluetooth and infrared links. MIDlets, like any software, are vulnerable to illegal copying. The thesis describes methods by which illegal copying can be prevented, one example being the OMA™ DRM standard, and shows how copy protection can be combined with the existing installation methods. Java applications, MIDlets, are used for ever more varied purposes, which creates a need for new installation methods. One such method is installation from separate devices. The thesis describes several methods for installation from separate devices; the methods discussed are based on Bluetooth technology and, with one exception, rely on the standardised Bluetooth profiles File Transfer Profile, Personal Area Networking Profile and Object Push Profile. Another installation approach is forwarding applications to another phone. The thesis describes how the OMA DRM standard can be combined with this kind of installation and proposes two alternative methods: one based on the Bluetooth Object Push Profile and the infrared link, and the other on multimedia messaging and e-mail.
Abstract:
The study analysed the differences in returns between a total of 73 technical analysis strategy variations and a buy-and-hold strategy calculated over the same period, using data consisting of the daily closing prices of 43 companies listed on the main list of the Helsinki Stock Exchange from 1991 to 1998. The empirical tests were carried out with Pascal programs written for the study, which simulated daily trading according to the different technical analysis strategies. The results showed that the technical analysis strategies would not have reached the return level of the buy-and-hold strategy over the period examined, as only one of the strategies exceeded the buy-and-hold return. The negative correlation between the number of trades generated by each technical analysis strategy and its profitability was very strong: the higher the signal sensitivity, the weaker the result of the strategy. The results thus supported the weak form of the market efficiency hypothesis, according to which past price information cannot be exploited for monetary gain.
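For illustration only (the study used purpose-built Pascal programs and real closing prices), the following Python sketch simulates daily trading with a single moving-average crossover rule and compares it with buy-and-hold on a simulated price series.

```python
# Illustrative sketch: moving-average rule versus buy-and-hold on simulated prices.
import numpy as np

rng = np.random.default_rng(1)
prices = 10 * np.exp(np.cumsum(rng.normal(0.0003, 0.02, size=2000)))  # simulated closes

window = 50
ma = np.convolve(prices, np.ones(window) / window, mode="valid")      # moving average
signal = (prices[window - 1:] > ma).astype(int)                       # 1 = long, 0 = out

daily_ret = np.diff(prices[window - 1:]) / prices[window - 1:-1]
strategy_ret = np.prod(1 + signal[:-1] * daily_ret) - 1               # rule's total return
buy_hold_ret = prices[-1] / prices[window - 1] - 1                    # buy-and-hold return

print(f"moving-average rule: {strategy_ret:+.1%}, buy and hold: {buy_hold_ret:+.1%}")
```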
Abstract:
This work presents new, efficient Markov chain Monte Carlo (MCMC) simulation methods for statistical analysis in various modelling applications. When using MCMC methods, the model is simulated repeatedly to explore the probability distribution describing the uncertainties in model parameters and predictions. In adaptive MCMC methods based on the Metropolis-Hastings algorithm, the proposal distribution needed by the algorithm learns from the target distribution as the simulation proceeds. Adaptive MCMC methods have been the subject of intensive research lately, as they make the methodology essentially easier to use; the lack of user-friendly computer programs has been a main obstacle to wider acceptance of the methods. This work provides two new adaptive MCMC methods: DRAM and AARJ. The DRAM method has been built especially to work in high-dimensional and non-linear problems. The AARJ method is an extension to DRAM for model selection problems, where the mathematical formulation of the model is uncertain and we want to fit several different models to the same observations simultaneously. The methods were developed with the needs of modelling applications typical of environmental sciences in mind, and the development work was pursued alongside several application projects. The applications presented in this work are: a winter-time oxygen concentration model for Lake Tuusulanjärvi and adaptive control of the aerator; a nutrition model for Lake Pyhäjärvi and lake management planning; and validation of the algorithms of the GOMOS ozone remote sensing instrument on board the Envisat satellite of the European Space Agency, together with a study of the effects of aerosol model selection on the GOMOS algorithm.
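As a rough illustration of how an adaptive Metropolis-style sampler learns its proposal from the chain history (a much simpler scheme than DRAM, and not the authors' code), consider the following Python sketch with a hypothetical bivariate normal target.

```python
# Minimal adaptive random-walk Metropolis sketch: the proposal covariance is
# re-estimated from the chain history as the simulation proceeds.
import numpy as np

def log_target(x):
    """Hypothetical target: standard bivariate normal log-density (unnormalised)."""
    return -0.5 * np.dot(x, x)

def adaptive_metropolis(n_iter=5000, dim=2, adapt_start=500, seed=0):
    rng = np.random.default_rng(seed)
    chain = np.zeros((n_iter, dim))
    x, logp = np.zeros(dim), log_target(np.zeros(dim))
    cov = np.eye(dim) * 0.1                              # initial proposal covariance
    for i in range(n_iter):
        if i >= adapt_start:                             # adapt from the history so far
            cov = np.cov(chain[:i].T) * 2.38**2 / dim + 1e-8 * np.eye(dim)
        prop = rng.multivariate_normal(x, cov)
        logp_prop = log_target(prop)
        if np.log(rng.uniform()) < logp_prop - logp:     # Metropolis acceptance step
            x, logp = prop, logp_prop
        chain[i] = x
    return chain

samples = adaptive_metropolis()
print(samples.mean(axis=0), samples.std(axis=0))
```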
Abstract:
A computer program is protected by copyright, trade secret protection and patents. For a software company to succeed in the dynamic and international software markets, it must patent its programs and exploit and defend its patents. Software patents are also being granted in increasing numbers in Europe. In the software industry, product development often builds on what already exists, which leads to the overlapping of technologies typical of the field. In order to operate in a given market area, a company may need technology that has already been patented by someone else. For these reasons, and because of the similarity of software patents and their large number, patent infringements occur and must be responded to with business relationships in mind, for example by negotiating a business agreement, by settling the conflict through mediation and, where necessary, by legal means.
Abstract:
In this paper we describe three computer programs, written in the Basic language, on the Fourier transform (FFT); they have been available since October 1998 on the web page of our Laboratory of Organic Synthesis at http://artemis.ffclrp.usp.br/SoftwareE.htm (in English) or http://artemis.ffclrp.usp.br/softwareP.htm (in Portuguese). The programs can be downloaded and used by anyone interested in the subject. The texts, menus and captions in the programs are written in English.
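The Basic programs themselves are available at the URLs above; as a present-day illustration of the same computation, this Python sketch uses NumPy's FFT to recover the dominant frequency of a simulated signal.

```python
# FFT of a simulated signal: recover its dominant frequency component.
import numpy as np

fs = 1000                                    # sampling frequency, Hz
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.rfft(signal)               # FFT of a real-valued signal
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

peak = freqs[np.argmax(np.abs(spectrum))]
print(f"dominant frequency: {peak:.0f} Hz")  # expected: 50 Hz
```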
Abstract:
Neural networks are a set of mathematical methods and computer programs designed to simulate the information processing and knowledge acquisition of the human brain. In recent years their application in chemistry has increased significantly, owing to their particular suitability for modelling complex systems. The basic principles of two types of neural network, multi-layer perceptrons and radial basis function networks, are introduced, as well as a pruning approach to architecture optimization. Two analytical applications based on near-infrared spectroscopy are presented: the first for the determination of nitrogen content in wheat leaves using multi-layer perceptron networks, and the second for the determination of Brix in sugar cane juices using radial basis function networks.
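As an illustration only (not the chemometric models of the paper, and with simulated data in place of real spectra), the following Python sketch trains a multi-layer perceptron regressor to predict a property from spectrum-like inputs, in the spirit of the NIR calibration described.

```python
# Illustrative MLP regression on simulated "spectra" (100 wavelengths per sample).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 100))                        # 200 simulated spectra
y = X[:, :10].sum(axis=1) + rng.normal(0, 0.1, 200)    # property depends on a few bands

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)                                  # calibrate on the training spectra
print(f"R^2 on held-out spectra: {model.score(X_te, y_te):.2f}")
```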