974 results for Software packages selection
Abstract:
Three pavement design software packages were compared with regard to how they determine design input parameters and how those parameters influence pavement thickness. StreetPave designs the concrete pavement thickness based on the PCA method and the equivalent asphalt pavement thickness. The WinPAS software designs both concrete and asphalt pavements following the AASHTO 1993 design method. The APAI software designs asphalt pavements based on pre-mechanistic/empirical AASHTO methodology. First, the following four critical design input parameters were identified: traffic, subgrade strength, reliability, and design life. A sensitivity analysis of these four design input parameters was performed using the three pavement design software packages to identify which input parameters require the most attention during pavement design. Based on the current pavement design procedures and the sensitivity analysis results, a prototype pavement design and sensitivity analysis (PD&SA) software package was developed to retrieve the pavement thickness design value for a given condition and allow a user to perform a pavement design sensitivity analysis. The prototype PD&SA software is a computer program that stores pavement design results in a database, designed so that the user can enter design data from a variety of design programs and query design results for given conditions. The prototype was developed to demonstrate the concept of retrieving pavement design results from the database for a design sensitivity analysis. This final report does not include the prototype software, which will be validated and tested during the next phase.
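As an illustration of the one-at-a-time sensitivity analysis this abstract describes, the following Python sketch sweeps each of the four input parameters around a baseline and records the resulting thickness. The function design_thickness(), its formula, and all numbers are hypothetical placeholders, not the PCA or AASHTO procedures.

```python
# One-at-a-time sensitivity analysis for pavement thickness (illustrative).
# design_thickness() is a hypothetical stand-in for a design program,
# NOT the PCA or AASHTO method; the point is the sweep structure.

def design_thickness(esals, cbr, reliability, design_life):
    """Toy stand-in for a pavement design program (thickness in inches)."""
    return (4.0 + 1.2 * (esals / 1e6) ** 0.3 - 0.08 * cbr
            + 0.02 * reliability + 0.05 * design_life)

baseline = {"esals": 5e6, "cbr": 6.0, "reliability": 90.0, "design_life": 20}
sweeps = {
    "esals": [1e6, 5e6, 2e7],           # traffic (assumed values)
    "cbr": [3.0, 6.0, 12.0],            # subgrade strength
    "reliability": [80.0, 90.0, 95.0],
    "design_life": [10, 20, 30],
}

for param, values in sweeps.items():
    for v in values:
        inputs = dict(baseline, **{param: v})   # vary one parameter at a time
        t = design_thickness(**inputs)
        print(f"{param} = {v}: thickness = {t:.2f} in")
```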
Abstract:
OBJECTIVES: The purpose of this study was to compare myocardial blood flow (MBF) and myocardial flow reserve (MFR) estimates from rubidium-82 positron emission tomography ((82)Rb PET) data using 10 software packages (SPs) based on 8 tracer kinetic models. BACKGROUND: It is unknown how well MBF and MFR values from existing SPs agree for (82)Rb PET. METHODS: Rest and stress (82)Rb PET scans of 48 patients with suspected or known coronary artery disease were analyzed in 10 centers. Each center used 1 of 10 SPs to analyze global and regional MBF using the different kinetic models implemented. Values were considered to agree if they simultaneously had an intraclass correlation coefficient >0.75 and a difference <20% of the median across all programs. RESULTS: The most commonly evaluated model was the Ottawa Heart Institute 1-tissue compartment model (OHI-1-TCM). MBF values from 7 of the 8 SPs implementing this model agreed best. Values from 2 other models (an alternative 1-TCM and an axially distributed model) also agreed well, with occasional differences. The MBF results from the other models (e.g., 2-TCM and retention) were less in agreement with values from OHI-1-TCM. CONCLUSIONS: SPs using the most common kinetic model (OHI-1-TCM) provided consistent results in measuring global and regional MBF values, suggesting that they may be used interchangeably to process data acquired with a common imaging protocol.
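The agreement criterion can be made concrete with a short sketch. The ICC estimator below is a simple one-way random-effects ICC(1,1), which may differ from the exact variant used in the study, and the MBF values are simulated placeholders rather than patient data.

```python
import numpy as np

def icc_oneway(X):
    """One-way random-effects ICC(1,1) for an (n subjects x k raters) array.
    A textbook estimator; the study's exact ICC variant may differ."""
    n, k = X.shape
    grand = X.mean()
    ms_between = k * ((X.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_within = ((X - X.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical MBF values (ml/min/g): rows = 48 patients, cols = 10 packages.
rng = np.random.default_rng(0)
true_mbf = rng.uniform(0.6, 3.0, size=(48, 1))
mbf = true_mbf + rng.normal(0.0, 0.1, size=(48, 10))

icc = icc_oneway(mbf)
median_all = np.median(mbf)
# One reading of "difference < 20% of the median": each package's mean
# must lie within 20% of the overall median of all programs.
max_dev = np.abs(mbf.mean(axis=0) - mbf.mean()).max()
agree = icc > 0.75 and max_dev < 0.20 * median_all
print(f"ICC = {icc:.3f}, max deviation = {max_dev:.3f}, agree = {agree}")
```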
Abstract:
Background: The repertoire of statistical methods for the descriptive analysis of the burden of a disease has been expanded and implemented in statistical software packages in recent years. The purpose of this paper is to present a web-based tool, REGSTATTOOLS http://regstattools.net, intended to provide analysis of the burden of cancer or of other disease registry data. Three software applications are included in REGSTATTOOLS: SART (analysis of disease rates and their time trends), RiskDiff (analysis of percent changes in the rates due to demographic factors and the risk of developing or dying from a disease), and WAERS (relative survival analysis). Results: We show a real-data application through the assessment of the burden of tobacco-related cancer incidence in two Spanish regions in the period 1995-2004. Using SART, we show that lung cancer is the most common of these cancers, with rising incidence trends among women. We compared 2000-2004 data with those of 1995-1999 to assess percent changes in the number of cases as well as relative survival, using RiskDiff and WAERS, respectively. We show that the net increase in lung cancer cases among women was mainly attributable to an increased risk of developing lung cancer, whereas in men it was attributable to the increase in population size. Among men, lung cancer relative survival was higher in 2000-2004 than in 1995-1999, whereas it was similar among women across these time periods. Conclusions: Unlike other similar applications, REGSTATTOOLS does not require local software installation; it is simple to use, fast, and easy to interpret. It is a set of web-based statistical tools intended for the automated calculation of population indicators that any professional in the health or social sciences may require.
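As a hint of the kind of population indicator such tools automate, here is a minimal sketch of a directly age-standardized incidence rate; all figures are invented for illustration and do not come from the study.

```python
# Directly age-standardized incidence rate (minimal sketch, invented data).
cases    = [12, 45, 130, 210]                      # incident cases per age group
person_y = [500_000, 400_000, 300_000, 150_000]    # person-years at risk
std_pop  = [0.40, 0.30, 0.20, 0.10]                # standard population weights (sum to 1)

# ASR = sum over age groups of (age-specific rate * standard weight)
asr = sum(w * c / p for c, p, w in zip(cases, person_y, std_pop)) * 100_000
print(f"Age-standardized rate: {asr:.1f} per 100,000 person-years")
```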
Abstract:
The aim of this work is to study the flow properties at a pipe T-junction, the pressure loss suffered by the flow after passing through the junction, and the reliability of the classical engineering formulas used to find the head loss for T-junctions of pipes. We compared results from CFD software packages with the classical formulas and attempted to determine the accuracy of those formulas. We studied the head loss in T-junctions with various inlet velocities, with junction angles slightly different from 90 degrees, and with different cross-sectional areas of the main pipe and the branch pipe. The flow at the T-junction was simulated with FLUENT and Comsol Multiphysics to observe the flow properties inside the junction and to study the head loss suffered by the fluid after passing through it. We also compared the pressure (head) losses obtained from the classical formulas of A. Vazsonyi and Andrew Gardel, from formulas obtained by treating the T-junction as a combination of other pipe components, and from the software experiments. A further purpose of this study is to examine the change in pressure loss with the angle of the T-junction. Using software, we can obtain a better view of the flow inside the junction and study turbulence, kinetic energy, pressure loss, etc. Such simulations save a great deal of time and can be performed without actually doing the experiment. No real-life experiments were made; the results rely entirely on the accuracy of the software and the numerical methods used.
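The classical formulas express the junction loss through a dimensionless coefficient applied to the velocity head. A minimal sketch of that generic relation follows; the coefficient K = 1.0 is an assumed placeholder, not Vazsonyi's or Gardel's actual expression, which depends on the junction angle, area ratio, and flow split.

```python
G = 9.81  # gravitational acceleration, m/s^2

def minor_head_loss(velocity_ms, k_loss):
    """Generic minor-loss relation h_L = K * v^2 / (2g), in metres of fluid.
    K for a T-junction branch depends on geometry, angle, and flow split;
    the classical formulas (Vazsonyi, Gardel) supply K. Here it is assumed."""
    return k_loss * velocity_ms ** 2 / (2.0 * G)

# Sweep inlet velocities for an assumed branch-loss coefficient K = 1.0.
for v in (0.5, 1.0, 2.0, 4.0):
    print(f"v = {v:4.1f} m/s -> head loss = {minor_head_loss(v, 1.0):.4f} m")
```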
Abstract:
The purpose of this work is to demonstrate the usefulness of low-cost, high-performance computers. Techniques and software packages used by computational chemists are presented. Access to high-performance computing power remains crucial for much of computational quantum chemistry. This work therefore introduces the concept of the PC cluster, an economical computing platform.
Abstract:
The development of new tools for chemoinformatics, allied to the use of different algorithms and computer programmes for the structure elucidation of organic compounds, is growing fast worldwide. Massive efforts in research and development are currently being pursued both by academia and by the so-called chemistry software development companies. The demystification of this environment, brought about by the availability of software packages and a vast array of publications, exerts a positive impact on chemistry. In this work, an overview of the more classical approaches as well as new strategies in computer-based tools for the structure elucidation of organic compounds is presented. The historical background is also taken into account, since these techniques began to be developed around four decades ago. Attention is paid to companies that develop, distribute, or commercialize software, as well as to web-based and open-access tools currently available to chemists.
Abstract:
The research focus of this thesis is to explore options for building systems for business-critical web applications. Business criticality here includes requirements for data protection and system availability. The focus is on open source software. The goals are to identify robust technologies and engineering practices for implementing such systems. The research methods include experiments with sample systems built around chosen software packages that represent certain technologies. The main research focused on finding a good method for database data replication, a key functionality for high-availability, database-driven web applications. The research also included finding engineering best practices from books written by administrators of high-traffic web applications. The database replication experiment showed that the block-level synchronous replication offered by the DRBD replication software provides considerably more robust data protection and high-availability functionality than the leading open source database product MySQL and its built-in asynchronous replication. For master-master database setups, block-level replication is the more advisable way to build high availability into the system. Based on this research, building high-availability web applications is possible using a combination of open source software and engineering best practices for data protection, availability planning, and scaling.
Abstract:
This master's thesis examines how network monitoring can be implemented in a distributed system. The thesis explores the differences between conventional information systems and distributed systems and the characteristics of each, and discusses what network monitoring is at a general level and how it is usually implemented in conventional information systems. It examines in more detail how network monitoring can be implemented efficiently in a distributed system and what requirements and challenges arise. Two monitoring software packages developed for distributed-system network monitoring and one hardware-based solution were also selected for closer study and comparison. The thesis assesses whether deploying such a system is cost-effective for the company and effective from a monitoring perspective. As its outcome, the thesis presents how network monitoring can be implemented in a distributed system and how the existing challenges can be solved. The implementation options were studied, and the best option (perfSONAR) was chosen for monitoring the customer network connections of the target organization. Finally, an implementation plan is presented for the network monitoring of the company's customer connections.
Abstract:
Among the challenges of pig farming in today's competitive market is the factor of product traceability, which ensures, among many points, animal welfare. Vocalization is a valuable tool for identifying situations of stress in pigs, and it can be used in welfare records for traceability. The objective of this work was to identify stress in piglets from their vocalizations, classifying the stress into three levels: no stress, moderate stress, and acute stress. An experiment was conducted on a commercial farm in the municipality of Holambra, São Paulo State, where the vocalizations of twenty piglets were recorded during the castration procedure, with the animals separated into two groups: castration without anesthesia and castration under local anesthesia with lidocaine. For the recording of the acoustic signals, a unidirectional microphone was connected to a digital recorder, and the signals were digitized at a sampling frequency of 44,100 Hz. The sound signals were evaluated with the Praat® software, and different data mining algorithms were applied using the Weka® software. Attribute selection improved model accuracy; the best attribute subset was obtained with the Wrapper method, while the best classification algorithms were k-NN and Naive Bayes. According to the results, it was possible to classify the level of stress in pigs through their vocalization.
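A sketch of the classification step follows, using scikit-learn's k-NN and Gaussian Naive Bayes as stand-ins for the Weka implementations named in the abstract; the feature matrix and labels are random placeholders for the acoustic attributes extracted with Praat (e.g. pitch or intensity), so the printed accuracies are meaningless.

```python
# Classification sketch: scikit-learn's k-NN and Naive Bayes stand in for
# the Weka algorithms; features/labels are synthetic placeholders.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(42)
X = rng.normal(size=(120, 5))        # 120 calls x 5 acoustic attributes (fake)
y = rng.integers(0, 3, size=120)     # 0 = no stress, 1 = moderate, 2 = acute

for name, clf in [("k-NN", KNeighborsClassifier(n_neighbors=3)),
                  ("Naive Bayes", GaussianNB())]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```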
Abstract:
The application of computational fluid dynamics (CFD) and finite element analysis (FEA) has been growing rapidly in various fields of science and technology. One of the areas of interest is biomedical engineering. Altered hemodynamics inside blood vessels plays a key role in the development of the arterial disease called atherosclerosis, which is the major cause of human death worldwide. Atherosclerosis is often treated with the stenting procedure to restore normal blood flow. A stent is a tubular, flexible structure, usually made of metal, which is driven into and expanded within the blocked artery. Despite the success rate of the stenting procedure, it is often associated with restenosis (re-narrowing of the artery). The presence of a non-biological device in the artery causes inflammation or re-growth of atherosclerotic lesions in the treated vessel. Several factors, including the design of the stent, the type of stent expansion, the expansion pressure, and the morphology and composition of the vessel wall, influence the restenosis process. The role of computational studies is therefore crucial in investigating and optimising the factors that influence post-stenting complications. This thesis focuses on the stent-vessel wall interactions, followed by the blood flow, in the post-stenting stage of a stenosed human coronary artery. Hemodynamic and mechanical stresses were analysed in three separate stent-plaque-artery models. The plaque was modelled as a multi-layer domain (fibrous cap (FC), necrotic core (NC), and fibrosis (F)) and the arterial wall as a single-layer domain. CFD/FEA simulations were performed using commercial software packages on several models mimicking the various stages and morphologies of atherosclerosis. The tissue prolapse (TP) of the stented vessel wall, the distribution of von Mises stress (VMS) inside the various layers of the vessel wall, and the wall shear stress (WSS) along the luminal surface of the deformed vessel wall were measured and evaluated. The results revealed the roles of the stenosis size, the thickness of each layer of the atherosclerotic wall, the thickness of the stent strut, the pressure applied for stenosis expansion, and the flow condition in the distribution of stresses. The thicknesses of the FC and NC and the total thickness of the plaque are critical in controlling the stresses inside the tissue. A small change in the morphology of the artery wall can significantly affect the distribution of stresses. In particular, the FC is the layer most sensitive to TP and stresses, which could determine the plaque's vulnerability to rupture. The WSS is highly influenced by the deflection of the artery, which in turn depends on the structural composition of the arterial wall layers. Together with the stenosis size, these factors could be decisive in controlling the low WSS values (<0.5 Pa) prone to restenosis. Moreover, the time-dependent flow altered the percentage of luminal area with WSS values below 0.5 Pa at different time instants. The non-Newtonian viscosity model of the blood properties significantly affects the prediction of WSS magnitude. The outcomes of this investigation will help to better understand the roles of the individual layers of atherosclerotic vessels and their risk of provoking restenosis at the post-stenting stage. Consequently, implementing such an approach to assess post-stenting stresses will assist engineers and clinicians in optimising stenting techniques to minimise the occurrence of restenosis.
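One of the reported post-processing steps, the percentage of luminal area with WSS below 0.5 Pa, reduces to an area-weighted threshold count. In the sketch below the per-face WSS values and face areas are synthetic placeholders; in practice they would be exported from the CFD solver.

```python
import numpy as np

# Post-processing sketch: fraction of luminal surface area with WSS < 0.5 Pa,
# the threshold the thesis associates with restenosis risk. Per-face WSS
# values and face areas are synthetic stand-ins for CFD solver output.
rng = np.random.default_rng(1)
wss  = rng.lognormal(mean=0.0, sigma=0.6, size=10_000)   # WSS per face, Pa
area = rng.uniform(0.8, 1.2, size=10_000)                # face areas, mm^2

low = wss < 0.5
pct_low = 100.0 * area[low].sum() / area.sum()           # area-weighted share
print(f"Luminal area with WSS < 0.5 Pa: {pct_low:.1f}%")
```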
Abstract:
The shift from print to digital information has had a high impact on all components of the academic library system in India, especially the users, services, and staff. Though information is considered an important resource, the use of ICT tools to collect and disseminate information has proceeded at a slow pace in the majority of university libraries. This may be due to various factors such as insufficient funds, inadequate staff trained in handling computers and software packages, administrative concerns, etc. In Kerala, automation has been initiated in almost all university libraries using library automation software and is at different stages of completion. Few studies have been conducted on the effects of information communication technologies on the professional activities of library professionals in the universities of Kerala. It is important to evaluate whether progress in ICT has had any impact on the library profession in these highest educational institutions. The aim of the study is to assess whether developments in information communication technologies have any influence on library professionals' professional development and their need for further education and training, and to evaluate their skills in handling developments in ICT. The total population of the study is 252, comprising the permanently employed professional library staff in the central libraries and departmental libraries on the main campuses of the universities under study; this is almost a census of the defined population. The questionnaire method was adopted for data collection, supplemented by interviews with librarians to gather additional information. Library professionals have a positive approach towards ICT applications and services in libraries, but the majority do not have opportunities to develop their skills and competencies in their work environment. To develop competitive personnel in a technologically advanced world, high priority must be given by university administrators and library associations to developing competence in ICT applications, library management, and soft skills among library professionals. Library science schools and teaching departments across the country have to take significant steps to revise the library science curriculum and incorporate significant changes to meet the demands and challenges of the library science profession.
Abstract:
Thermally activated building systems are components that, as part of the room-enclosing surfaces, can be supplied with a heating or cooling medium through an integrated pipe system and thus enable the heating or cooling of the room. By this understanding, the variety of constructions ranges from heated or chilled ceilings, through floor slabs with core-integrated pipes, to underfloor heating systems. The extremely sluggish systems among them are deliberately employed to decouple energy supply and room energy demand in time, in the interest of rational energy use, e.g. active cooling of the component at night and passive cooling of the room via the cool component during the day. Building and plant concepts that include slow-reacting thermally activated building systems require, in a competent and responsible planning process, the use of modern building simulation tools in order to make well-founded statements about comfort and energy demand. Within these tools, the thermally activated building systems are represented by calculation components that are based on mathematical-physical models and serve to solve the multi-dimensional transient heat conduction problem inherent in the component. Until now, two fundamentally different solution approaches were available for this, both originating from physical modelling and imposing limits on the geometry that can be represented or on computation speed. The present work documents a new approach, referred to as experimental modelling. By means of system identification, the parameters of a compact black-box model can be determined from experimentally obtained data series; the model reproduces the input-output behaviour of the associated, arbitrarily constructed thermally activated component with sufficient accuracy. The measurement data series can be generated by highly accurate calculations that, owing to their level of detail, would be unsuitable for direct use in building simulation. The application of system identification to the two-dimensional heat conduction problem, and the proof of its suitability, is carried out on six very different constructions of thermally activated building systems and confirms very small temperature and energy balance errors. Comparisons between black-box models determined via system identification and physical models for two floor constructions show that the former can also serve as a reference for accuracy assessments. The practicality of the new modelling approach is demonstrated in case studies involving year-round simulations of an exemplary office room under variations of component construction and operation. For this purpose, the black-box model is integrated into the commercial building and plant simulation program CARNOT. The acceptable computation times for a single-zone building model, combined with the high accuracy, attest to the suitability of the new modelling approach.
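A minimal sketch of the system-identification idea follows: fitting a discrete-time black-box model (here a first-order ARX structure) to input-output series by least squares. The thesis's actual model structure, order, and data differ; everything below, including the "measured" plant, is a toy illustration.

```python
import numpy as np

# Minimal system-identification sketch: fit a first-order ARX model
#   y[k] = a*y[k-1] + b*u[k-1]
# to input/output series by ordinary least squares. The thesis identifies
# compact black-box models of thermally activated components from highly
# detailed simulations; its model structure and order may differ.
rng = np.random.default_rng(7)
u = rng.normal(size=500)                  # input, e.g. supply temperature
y = np.zeros(500)
for k in range(1, 500):                   # "measured" plant: a = 0.9, b = 0.1
    y[k] = 0.9 * y[k - 1] + 0.1 * u[k - 1] + rng.normal(0, 0.01)

Phi = np.column_stack([y[:-1], u[:-1]])   # regressor matrix
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print(f"identified a = {theta[0]:.3f}, b = {theta[1]:.3f}")
```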
Abstract:
This document provides guidelines for fish stock assessment and fishery management using the software tools and other outputs developed by the United Kingdom's Department for International Development's Fisheries Management Science Programme (FMSP) from 1992 to 2004. It explains some key elements of the precautionary approach to fisheries management and outlines a range of alternative stock assessment approaches that can provide the information needed for such precautionary management. Four FMSP software tools, LFDA (Length Frequency Data Analysis), CEDA (Catch Effort Data Analysis), YIELD and ParFish (Participatory Fisheries Stock Assessment), are described with which intermediary parameters, performance indicators and reference points may be estimated. The document also contains examples of the assessment and management of multispecies fisheries, the use of Bayesian methodologies, the use of empirical modelling approaches for estimating yields and in analysing fishery systems, and the assessment and management of inland fisheries. It also provides a comparison of length- and age-based stock assessment methods. A CD-ROM with the FMSP software packages CEDA, LFDA, YIELD and ParFish is included.
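Catch-effort assessments of the kind CEDA supports commonly rest on a biomass dynamics model. The sketch below iterates the Schaefer surplus-production form with invented parameters and catches; it is illustrative only and is not the FMSP implementation.

```python
# Schaefer surplus-production model, the form underlying many catch-effort
# assessments:  B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]
# Parameters and catches below are invented for illustration.
r, K = 0.4, 10_000.0           # intrinsic growth rate, carrying capacity (t)
B = 8_000.0                    # starting biomass (t)
catches = [600, 800, 1_000, 1_200, 1_000]   # annual catches (t)

for t, C in enumerate(catches):
    B = max(B + r * B * (1 - B / K) - C, 0.0)
    print(f"year {t + 1}: biomass = {B:,.0f} t")

# A classic reference point for this model: maximum sustainable yield.
print(f"MSY = r*K/4 = {r * K / 4:,.0f} t/yr")
```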
Abstract:
This article illustrates that not all statistical software packages correctly calculate the p-value for the classical F test comparison of two independent Normal variances. This is illustrated with a simple example, and the reasons are discussed. Eight different software packages are considered.
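The pitfall the article points to is easy to state in code: a two-sided p-value for the variance-ratio F test should double the smaller of the two tail probabilities (capped at 1), not simply double one fixed tail. A sketch using scipy follows; the doubling rule shown is one common convention for two-sided p-values from asymmetric distributions.

```python
import numpy as np
from scipy import stats

def f_test_two_sided(x, y):
    """Classical F test for equality of two Normal variances.
    Two-sided p-value via 2*min(lower tail, upper tail), capped at 1."""
    f = np.var(x, ddof=1) / np.var(y, ddof=1)
    df1, df2 = len(x) - 1, len(y) - 1
    p = 2.0 * min(stats.f.cdf(f, df1, df2), stats.f.sf(f, df1, df2))
    return f, min(p, 1.0)

rng = np.random.default_rng(3)
x = rng.normal(0, 1.0, size=25)
y = rng.normal(0, 1.5, size=30)
f, p = f_test_two_sided(x, y)
print(f"F = {f:.3f}, two-sided p = {p:.4f}")
```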
Abstract:
This paper considers methods for testing for superiority or non-inferiority in active-control trials with binary data, when the relative treatment effect is expressed as an odds ratio. Three asymptotic tests for the log-odds ratio based on the unconditional binary likelihood are presented, namely the likelihood ratio, Wald and score tests. All three tests can be implemented straightforwardly in standard statistical software packages, as can the corresponding confidence intervals. Simulations indicate that the three alternatives are similar in terms of the Type I error, with values close to the nominal level. However, when the non-inferiority margin becomes large, the score test slightly exceeds the nominal level. In general, the highest power is obtained from the score test, although all three tests are similar and the observed differences in power are not of practical importance.
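Of the three tests, the Wald version has the most direct closed form and illustrates how such a non-inferiority test is set up; the counts and margin below are invented, and the likelihood ratio and score tests would additionally require a constrained fit of the binomial likelihood.

```python
import math
from scipy import stats

# Sketch of the Wald test for non-inferiority on the log-odds-ratio scale.
# Counts and the non-inferiority margin are invented for illustration.
a, b = 78, 22        # test treatment:  successes, failures
c, d = 80, 20        # active control:  successes, failures

log_or = math.log((a * d) / (b * c))            # sample log odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)           # its large-sample SE
margin = math.log(0.5)                          # margin on the log-OR scale

z = (log_or - margin) / se                      # H0: log OR <= log(0.5)
p = stats.norm.sf(z)                            # one-sided p-value
ci = (math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se))
print(f"OR = {math.exp(log_or):.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), "
      f"z = {z:.2f}, one-sided p = {p:.4f}")
```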