974 results for Software packages selection


Relevance:

30.00%

Publisher:

Abstract:

Functional brain imaging techniques such as functional MRI (fMRI), which allow the in vivo investigation of the human brain, have been increasingly employed to address the neurophysiological substrates of emotional processing. Despite the growing number of fMRI studies in the field, these imaging studies, taken individually, yield contrasting findings and are unable to characterize definitively the neural networks underlying each specific emotional condition. Differences between imaging packages, as well as between the statistical approaches used for image processing and analysis, probably contribute to this heterogeneity of findings. In particular, it is unclear to what extent the observed neurofunctional response of the brain cortex during emotional processing depends on the fMRI package used in the analysis. In this pilot study, we performed a double analysis of an fMRI dataset using emotional faces. The Statistical Parametric Mapping (SPM) version 2.6 (Wellcome Department of Cognitive Neurology, London, UK) and XBAM 3.4 (Brain Imaging Analysis Unit, Institute of Psychiatry, King's College London, UK) programs, which use parametric and non-parametric analysis, respectively, were used to assess our results. Both packages revealed that processing of emotional faces was associated with increased activation in the brain's visual areas (occipital, fusiform and lingual gyri), the cerebellum, the parietal cortex, the cingulate cortex (anterior and posterior cingulate), and the dorsolateral and ventrolateral prefrontal cortex. However, a blood oxygenation level-dependent (BOLD) response in the temporal regions, insula and putamen was evident in the XBAM analysis but not in the SPM analysis. Overall, SPM and XBAM analyses revealed comparable whole-group brain responses.
Further studies are needed to explore the between-group compatibility of the different imaging packages in other cognitive and emotional processing domains. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

A data warehouse is a data repository which collects and maintains a large amount of data from multiple distributed, autonomous and possibly heterogeneous data sources. Often the data is stored in the form of materialized views in order to provide fast access to the integrated data. One of the most important decisions in designing a data warehouse is the selection of views for materialization. The objective is to select an appropriate set of views that minimizes the total query response time with the constraint that the total maintenance time for these materialized views is within a given bound. This view selection problem is totally different from the view selection problem under the disk space constraint. In this paper the view selection problem under the maintenance time constraint is investigated. Two efficient, heuristic algorithms for the problem are proposed. The key to devising the proposed algorithms is to define good heuristic functions and to reduce the problem to some well-solved optimization problems. As a result, an approximate solution of the known optimization problem will give a feasible solution of the original problem. (C) 2001 Elsevier Science B.V. All rights reserved.
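The benefit-per-maintenance-cost idea behind such heuristics can be sketched as a greedy, knapsack-style selection. This is not the paper's algorithm; the candidate views, their costs and the ranking criterion below are illustrative assumptions only.

```python
# Greedy sketch of view selection under a maintenance-time constraint.
# Candidate names, times and the benefit/cost ranking are hypothetical.

def select_views(candidates, maintenance_bound):
    """candidates: list of (name, query_time_saving, maintenance_time)."""
    # Rank views by query-time saving per unit of maintenance time.
    ranked = sorted(candidates, key=lambda v: v[1] / v[2], reverse=True)
    chosen, used = [], 0.0
    for name, saving, mtime in ranked:
        # Take a view only if its maintenance time still fits the bound.
        if used + mtime <= maintenance_bound:
            chosen.append(name)
            used += mtime
    return chosen, used

views = [("v_sales", 120, 30), ("v_stock", 45, 25), ("v_hr", 10, 20)]
print(select_views(views, maintenance_bound=50))
```

A greedy pass like this gives a feasible (not necessarily optimal) solution, which matches the paper's framing of reducing the problem to well-solved optimization problems whose approximate solutions remain feasible for the original one.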

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: The correct identification of the underlying cause of death and its precise assignment to a code from the International Classification of Diseases are important for achieving accurate and universally comparable mortality statistics. These factors, among others, led to the development of computer software programs to identify the underlying cause of death automatically. OBJECTIVE: This work was conceived to compare the underlying causes of death processed, respectively, by the Automated Classification of Medical Entities (ACME) and the "Sistema de Seleção de Causa Básica de Morte" (SCB) programs. MATERIAL AND METHOD: The comparative evaluation was performed using the input data file for the ACME system, which included deaths that occurred in the State of S. Paulo from June to December 1993, totalling 129,104 records of the corresponding death certificates. The differences between the underlying causes selected by the ACME and SCB systems in the month of June, when considered SCB errors, were used to correct and improve the SCB processing logic and its decision tables. RESULTS: Processing of the underlying causes of death by the ACME and SCB systems resulted in 3,278 differences, which were analysed and ascribed to unanswered dialogue boxes during processing; to deaths due to human immunodeficiency virus [HIV] disease, for which there was no specific provision in either system; to coding and/or keying errors; and to actual problems. Detailed analysis of the latter disclosed that the majority of the underlying causes of death processed by the SCB system were correct, that the two systems interpreted some mortality coding rules differently, that some particular problems could not be explained with the available documentation, and that a smaller proportion of problems were identified as SCB errors.
CONCLUSION: These results, disclosing a very low and insignificant number of actual problems, support the use of this version of the SCB system for the Ninth Revision of the International Classification of Diseases and assure the continuity of the work being undertaken for the Tenth Revision version.

Relevance:

30.00%

Publisher:

Abstract:

Copyright © 2013 Springer Netherlands.

Relevance:

30.00%

Publisher:

Abstract:

27th Annual Conference of the European Cetacean Society. Setúbal, Portugal, 8-10 April 2013.

Relevance:

30.00%

Publisher:

Abstract:

Remote Laboratories or WebLabs constitute a first-order didactic resource in engineering faculties. However, in many cases, they lack a proper software design, both in the client and server side, which degrades their quality and academic usefulness. This paper presents the main characteristics of a Remote Laboratory, analyzes the software technologies to implement the client and server sides in a WebLab, and correlates these technologies with the characteristics to facilitate the selection of a technology to implement a WebLab. The results obtained suggest the adoption of a Service Oriented Laboratory Architecture-based approach for the design of future Remote Laboratories so that client-agnostic Remote Laboratories and Remote Laboratory composition are enabled. The experience with the real Remote Laboratory, WebLab-Deusto, is also presented.
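The client-agnostic idea can be illustrated with a minimal sketch: every laboratory exposes one uniform, message-based contract, so any client that speaks the contract can drive any lab. All names below are hypothetical; this is not the WebLab-Deusto API.

```python
# Sketch of a client-agnostic Remote Laboratory interface, in the spirit of a
# Service Oriented Laboratory Architecture. Class and field names are made up.
import json
from abc import ABC, abstractmethod

class RemoteLab(ABC):
    """Uniform contract: clients exchange opaque JSON strings with any lab."""
    @abstractmethod
    def send_command(self, command: str) -> str: ...

class VoltmeterLab(RemoteLab):
    """One concrete lab; clients never depend on its internals."""
    def send_command(self, command: str) -> str:
        request = json.loads(command)
        if request.get("action") == "read":
            return json.dumps({"status": "ok", "volts": 4.98})
        return json.dumps({"status": "error", "reason": "unknown action"})

# Any client that speaks the generic contract can drive this lab.
lab: RemoteLab = VoltmeterLab()
reply = json.loads(lab.send_command(json.dumps({"action": "read"})))
print(reply["status"], reply["volts"])
```

Because the client only depends on the JSON contract, labs can be swapped or composed behind the same interface, which is the enabling property the paper's recommendation aims at.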

Relevance:

30.00%

Publisher:

Abstract:

STRIPPING is a software application developed for the automatic design of a random packing column in which the transfer of volatile organic compounds (VOCs) from water to air can be performed, and for simulating its steady-state behaviour. The software eliminates the need for experimental work in the selection of the column diameter and allows an a priori choice of the most convenient hydraulic regime for this type of operation. It also allows the operator to choose the model used for the calculation of some parameters, namely between the Eckert/Robbins and Billet models for estimating the pressure drop of the gaseous phase, and between the Billet and Onda/Djebbar models for mass transfer. Illustrations of the graphical interface are presented.
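The diameter-selection step described above can be sketched in outline: compute the liquid-to-gas flow parameter, then size the column from a chosen fraction of the flooding gas flux. This is purely illustrative and not the STRIPPING implementation; the flooding flux is taken as a given input here rather than evaluated from the Eckert/Robbins or Billet correlations, and all numbers are made up. Do not use this for design.

```python
# Illustrative packed-column sizing sketch; correlation lookups are stubbed.
import math

def flow_parameter(L, G, rho_g, rho_l):
    """F_LV = (L/G) * sqrt(rho_g / rho_l), with L and G as mass flow rates."""
    return (L / G) * math.sqrt(rho_g / rho_l)

def column_diameter(G, gas_flux_flood, fraction_of_flooding=0.7):
    """Diameter (m) from the operating gas mass flux (kg/m2.s), taken as a
    chosen fraction of a flooding flux supplied by the caller."""
    flux = fraction_of_flooding * gas_flux_flood
    area = G / flux                      # required cross-sectional area, m2
    return math.sqrt(4.0 * area / math.pi)

F = flow_parameter(L=2.0, G=1.0, rho_g=1.2, rho_l=998.0)
D = column_diameter(G=1.0, gas_flux_flood=2.5)
print(round(F, 4), round(D, 3))
```

Choosing the operating point as a fraction of flooding is what fixes the hydraulic regime a priori, which is the design freedom the abstract highlights.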

Relevance:

30.00%

Publisher:

Abstract:

Materials selection is a matter of great importance in engineering design, and software tools are valuable for informing decisions in the early stages of product development. However, when a set of alternative materials is available for the different parts a product is made of, choosing the optimal material mix for a group of parts is not trivial, so the engineer/designer typically proceeds part by part. Optimizing each part on its own can lead to a globally sub-optimal solution from the product point of view. An optimization procedure is therefore needed that deals with products with multiple parts, each with discrete design variables, and that can determine the optimal solution under different objectives. To solve this multiobjective optimization problem, a new routine based on the Direct MultiSearch (DMS) algorithm was created. The resulting Pareto front can help the designer align the materials selection for the complete set of parts with the product attribute objectives, depending on the relative importance of each objective.
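The dominance test used to read a Pareto front can be sketched briefly. This is not the Direct MultiSearch (DMS) algorithm itself, only the standard non-dominance filter applied to candidate material mixes; the materials and objective values below are hypothetical.

```python
# Minimal Pareto-front filter for discrete material mixes (minimization).

def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """solutions: dict mapping a design label to its objective tuple."""
    return {k: v for k, v in solutions.items()
            if not any(dominates(w, v) for w in solutions.values())}

# Hypothetical (mass_kg, cost_eur) for three material mixes of a two-part product.
mixes = {"steel+steel": (4.0, 10.0),
         "alu+steel":   (3.0, 14.0),
         "alu+alu":     (3.5, 16.0)}
print(sorted(pareto_front(mixes)))
```

Mixes surviving the filter are the trade-off candidates among which the designer weighs the relative importance of each objective.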

Relevance:

30.00%

Publisher:

Abstract:

Software product lines (SPL) are diverse systems that are developed using a dual engineering process: (a) family engineering defines the commonality and variability among all members of the SPL, and (b) application engineering derives specific products based on the common foundation combined with a variable selection of features. The number of derivable products in an SPL can thus be exponential in the number of features. This inherent complexity poses two main challenges when it comes to modelling: firstly, the formalism used for modelling SPLs needs to be modular and scalable; secondly, it should ensure that all products behave correctly by providing the ability to analyse and verify complex models efficiently. In this paper we propose to integrate an established modelling formalism (Petri nets) with the domain of software product line engineering. To this end we extend Petri nets to Feature Nets. While Petri nets provide a framework for formally modelling and verifying single software systems, Feature Nets offer the same sort of benefits for software product lines. We show how SPLs can be modelled in an incremental, modular fashion using Feature Nets, provide a Feature Nets variant that supports modelling dynamic SPLs, and propose an analysis method for SPLs modelled as Feature Nets. By facilitating the construction of a single model that includes the various behaviours exhibited by the products in an SPL, we make a significant step towards efficient and practical quality assurance methods for software product lines.
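The core idea of guarding net transitions with feature expressions can be illustrated with a toy sketch: a Petri net whose transitions fire only when their feature guard is satisfied by the product's feature selection, so one model covers every product. This is a deliberately simplified illustration; the Feature Nets formalism defined in the paper is richer than this.

```python
# Toy feature-guarded Petri net; guards here are plain feature sets.

class FeatureNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = []                 # (name, inputs, outputs, guard)

    def add_transition(self, name, inputs, outputs, guard):
        """guard: set of features that must all be selected in the product."""
        self.transitions.append((name, inputs, outputs, guard))

    def _lookup(self, name):
        return next(t for t in self.transitions if t[0] == name)

    def enabled(self, name, features):
        _, ins, _, guard = self._lookup(name)
        return guard <= features and all(self.marking.get(p, 0) > 0 for p in ins)

    def fire(self, name, features):
        _, ins, outs, _ = self._lookup(name)
        assert self.enabled(name, features), f"{name} not enabled"
        for p in ins:
            self.marking[p] -= 1
        for p in outs:
            self.marking[p] = self.marking.get(p, 0) + 1

net = FeatureNet({"idle": 1})
net.add_transition("brew_espresso", ["idle"], ["busy"], guard={"Espresso"})
print(net.enabled("brew_espresso", {"Espresso"}),   # product with the feature
      net.enabled("brew_espresso", set()))          # product without it
```

Restricting the guard check to different feature selections projects the single model onto the behaviour of each individual product, which is what makes family-wide analysis possible.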

Relevance:

30.00%

Publisher:

Abstract:

Recently, there has been a growing interest in the field of metabolomics, materialized by a remarkable growth in experimental techniques, available data and related biological applications. Indeed, techniques such as Nuclear Magnetic Resonance, Gas or Liquid Chromatography, Mass Spectrometry, and Infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks such as biological and biomedical discovery, biotechnology and drug development. However, as with other omics data, the analysis of metabolomics datasets poses multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, none of the available software tools addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful, yet simple to use, environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines.

Relevance:

30.00%

Publisher:

Abstract:

This paper develops stochastic search variable selection (SSVS) for zero-inflated count models which are commonly used in health economics. This allows for either model averaging or model selection in situations with many potential regressors. The proposed techniques are applied to a data set from Germany considering the demand for health care. A package for the free statistical software environment R is provided.
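The zero-inflated Poisson (ZIP) likelihood that variable-selection machinery of this kind is wrapped around can be written out in a few lines. This sketch shows only the standard ZIP model, not the paper's SSVS sampler or its R package; the count data below are invented.

```python
# Standard zero-inflated Poisson: zeros arise either from an inflation
# component (probability pi) or from the Poisson(lam) component.
import math

def zip_pmf(y, lam, pi):
    """P(Y = y) under a ZIP(lam, pi) model."""
    poisson = math.exp(-lam) * lam ** y / math.factorial(y)
    if y == 0:
        return pi + (1.0 - pi) * poisson
    return (1.0 - pi) * poisson

def zip_loglik(data, lam, pi):
    """Log-likelihood of i.i.d. counts; regression would set lam and pi
    from covariates, which is where variable selection enters."""
    return sum(math.log(zip_pmf(y, lam, pi)) for y in data)

# Hypothetical doctor-visit counts with excess zeros.
visits = [0, 0, 0, 1, 2, 0, 4, 0]
print(round(zip_loglik(visits, lam=1.5, pi=0.4), 3))
```

In the regression setting, lam and pi each depend on their own set of covariates, so there are two blocks of potential regressors over which averaging or selection is performed.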

Relevance:

30.00%

Publisher:

Abstract:

The fundamental idea of the project is to create software that allows packages to be installed over the network on several machines registered in the database. It comprises two modules, one for the server application and one for the client application.

Relevance:

30.00%

Publisher:

Abstract:

Four-lane undivided roadways in urban areas can experience a degradation of service and/or safety as traffic volumes increase. In fact, the existence of turning vehicles on this type of roadway has a dramatic effect on both of these factors. The solution identified for these problems is typically the addition of a raised median or two-way left-turn lane (TWLTL). The mobility and safety benefits of these actions have been proven and are discussed in the "Past Research" chapter of this report, along with some general cross section selection guidelines. The cost and right-of-way impacts of these actions are widely accepted. These guidelines focus on the evaluation and analysis of an alternative to the typical four-lane undivided cross section improvement approach described above. It has been found that the conversion of a four-lane undivided cross section to three lanes (i.e., one lane in each direction and a TWLTL) can improve safety and maintain an acceptable level of service. These guidelines summarize the results of past research in this area (which is almost nonexistent) and qualitative/quantitative before-and-after safety and operational impacts of case study conversions located throughout the United States and Iowa. Past research confirms that this type of conversion is acceptable or feasible in some situations but for the most part fails to specifically identify those situations. In general, the reviewed case study conversions resulted in a reduction of average or 85th percentile speeds (typically less than five miles per hour), a relatively dramatic reduction in excessive speeding (a 60 to 70 percent reduction in the number of vehicles traveling five miles per hour faster than the posted speed limit was measured in two cases), and a reduction in total crashes (reductions of 17 to 62 percent were measured).
The 13 roadway conversions considered had average daily traffic volumes of 8,400 to 14,000 vehicles per day (vpd) in Iowa and 9,200 to 24,000 vehicles per day elsewhere. In addition to past research and case study results, a simulation sensitivity analysis was completed to investigate and/or confirm the operational impacts of a four-lane undivided to three-lane conversion. First, the advantages and disadvantages of different corridor simulation packages were identified for this type of analysis. Then, the CORridor SIMulation (CORSIM) software was used to investigate and evaluate several characteristics related to the operational feasibility of a four-lane undivided to three-lane conversion. Simulated speed and level of service results for both cross sections were documented for different total peak-hour traffic, access densities, and access-point left-turn volumes (for a case study corridor defined by the researchers). These analyses assisted with the identification of the considerations for the operational feasibility determination of a four-lane to three-lane conversion. The results of the simulation analyses primarily confirmed the case study impacts. The CORSIM results indicated only a slight decrease in average arterial speed for through vehicles can be expected for a large range of peak-hour volumes, access densities, and access-point left-turn volumes (given the assumptions and design of the corridor case study evaluated). Typically, the reduction in the simulated average arterial speed (which includes both segment and signal delay) was between zero and four miles per hour when a roadway was converted from a four-lane undivided to a three-lane cross section. The simulated arterial level of service for a converted roadway, however, showed a decrease when the bi-directional peak-hour volume was about 1,750 vehicles per hour (or 17,500 vehicles per day if 10 percent of the daily volume is assumed to occur in the peak hour).
Past research by others, however, indicates that 12,000 vehicles per day may be the operational capacity (i.e., level of service E) of a three-lane roadway due to vehicle platooning. The simulation results, along with past research and case study results, appear to support the following volume-related feasibility suggestions for four-lane undivided to three-lane cross section conversions. It is recommended that a four-lane undivided to three-lane conversion be considered a feasible (with respect to volume only) option when bi-directional peak-hour volumes are less than 1,500 vehicles per hour, but that some caution be exercised when the roadway has a bi-directional peak-hour volume between 1,500 and 1,750 vehicles per hour. At and above 1,750 vehicles per hour, the simulation indicated a reduction in arterial level of service. Therefore, at least in Iowa, the feasibility of a four-lane undivided to three-lane conversion should be questioned and/or considered much more closely when a roadway has (or is expected to have) a peak-hour volume of more than 1,750 vehicles. Assuming that 10 percent of the daily traffic occurs during the peak hour, these volume recommendations would correspond to 15,000 and 17,500 vehicles per day, respectively. These suggestions, however, are based on the results from one idealized case study corridor analysis. Individual operational analyses and/or simulations should be completed in detail once a four-lane undivided to three-lane cross section conversion is considered feasible (based on the general suggestions above) for a particular corridor. All of the simulations completed as part of this project also incorporated the optimization of signal timing to minimize vehicle delay along the corridor. A number of feasibility determination factors were identified from a review of the past research, before-and-after case study results, and the simulation sensitivity analysis.
The existing and expected (i.e., design period) statuses of these factors are described and should be considered. The characteristics of these factors should be compared to each other, to the impacts of other potentially feasible cross section improvements, and to the goals/objectives of the community. The factors discussed in these guidelines include
• roadway function and environment
• overall traffic volume and level of service
• turning volumes and patterns
• frequent-stop and slow-moving vehicles
• weaving, speed, and queues
• crash type and patterns
• pedestrian and bike activity
• right-of-way availability, cost, and acquisition impacts
• general characteristics, including
- parallel roadways
- offset minor street intersections
- parallel parking
- corner radii
- at-grade railroad crossings
The characteristics of these factors are documented in these guidelines, and their relationship to four-lane undivided to three-lane cross section conversion feasibility is identified. This information is summarized, along with some evaluative questions, in this executive summary and Appendix C. In summary, the results of past research, numerous case studies, and the simulation analyses done as part of this project support the conclusion that in certain circumstances a four-lane undivided to three-lane conversion can be a feasible alternative for the mitigation of operational and/or safety concerns. This feasibility, however, must be determined by an evaluation of the factors identified in these guidelines (along with any others that may be relevant for an individual corridor). The expected benefits, costs, and overall impacts of a four-lane undivided to three-lane conversion should then be compared to the impacts of other feasible alternatives (e.g., adding a raised median) at a particular location.
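The arithmetic linking the peak-hour thresholds to daily volumes can be sketched as a small helper. The 1,500/1,750 vehicles-per-hour breakpoints and the 10 percent peak-hour factor come from the guidelines above; the function name and category labels are illustrative.

```python
# Volume-only feasibility screen for a 4-lane-to-3-lane conversion,
# using the report's thresholds and an assumed peak-hour (K) factor.

def conversion_feasibility(adt, k_factor=0.10):
    """Classify by bi-directional peak-hour volume, estimated as
    k_factor * average daily traffic (ADT)."""
    peak_hour = adt * k_factor
    if peak_hour < 1500:
        return "feasible (volume only)"
    if peak_hour < 1750:
        return "feasible with caution"
    return "question/closely evaluate"

for adt in (14000, 16500, 18000):
    print(adt, conversion_feasibility(adt))
```

As the guidelines stress, this screen addresses volume only; the remaining factors listed above still govern overall feasibility for a specific corridor.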

Relevance:

30.00%

Publisher:

Abstract:

Frequently the choice of a library management program is conditioned by social, economic and/or political factors that result in the selection of a system that is not altogether suitable for the library's needs, characteristics and functions. Open source software is quickly becoming a preferred solution, owing to the freedom to copy, modify and distribute it, the freedom from contracts, and the greater opportunities for interoperability with other applications. These new trends regarding open source software in libraries are also reflected in LIS studies, as evidenced by the different courses addressing automation programs and repository management, including the GNU/Linux operating system, among others. The combination of the needs of the centres and the new trends in open source software is the focus of a virtual laboratory for the use of open source software in library applications. It was the result of a project, whose aim was to make a useful contribution to the library community, carried out by a group of professors of the School of Library and Information Science of the University of Barcelona, together with a group of students who are members of a Working Group on Open Source Software for Information Professionals of the Professional Library Association of Catalonia.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study was to determine the effect of using video analysis software on the interrater reliability of visual assessments of gait videos in children with cerebral palsy. Two clinicians viewed the same random selection of 20 sagittal and frontal video recordings of 12 children with cerebral palsy routinely acquired during outpatient rehabilitation clinics. Both observers rated these videos in a random sequence for each lower limb using the Observational Gait Scale, once with standard video software and once with video analysis software (Dartfish®), which can perform angle and timing measurements. The video analysis software improved interrater agreement, measured by weighted Cohen's kappa, for the total score (κ 0.778→0.809) and for all of the items that required angle and/or timing measurements (knee position mid-stance κ 0.344→0.591; hindfoot position mid-stance κ 0.160→0.346; foot contact mid-stance κ 0.700→0.854; timing of heel rise κ 0.769→0.835). The use of video analysis software is an efficient approach to improving the reliability of visual video assessments.
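The weighted Cohen's kappa used above has a standard definition that can be computed in a few lines. The rating sequences below are made up for illustration and do not come from the study's videos.

```python
# Weighted Cohen's kappa: 1 - (weighted observed disagreement) /
# (weighted chance-expected disagreement). power=2 gives quadratic
# weights, power=1 linear weights.
from itertools import product

def weighted_kappa(r1, r2, n_levels, power=2):
    """r1, r2: integer ratings in 0..n_levels-1 from two raters."""
    obs = [[0.0] * n_levels for _ in range(n_levels)]
    for a, b in zip(r1, r2):
        obs[a][b] += 1.0 / len(r1)
    p1 = [sum(row) for row in obs]                       # rater 1 marginals
    p2 = [sum(obs[i][j] for i in range(n_levels))        # rater 2 marginals
          for j in range(n_levels)]
    num = den = 0.0
    for i, j in product(range(n_levels), repeat=2):
        w = (abs(i - j) / (n_levels - 1)) ** power       # disagreement weight
        num += w * obs[i][j]
        den += w * p1[i] * p2[j]
    return 1.0 - num / den

print(weighted_kappa([0, 1, 2, 0], [0, 1, 2, 0], n_levels=3))   # perfect agreement
```

Because the weights grow with the distance between ratings, near-misses on angle and timing items are penalized less than gross disagreements, which is why measurement support in the software shows up directly in these coefficients.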