974 results for Software packages selection


Relevance:

80.00%

Publisher:

Abstract:

Once the preserve of university academics and research laboratories with high-powered and expensive computers, the power of sophisticated mathematical fire models has now arrived on the desktop of the fire safety engineer. It is a revolution made possible by parallel advances in PC technology and fire modelling software. But while the tools have proliferated, there has not been a corresponding transfer of knowledge and understanding of the discipline from expert to general user. It is a serious shortfall of which the lack of suitable engineering courses dealing with the subject is symptomatic, if not the cause. The computational vehicles to run the models and an understanding of fire dynamics are not enough to exploit these sophisticated tools. Too often, they become 'black boxes' producing magic answers in exciting three-dimensional colour graphics and client-satisfying 'virtual reality' imagery. As well as a fundamental understanding of the physics and chemistry of fire, the fire safety engineer must have at least a rudimentary understanding of the theoretical basis supporting fire models to appreciate their limitations and capabilities. The five-day short course, "Principles and Practice of Fire Modelling", run by the University of Greenwich, attempts to bridge the divide between the expert and the general user, providing the latter with the expertise needed to understand the results of mathematical fire modelling. The course and associated textbook, "Mathematical Modelling of Fire Phenomena", are aimed at students and professionals with wide and varied backgrounds; they offer a friendly guide through the unfamiliar terrain of mathematical modelling. These concepts and techniques are introduced and demonstrated in seminars. Those attending also gain experience in using the methods during 'hands-on' tutorial and workshop sessions.
On completion of this short course, those participating should:
- be familiar with the concepts of zone and field modelling;
- be familiar with zone and field model assumptions;
- have an understanding of the capabilities and limitations of modelling software packages for zone and field modelling;
- be able to select and use the most appropriate mathematical software and demonstrate its use in compartment fire applications; and
- be able to interpret model predictions.
The result is that the fire safety engineer is empowered to realise the full value of mathematical models to help predict fire development and to determine the consequences of fire under a variety of conditions. This in turn enables him or her to design and implement safety measures which can potentially control, or at the very least reduce, the impact of fire.
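The single-zone idea at the heart of zone modelling can be illustrated with a deliberately minimal sketch. This is illustrative only, and not taken from any package taught on the course: real zone models add plume entrainment, radiation and boundary heat losses, all of which are omitted here, and the function name and parameters are invented for the example.

```python
def hot_layer_temperature(q_kw, mass_kg, cp_kj_per_kg_k, t0_c, dt_s, steps):
    """Euler integration of a single-zone energy balance:
    dT/dt = Q / (m * cp), ignoring all losses and entrainment."""
    t = t0_c
    for _ in range(steps):
        t += (q_kw / (mass_kg * cp_kj_per_kg_k)) * dt_s
    return t

# A 500 kW fire heating a 100 kg air layer (cp ~ 1.0 kJ/kg.K) for 10 s:
print(hot_layer_temperature(500.0, 100.0, 1.0, 20.0, 1.0, 10))  # 70.0
```

Even this toy balance shows why model assumptions matter: the predicted layer temperature depends entirely on what the modeller chooses to neglect.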

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND: Several analysis software packages for myocardial blood flow (MBF) quantification from cardiac PET studies exist, but they have not been compared using concordance analysis, which can characterize precision and bias separately. Reproducible measurements are needed for quantification to fully develop its clinical potential. METHODS: Fifty-one patients underwent dynamic Rb-82 PET at rest and during adenosine stress. Data were processed with PMOD and FlowQuant (Lortie model). MBF and myocardial flow reserve (MFR) polar maps were quantified and analyzed using a 17-segment model. Comparisons used Pearson's correlation ρ (measuring precision), Bland-Altman limits of agreement, and Lin's concordance correlation ρc = ρ·C_b (with C_b measuring systematic bias). RESULTS: Lin's concordance and Pearson's correlation values were very similar, suggesting no systematic bias between software packages, with excellent precision for MBF (ρ = 0.97, ρc = 0.96, C_b = 0.99) and good precision for MFR (ρ = 0.83, ρc = 0.76, C_b = 0.92). On a per-segment basis, no mean bias was observed on Bland-Altman plots, although PMOD provided slightly higher values than FlowQuant at higher MBF and MFR values (P < .0001). CONCLUSIONS: Concordance between software packages was excellent for MBF and MFR, despite higher values from PMOD at higher MBF values. Both software packages can be used interchangeably for quantification in the daily practice of Rb-82 cardiac PET.
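The concordance statistic used above is straightforward to compute. The following is a minimal sketch of Lin's concordance correlation (not the PMOD or FlowQuant implementation), using the standard population-moment formula ρc = 2·cov(x, y) / (var(x) + var(y) + (x̄ − ȳ)²) and recovering C_b as ρc/ρ:

```python
import math

def lin_ccc(x, y):
    """Lin's concordance correlation: rho_c = 2*cov / (var_x + var_y + (mx - my)^2).
    Returns (pearson_rho, rho_c, c_b), where c_b = rho_c / pearson_rho."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    rho = cov / math.sqrt(vx * vy)
    rho_c = 2 * cov / (vx + vy + (mx - my) ** 2)
    return rho, rho_c, rho_c / rho

# Identical measurements agree perfectly:
print(lin_ccc([1, 2, 3, 4], [1, 2, 3, 4]))  # (1.0, 1.0, 1.0)
```

A constant offset between the two methods leaves Pearson's ρ at 1.0 but pulls ρc (and hence C_b) below 1.0, which is exactly why concordance analysis can separate precision from bias.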

Relevance:

80.00%

Publisher:

Abstract:

Starting in December 1982 the University of Nottingham decided to phototypeset almost all of its examination papers `in house' using the troff, tbl and eqn programs running under UNIX. This tutorial lecture highlights the features of the three programs with particular reference to their strengths and weaknesses in a production environment. The following issues are particularly addressed:
Standards -- all three software packages require the embedding of commands and the invocation of pre-written macros, rather than `what you see is what you get'. This can help to enforce standards, in the absence of traditional compositor skills.
Hardware and Software -- the requirements are analysed for an inexpensive preview facility and a low-level interface to the phototypesetter.
Mathematical and Technical papers -- the fine-tuning of eqn to impose a standard house style.
Staff skills and training -- systems of this kind do not require the operators to have had previous experience of phototypesetting. Of much greater importance is willingness and flexibility in learning how to use computer systems.
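The embedded-command style referred to above can be illustrated with a generic eqn fragment (a standard textbook example of the notation, not taken from the Nottingham house style): the author writes a linear description of the formula between `.EQ`/`.EN` delimiters, and eqn translates it into troff typesetting commands.

```troff
.EQ
x = {- b +- sqrt { b sup 2 - 4ac }} over { 2a }
.EN
```

Because the markup is plain text, a departmental macro package can redefine keywords and spacing once and have every examination paper conform to the same house style.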

Relevance:

80.00%

Publisher:

Abstract:

Introduction Prediction of soft tissue changes following orthognathic surgery has been attempted frequently in recent decades. It has gradually progressed from the classic "cut and paste" of photographs to computer-assisted 2D surgical prediction planning; finally, comprehensive 3D surgical planning was introduced to help surgeons and patients decide on the magnitude and direction of surgical movements, as well as the type of surgery to be considered for the correction of facial dysmorphology. A wealth of experience has been gained, and a substantial body of published literature is available, which has augmented knowledge of facial soft tissue behaviour and helped to improve the ability to closely simulate facial changes following orthognathic surgery. This was particularly noticeable following the introduction of three-dimensional imaging into medical research and clinical applications. Several approaches have been considered to mathematically predict soft tissue changes in three dimensions following orthognathic surgery; the most common are the finite element model and the mass tensor model. These have been developed into software packages which are currently used in clinical practice. In general, these methods produce an acceptable level of prediction accuracy of soft tissue changes following orthognathic surgery. Studies, however, have shown limited prediction accuracy at specific regions of the face, in particular the areas around the lips. Aims The aim of this project is to conduct a comprehensive assessment of hard and soft tissue changes following orthognathic surgery and to introduce a new method for the prediction of facial soft tissue changes. Methodology The study was carried out on the pre- and post-operative CBCT images of 100 patients who received orthognathic surgery treatment at Glasgow Dental Hospital and School, Glasgow, UK.
Three groups of patients were included in the analysis: patients who underwent Le Fort I maxillary advancement surgery, bilateral sagittal split mandibular advancement surgery, or bimaxillary advancement surgery. A generic facial mesh was used to standardise the information obtained from each patient's facial image, and principal component analysis (PCA) was applied to interpolate the correlations between the skeletal surgical displacement and the resultant soft tissue changes. The identified relationship between hard tissue and soft tissue was then applied to a new set of preoperative 3D facial images, and the predicted results were compared to the actual surgical changes measured from the post-operative 3D facial images. A set of validation studies was conducted, including:
• A comparison between voxel-based registration and surface registration for analysing changes following orthognathic surgery. The results showed no statistically significant difference between the two methods. Voxel-based registration, however, proved more reliable, as it preserved the link between the soft tissue and skeletal structures of the face during the image registration process. Accordingly, voxel-based registration was the method of choice for superimposition of the pre- and post-operative images. The result of this study was published in a refereed journal.
• Direct DICOM slice landmarking: a novel technique to quantify the direction and magnitude of skeletal surgical movements. This method represents a new approach to quantifying maxillary and mandibular surgical displacement in three dimensions. The technique involves measuring the distance of corresponding landmarks, digitised directly on DICOM image slices, in relation to three-dimensional reference planes. The accuracy of the measurements was assessed against a set of "gold standard" measurements extracted from simulated model surgery. The results confirmed the accuracy of the method to within 0.34 mm; the method was therefore applied in this study. The results of this validation were published in a peer-refereed journal.
• The use of a generic mesh to assess soft tissue changes using stereophotogrammetry. The generic facial mesh played a major role in the soft tissue dense correspondence analysis. The conformed generic mesh represented the geometrical information of the individual facial mesh onto which it was conformed (elastically deformed); the accuracy of generic mesh conformation is therefore essential to guarantee an accurate replica of the individual facial characteristics. The results showed an acceptable overall mean conformation error of 1 mm. The results of this study were accepted for publication in a peer-refereed scientific journal.
Skeletal tissue analysis was performed using the validated direct DICOM slice landmarking method, while soft tissue analysis was performed using dense correspondence analysis. The soft tissue analysis was novel and produced a comprehensive description of facial changes in response to orthognathic surgery; the results were accepted for publication in a refereed scientific journal. The main soft tissue changes associated with Le Fort I surgery were advancement at the midface region combined with widening of the paranasal, upper lip and nostril regions; minor changes were noticed at the tip of the nose and the oral commissures. The main soft tissue changes associated with mandibular advancement surgery were advancement and downward displacement of the chin and lower lip regions, limited widening of the lower lip, and slight eversion of the lower lip vermilion, combined with minimal backward displacement of the upper lip; minimal changes were observed at the oral commissures. The main soft tissue changes associated with bimaxillary advancement surgery were generalised advancement of the middle and lower thirds of the face combined with widening of the paranasal, upper lip and nostril regions.
In the Le Fort I cases, the correlation between the changes of the facial soft tissue and the skeletal surgical movements was assessed using PCA. A statistical method known as 'leave-one-out cross-validation' was applied to the 30 cases which had undergone the Le Fort I osteotomy procedure, to utilise the data effectively for the prediction algorithm. The prediction of soft tissue changes showed a mean error ranging from 0.0006 mm (±0.582 mm) at the nose region to -0.0316 mm (±2.1996 mm) across the various facial regions.
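The leave-one-out procedure itself can be sketched generically. The sketch below is a simplified one-dimensional stand-in for the study's PCA-based hard-to-soft-tissue mapping (a plain least-squares line instead of PCA, and invented variable names): each case is held out in turn, the model is fitted on the remaining cases, and the held-out case is predicted.

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for y ~ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def loo_errors(xs, ys):
    """Leave-one-out: fit on all-but-one case, predict the held-out case,
    and record the prediction error for each case."""
    errs = []
    for i in range(len(xs)):
        tx, ty = xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:]
        a, b = fit_line(tx, ty)
        errs.append(ys[i] - (a * xs[i] + b))
    return errs

# Skeletal advancement (mm) vs. soft tissue response (mm), made-up values:
print(loo_errors([2.0, 4.0, 6.0, 8.0], [1.2, 2.4, 3.6, 4.8]))  # errors ~ 0
```

Because every case is predicted by a model that never saw it, the resulting error statistics (like the mean errors quoted above) estimate how the method would perform on a new patient.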

Relevance:

80.00%

Publisher:

Abstract:

The health of people living with HIV and AIDS (PLWHA) is nutritionally challenged in many nations of the world. The scourge has reduced socio-economic progress globally, and more so in sub-Saharan Africa (SSA), where its impact has been compounded by poverty and food insecurity. Good nutrition with proper drug use improves the quality of life of those infected, but it is not known how PLWHA in developing nations, exposed to chronic malnutrition and food shortages, adjust their nutrition when using anti-retroviral drugs (ARVs). This study assessed nutritional status, dietary practices, dietary management of common illnesses that hinder daily food intake, and use of ARVs with the food recommendations provided by health care givers. A descriptive case study design was used to sample 120 HIV-infected patients using a systematic sampling procedure. These patients sought health care from the AMREF clinic in the Kibera urban slum. Data were collected by anthropometric measurements, biochemical analysis, a semi-structured questionnaire and secondary data. The Statistical Package for the Social Sciences (SPSS) and the Nutri-Survey software packages were used to analyse the data. Dietary intakes of micronutrients were inadequate for >70% of the patients when compared to the recommended daily requirements. When body mass indices (BMI) were used, only 6.7% of the respondents were underweight (BMI < 18.5 kg/m²) and 9.2% were overweight (BMI > 25 kg/m²), whereas serum albumin test results (mean 3.34 ± 0.06 g/dl) showed that 60.8% of the respondents were protein deficient, which was confirmed by low dietary protein intakes. BMI was not related to dietary nutrient intakes, serum albumin or CD4 cell counts (p > 0.05). There was no significant difference in BMI across categories of CD4 cell count (p > 0.05), suggesting that the level of immunity did not affect weight gain with ARVs, as observed in many studies from developed countries.
Malnutrition was therefore evident among the 60.8% of cases identified by serum albumin tests, and food intake was inadequate for 68% of the patients, who ate only once a day due to lack of food. National food and nutrition policy should incorporate food-security-boosting guidelines for poor people infected with HIV and using ARVs.
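The BMI cut-offs cited above can be expressed directly as a small classifier (a minimal sketch; the thresholds are the ones quoted in the abstract, and the function name is illustrative):

```python
def bmi_category(weight_kg, height_m):
    """Classify nutritional status by BMI = weight / height^2,
    using the abstract's cut-offs (<18.5 underweight, >25 overweight)."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    if bmi > 25.0:
        return "overweight"
    return "normal"

print(bmi_category(50.0, 1.70))  # 50/2.89 ~ 17.3 -> underweight
```

As the study's albumin findings show, a "normal" BMI alone can mask protein deficiency, which is why the biochemical markers were assessed alongside anthropometry.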

Relevance:

40.00%

Publisher:

Abstract:

The "born global" phenomenon refers to firms that regard the global market as their natural context and that begin their internationalisation process very soon after founding. Traditional theories such as the Uppsala model cannot explain this process, so other theories have emerged, such as the network perspective. Some studies exist in this area, mainly conducted in developed countries with small markets and open economies; few, however, have been carried out in developing economies. Moreover, research on the entry-mode choice and market selection of born-global firms is quite limited. Consequently, this study aims to describe the main factors that influence the entry-mode choice and market selection of born-global firms from developing economies. The research focuses on the software industry, and a multiple case study was conducted with three companies in Ecuador. The methodology included interviews with founders as well as the collection of secondary data. Based on the empirical evidence, the main factors influencing the choice of entry mode were found to be financial constraints, expected revenues, speed of internationalisation, niche markets, and the founders' prior business experience. Market selection, in turn, is influenced by similarities of language and culture, niche markets, and network relationships.

Relevance:

40.00%

Publisher:

Abstract:

This work aims to implement an intelligent computational tool to identify non-technical losses and to select their most relevant features, using information from a database of industrial consumer profiles belonging to a power company. The solution to this problem is neither trivial nor merely regional in character: minimising non-technical losses helps guarantee investment in product quality and in the maintenance of power systems, in the competitive environment introduced after the period of privatisation on the national scene. This work uses the WEKA software for the proposed objective, comparing various classification techniques and optimisation through intelligent algorithms; in this way, applications on Smart Grids can be automated. © 2012 IEEE.
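The classification task behind non-technical loss detection can be illustrated with a minimal sketch (a plain 1-nearest-neighbour classifier on toy consumption profiles; this is a generic illustration, not the WEKA techniques or the paper's data):

```python
def one_nn(train, labels, sample):
    """1-nearest-neighbour: label a consumer profile with the label of its
    closest training profile (squared Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(range(len(train)), key=lambda i: dist(train[i], sample))
    return labels[best]

# Toy monthly-consumption profiles: 'fraud' profiles show a sudden drop.
train = [[100, 98, 97], [95, 99, 96], [100, 40, 12], [90, 35, 10]]
labels = ["regular", "regular", "fraud", "fraud"]
print(one_nn(train, labels, [98, 45, 15]))  # fraud
```

Feature selection, the other half of the paper's objective, would decide which columns of such profiles actually carry the fraud signal before a classifier like this is trained.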

Relevance:

40.00%

Publisher:

Abstract:

When a firm decides to implement ERP software, the resulting consequences can pervade all levels, including organization, process, control and available information. Therefore, the first decision to be made is which ERP solution to adopt from a wide range of offers and vendors. To this end, this paper describes a methodology, based on multi-criteria factors that directly affect the process, to help managers make this decision. This methodology has been applied to a medium-size company in the Spanish metal transformation sector which is interested in updating its IT capabilities in order to obtain greater control of, and better information about, its business, thus achieving a competitive advantage. The paper proposes a decision matrix which takes into account all critical factors in ERP selection.
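The mechanics of such a decision matrix can be sketched as a weighted sum (a generic multi-criteria scoring sketch; the criteria, weights and ratings below are hypothetical, not the paper's actual factors):

```python
def weighted_scores(weights, matrix):
    """Weighted-sum decision matrix: score each alternative (row) as
    sum(weight * rating) over the criteria (columns)."""
    return [sum(w * r for w, r in zip(weights, row)) for row in matrix]

# Hypothetical criteria: cost, functionality, vendor support (weights sum to 1).
weights = [0.5, 0.3, 0.2]
erp_ratings = [
    [7, 9, 6],  # ERP A
    [8, 6, 9],  # ERP B
]
print(weighted_scores(weights, erp_ratings))  # scores ~ [7.4, 7.6]; B ranks highest
```

Real ERP selection methodologies add steps around this core, such as eliciting the weights from stakeholders and checking the ranking's sensitivity to them, but the final ranking usually reduces to a computation of this shape.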

Relevance:

40.00%

Publisher:

Abstract:

Developers of interactive software are confronted by an increasing variety of software tools to help engineer the interactive aspects of software applications. Not only do these tools fall into different categories in terms of functionality, but within each category there is a growing number of competing tools with similar, although not identical, features. Choice of user interface development tool (UIDT) is therefore becoming increasingly complex.

Relevance:

40.00%

Publisher:

Abstract:

Due to dynamic variability, identifying the specific conditions under which non-functional requirements (NFRs) are satisfied may only be possible at runtime. It is therefore necessary to consider the dynamic treatment of relevant information in the requirements specification. The associated data can be gathered by monitoring the execution of the application and its underlying environment, to support reasoning about how well the current application configuration is fulfilling the established requirements. This paper presents a dynamic decision-making infrastructure to support both NFR representation and monitoring, and to reason about the degree of satisfaction of NFRs at runtime. The infrastructure is composed of: (i) an extended feature model aligned with a domain-specific language for representing the NFRs to be monitored at runtime; (ii) a monitoring infrastructure to continuously assess NFRs at runtime; and (iii) a flexible decision-making process to select the best available configuration based on the satisfaction degree of the NFRs. The evaluation of the approach has shown that it is able to choose application configurations that fit user NFRs well, based on runtime information. The evaluation also revealed that the proposed infrastructure provides consistent indicators regarding the best application configurations for user NFRs. Finally, a benefit of our approach is that it allows the level of satisfaction of the NFR specification to be quantified.
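The selection step in (iii) can be sketched as picking the configuration with the highest weighted satisfaction degree (a generic sketch, not the paper's actual decision process; the configuration names, NFRs and degrees are invented):

```python
def best_configuration(configs, weights):
    """Pick the configuration whose weighted NFR satisfaction degree is highest.
    configs: {name: {nfr: degree in [0, 1]}}; weights: {nfr: importance}."""
    def score(sat):
        return sum(weights[nfr] * sat.get(nfr, 0.0) for nfr in weights)
    return max(configs, key=lambda name: score(configs[name]))

# Hypothetical runtime monitoring readings for two configurations:
configs = {
    "low_power": {"performance": 0.4, "battery": 0.9},
    "high_perf": {"performance": 0.9, "battery": 0.3},
}
weights = {"performance": 0.7, "battery": 0.3}
print(best_configuration(configs, weights))  # high_perf
```

Feeding fresh monitoring readings into the satisfaction degrees and re-running the selection is what makes such a decision-making loop "dynamic": the winning configuration can change as the environment does.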

Relevance:

40.00%

Publisher:

Abstract:

SELECTOR is a software package for studying the evolution of multiallelic genes under balancing or positive selection while simulating complex evolutionary scenarios that integrate demographic growth and migration in a spatially explicit population framework. Parameters can be varied both in space and time to account for geographical, environmental, and cultural heterogeneity. SELECTOR can be used within an approximate Bayesian computation estimation framework. We first describe the principles of SELECTOR and validate the algorithms by comparing its outputs for simple models with theoretical expectations. Then, we show how it can be used to investigate genetic differentiation of loci under balancing selection in interconnected demes with spatially heterogeneous gene flow. We identify situations in which balancing selection reduces genetic differentiation between population groups compared with neutrality and explain conflicting outcomes observed for human leukocyte antigen loci. These results and three previously published applications demonstrate that SELECTOR is efficient and robust for building insight into human settlement history and evolution.
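The balancing selection that SELECTOR simulates can be illustrated with the textbook heterozygote-advantage update from population genetics (a generic sketch of the standard deterministic recursion, not SELECTOR's actual spatially explicit algorithm):

```python
def next_allele_freq(p, w_aa, w_ab, w_bb):
    """One generation of viability selection on a diallelic locus:
    p' = p * (p*w_aa + q*w_ab) / w_bar, with q = 1 - p."""
    q = 1.0 - p
    w_bar = p * p * w_aa + 2 * p * q * w_ab + q * q * w_bb
    return p * (p * w_aa + q * w_ab) / w_bar

# Heterozygote advantage (w_ab highest) pushes the frequency toward a
# balanced equilibrium instead of fixation:
p = 0.1
for _ in range(200):
    p = next_allele_freq(p, 0.9, 1.0, 0.9)
print(round(p, 3))  # ~ 0.5 (symmetric homozygote costs -> equilibrium at 0.5)
```

This maintenance of intermediate frequencies is what makes loci under balancing selection behave so differently from neutral loci when gene flow and demography are layered on top, which is the regime SELECTOR is designed to explore.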

Relevance:

40.00%

Publisher:

Abstract:

This proposal shows that ACO systems can be applied to problems of requirements selection in incremental software development, with the aim of obtaining better results than those produced by expert judgment alone. The evaluation of the ACO systems should be carried out through a comparative analysis against greedy and simulated annealing algorithms, performing experiments on a set of problem instances.
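One of the baselines mentioned above, a greedy heuristic, can be sketched for the requirements selection problem (a knapsack-style formulation with invented requirement names; the ACO and simulated annealing competitors would search the same solution space less myopically):

```python
def greedy_select(requirements, budget):
    """Greedy baseline for requirements selection: take requirements in
    order of cost/value ratio until the effort budget is exhausted.
    requirements: list of (name, value, cost) tuples."""
    chosen, spent = [], 0.0
    for name, value, cost in sorted(requirements, key=lambda r: r[2] / r[1]):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen

reqs = [("login", 9, 3), ("reports", 5, 5), ("search", 8, 4), ("themes", 2, 4)]
print(greedy_select(reqs, 8))  # ['login', 'search']
```

The greedy choice is fast but can miss better combinations, which is precisely the gap that metaheuristics such as ACO and simulated annealing are evaluated against.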

Relevance:

30.00%

Publisher:

Abstract:

Background: Feature selection is a pattern recognition approach for choosing important variables according to some criterion, in order to distinguish or explain certain phenomena (i.e., for dimensionality reduction). Many genomic and proteomic applications rely on feature selection to answer questions such as selecting signature genes which are informative about some biological state, e.g., normal tissue versus several types of cancer, or inferring a prediction network among elements such as genes, proteins and external stimuli. In these applications, a recurrent problem is the lack of samples with which to perform an adequate estimate of the joint probabilities between element states. A myriad of feature selection algorithms and criterion functions have been proposed, although it is difficult to point to the best solution for each application. Results: The intent of this work is to provide an open-source multiplatform graphical environment for bioinformatics problems, which supports many feature selection algorithms, criterion functions and graphic visualization tools such as scatterplots, parallel coordinates and graphs. A feature selection approach for growing genetic networks from seed genes (targets or predictors) is also implemented in the system. Conclusion: The proposed feature selection environment allows data analysis using several algorithms, criterion functions and graphic visualization tools. Our experiments have shown the software's effectiveness on two distinct types of biological problems. Moreover, the environment can be used in different pattern recognition applications, although its main focus is bioinformatics tasks.
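A criterion function of the kind such environments support can be sketched with a simple filter-style score (a generic illustration, not one of the system's actual criteria): each feature is scored by how far apart the two class means are relative to the feature's spread, and features are ranked by that score.

```python
def rank_features(class_a, class_b):
    """Filter-style feature ranking: score each feature by the absolute
    difference of class means divided by the pooled spread, then rank.
    class_a, class_b: lists of samples, each sample a list of feature values."""
    def column(rows, j):
        return [row[j] for row in rows]
    def spread(vals):
        m = sum(vals) / len(vals)
        return (sum((v - m) ** 2 for v in vals) / len(vals)) ** 0.5
    scores = []
    for j in range(len(class_a[0])):
        a, b = column(class_a, j), column(class_b, j)
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        s = spread(a + b) or 1.0  # guard against zero spread
        scores.append((abs(ma - mb) / s, j))
    return [j for _, j in sorted(scores, reverse=True)]

# Feature 1 separates the classes; feature 0 is noise.
normal = [[5, 1.0], [4, 1.1], [6, 0.9]]
tumour = [[5, 3.0], [6, 3.1], [4, 2.9]]
print(rank_features(normal, tumour))  # [1, 0]
```

Univariate filters like this are cheap but blind to feature interactions, which is one reason such environments offer a menu of algorithms and criterion functions rather than a single method.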

Relevance:

30.00%

Publisher:

Abstract:

In this and a preceding paper, we provide an introduction to the Fujitsu VPP range of vector-parallel supercomputers and to some of the computational chemistry software available for the VPP. Here, we consider the implementation and performance of seven popular chemistry application packages. The codes discussed range from classical molecular dynamics to semiempirical and ab initio quantum chemistry. All have evolved from sequential codes, and have typically been parallelised using a replicated data approach. As such they are well suited to the large-memory/fast-processor architecture of the VPP. For one code, CASTEP, a distributed-memory data-driven parallelisation scheme is presented. (C) 2000 Published by Elsevier Science B.V. All rights reserved.