974 results for "Software packages selection"
Abstract:
The objective of this study was to develop a model for wind-tunnel testing at high angles of attack and to validate its most critical components by analyzing the results of finite element simulations. During the project, the structure was subjected to the major loads identified under flight conditions and, from these, the stresses were calculated in critical regions, defined as the parts of the model with the highest failure probabilities. Load application methods, mesh refinement, and stress analysis were all taken into account in this approach. The analysis software was selected according to project needs, seeking the greatest ease of modeling and simulation. We opted for ANSYS®, since the entire project is being developed on CAD platforms, enabling straightforward integration between the modeling and analysis software.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
A mail survey was conducted to assess current computer hardware use and the perceived needs of potential users for software related to crop pest management in Nebraska. Surveys were sent to University of Nebraska-Lincoln agricultural extension agents, agribusiness personnel (including independent crop consultants), and crop producers identified by extension agents as computer users. There were no differences between the groups in several aspects of computer hardware use (percentage of computer use, percentage of IBM-compatible computers, amount of RAM, percentage with a hard drive, hard drive size, or monitor graphics capability). Responses were similar among the three groups in several areas important to crop pest management (pest identification, pest biology, treatment decision making, control options, and pesticide selection), and a majority of each group expressed the need for additional sources of such information about insects, diseases, and weeds. However, agents mentioned vertebrate pest management information as a need more often than the other two groups did. Majorities of each group also expressed an interest in using computer software, if available, to obtain information in these areas. Appropriate software addressing these needs should find an audience among all three groups.
Abstract:
A data set from a commercial Nellore beef cattle selection program was used to compare breeding models that did or did not include marker effects in estimating breeding values when a reduced number of animals have phenotypic, genotypic, and pedigree information available. The herd's complete data set comprised 83,404 animals measured for weaning weight (WW), post-weaning gain (PWG), scrotal circumference (SC), and muscle score (MS), corresponding to 116,652 animals in the relationship matrix. Single-trait analyses were performed with the MTDFREML software to estimate fixed- and random-effect solutions from the complete data. The estimated additive effects were taken as the reference breeding values for those animals. The observed phenotype of each trait for each individual was adjusted for the fixed- and random-effect solutions, except for the direct additive effects. The adjusted phenotype, composed of the additive and residual parts of the observed phenotype, was used as the dependent variable for model comparison. Among all measured animals in this herd, only 3160 were genotyped, for 106 SNP markers. Three models were compared in terms of changes in the animals' ranking, global fit, and predictive ability. Model 1 included only polygenic effects, model 2 included only marker effects, and model 3 included both polygenic and marker effects. Bayesian inference via Markov chain Monte Carlo methods, performed with the TM software, was used to analyze the data for model comparison. Two different priors were adopted for the marker effects in models 2 and 3: the first was a uniform distribution (U), and the second assumed that the marker effects were normally distributed (N). Higher rank correlation coefficients were observed for models 3_U and 3_N, indicating greater similarity between these models' rankings and the ranking based on the reference breeding values. Model 3_N showed the best global fit, as demonstrated by its low DIC.
The best models in terms of predictive ability were models 1 and 3_N. Differences due to the prior assumed for the marker effects in models 2 and 3 can be attributed to the normal prior's better handling of collinear effects. Models 2_U and 2_N performed worst, indicating that this small set of markers should not be used to genetically evaluate animals with no data, since its predictive ability is limited. In conclusion, model 3_N showed a slight superiority when a reduced number of animals have phenotypic, genotypic, and pedigree information. This can be attributed to the variation captured jointly by the marker and polygenic effects, and to the normal prior assumed for the marker effects, which deals better with the collinearity between markers. (C) 2012 Elsevier B.V. All rights reserved.
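The ranking comparison described above comes down to a rank correlation between each model's estimated breeding values and the reference values from the full-data analysis. A minimal sketch (the breeding values below are invented for illustration, and ties are ignored for simplicity):

```python
import numpy as np

def spearman_rank_corr(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Hypothetical breeding values: reference (full-data) vs. a candidate model.
reference = np.array([1.2, 0.8, -0.3, 2.1, 0.0, -1.1])
candidate = np.array([1.0, 0.9, -0.2, 1.8, 0.1, -0.9])
rho = spearman_rank_corr(reference, candidate)
```

A coefficient near 1 means the candidate model reorders the animals very little relative to the reference evaluation, which is the sense in which models 3_U and 3_N were judged closest to the full-data ranking.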
Abstract:
This study aimed at enumerating molds (heat-labile and heat-resistant) on the surface of paperboard material to be filled with tomato pulp through an aseptic system, and at determining the most heat- and hydrogen peroxide-resistant strains. A total of 118 samples of laminated paperboard were collected before filling, 68 before and 50 after the hydrogen peroxide bath. Seven molds, including heat-resistant strains (Penicillium variotii and Talaromyces flavus), with counts ranging between 0.71 and 1.02 CFU/cm², were isolated. P. variotii was more resistant to hydrogen peroxide than T. flavus and was inactivated after heating at 85 °C for 15 min. When exposed to 35% hydrogen peroxide at 25 °C, T. flavus (F5E2) and N. fischeri (control) were less resistant than P. variotii (F1A1). P. citrinum (F7E2) was shown to be as resistant as P. variotii. The D values (the time required to cause a one-log-cycle reduction in a microbial population at a given temperature) for 4-month-old spores of P. variotii (F1A1) and N. fischeri (control) at 85 and 90 °C were 3.9 and 4.5 min, respectively. Although contamination of the packages was low, the presence of heat- and chemical-resistant molds may be of concern for package sterility and product stability during shelf life. To our knowledge, this is the first report focusing on the isolation of molds, including heat-resistant ones, contaminating paperboard packaging material, and on estimating their resistance to the chemical and physical processes used for packaging sterilization.
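The D value used above has a simple operational form: the heating time divided by the number of log10 reductions achieved in the surviving population. A minimal sketch with invented survivor counts (the 3.9 and 4.5 min figures come from the study itself):

```python
import math

def d_value(t_min, n0, nt):
    """Decimal reduction time: minutes at a fixed temperature needed
    for a one-log10 drop in the surviving spore population."""
    return t_min / math.log10(n0 / nt)

# Hypothetical example: 10^5 spores reduced to 10^3 after 8 min of heating,
# i.e. two log cycles in 8 min, so D = 4.0 min.
D = d_value(8.0, 1e5, 1e3)
```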
Abstract:
Photodynamic therapy (PDT) is a treatment modality that has advanced rapidly in recent years. It causes tissue and vascular damage through the interaction of a photosensitizing agent (PS), light of a proper wavelength, and molecular oxygen. Evaluation of vessel damage usually relies on histopathology, and results are often qualitative or at best semi-quantitative, based on subjective scoring. The aim of this study was to evaluate, using CD31 immunohistochemistry and image analysis software, the vascular damage after PDT in a well-established rodent model of chemically induced mammary tumor. Fourteen Sprague-Dawley rats received a single dose of 7,12-dimethylbenz(a)anthracene (80 mg/kg by gavage). Treatment efficacy was evaluated by comparing the vascular density of tumors after treatment with Photogem® as a PS, administered intraperitoneally, followed by interstitial fiber-optic illumination from a diode laser at 200 mW/cm and a light dose of 100 J/cm directed at the tumor (7 animals), with a control group (6 animals, no PDT). The animals were euthanized 30 hours after illumination, the mammary tumors were removed, and samples from each lesion were formalin-fixed. Immunostained blood vessels were quantified with Image-Pro Plus version 7.0. The control group had an average of 3368.6 ± 4027.1 stained pixels per image and the treated group an average of 779 ± 1242.6 (P < 0.01), indicating that PDT caused a significant decrease in the vascular density of mammary tumors. CD31 immunohistochemistry, combined with selection of representative areas by a trained pathologist and quantification of staining with Image-Pro Plus version 7.0, was a practical and robust methodology for vessel damage evaluation, and could probably be used to assess other antiangiogenic treatments.
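The pixel-count readout used above amounts to thresholding the stained image and counting the pixels that exceed the threshold. The sketch below uses NumPy as a stand-in for the Image-Pro Plus workflow; the images and the threshold of 200 are invented for illustration.

```python
import numpy as np

def stained_pixels(image, threshold):
    """Count pixels whose stain intensity exceeds the threshold --
    an area-based proxy for vessel density."""
    return int(np.count_nonzero(image > threshold))

rng = np.random.default_rng(0)
# Hypothetical 8-bit intensity images: the treated tissue stains far less.
control = rng.integers(0, 256, size=(64, 64))
treated = rng.integers(0, 128, size=(64, 64))
control_area = stained_pixels(control, 200)
treated_area = stained_pixels(treated, 200)
```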
Abstract:
In the present study we use multivariate analysis techniques to discriminate signal from background in the fully hadronic decay channel of ttbar events. We give a brief introduction to the role of the top quark in the Standard Model and a general description of the CMS experiment at the LHC. We used the CMS computing and software infrastructure to generate and prepare the data samples used in this analysis. We tested the performance of three different classifiers on our data samples and used the selection obtained with the Multi-Layer Perceptron classifier to estimate the statistical and systematic uncertainties on the cross-section measurement.
Abstract:
Breast cancer is the most common cancer among women worldwide. Radiotherapy is commonly used after surgery to destroy any malignant cells remaining in the breast volume. Radiotherapy treatments must aim to irradiate the target volume while limiting toxicity in healthy tissues. In clinical practice, the parameters defining the radiotherapy treatment plan are selected manually using treatment simulation software. This trial-and-error process, in which the parameters are modified and the treatment is re-simulated and re-evaluated, can require many iterations, making it time-consuming. The study presented in this thesis focuses on the automatic generation of treatment plans for irradiating the whole breast volume using two approximately opposed beams tangent to the patient. In particular, we focused on the selection of the beam directions and the isocenter position. To this end, we investigated the effectiveness of a combinatorial approach in which a large number of candidate treatment plans were generated using different combinations of the two beam directions. The beam intensity profile is optimized automatically by an algorithm called iCycle, developed at the Erasmus MC hospital in Rotterdam. First, from all the generated candidate plans, only a subset with good characteristics regarding irradiation of the diseased breast volume is selected. Then, the plans showing optimal characteristics for sparing the organs at risk (heart, lungs, and contralateral breast) are considered.
These treatment plans are mathematically equivalent, so to select the best among them a weighted sum was used, with the weights tuned so that, on average, the resulting plans have characteristics similar to clinically approved treatment plans. Compared with the manual process, this method not only considerably reduces the time needed to generate a treatment plan but also guarantees that the selected plans have optimal characteristics for sparing the organs at risk. Initially, the isocenter chosen clinically by the technician was used. In the final part of the study, the importance of the isocenter was evaluated; it emerged that, at least for a subgroup of patients, the isocenter position can make an important contribution to the quality of the treatment plan and could therefore be a further parameter to optimize.
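The weighted-sum plan selection described above can be sketched as a simple scalarisation of multi-objective plan metrics; the metric names, weights, and dose values below are invented for illustration, not taken from iCycle.

```python
def weighted_score(plan, weights):
    """Collapse a plan's multi-objective metrics into one score;
    negative weights penalise organ-at-risk dose, positive weights
    reward target coverage."""
    return sum(weights[k] * plan[k] for k in weights)

# Hypothetical metrics for two candidate beam configurations.
weights = {"heart_dose": -1.0, "lung_dose": -0.5, "coverage": 2.0}
plan_a = {"heart_dose": 2.0, "lung_dose": 5.0, "coverage": 98.0}
plan_b = {"heart_dose": 1.0, "lung_dose": 6.0, "coverage": 97.0}
best = max((plan_a, plan_b), key=lambda p: weighted_score(p, weights))
```

In the study, the weights were tuned so that, on average, the top-scoring plans resembled clinically approved ones.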
Abstract:
Osteoarticular allograft is one possible treatment in wide surgical resections with large defects. Selecting the best osteoarticular allograft is of great relevance for optimal exploitation of the bone databank, a good surgical outcome, and the patient's recovery. Current approaches, however, are very time consuming, which hinders these goals in practice. We present a validation study of software that performs automatic bone measurements, used to automatically assess distal femur sizes across a databank. 170 distal femur surfaces were reconstructed from CT data and measured manually following a size-measurement protocol that takes into account the transepicondylar distance (A), the anterior-posterior distance of the medial condyle (B), and the anterior-posterior distance of the lateral condyle (C). Intra- and inter-observer studies were conducted and regarded as ground-truth measurements. Manual and automatic measures were compared. For the automatic measurements, the correlation coefficients with observer one were 0.99 for measure A and 0.96 for measures B and C. The average time needed to perform the measurements was 16 h for both manual measurement rounds and 3 min for the automatic method. The results demonstrate the high reliability and, most importantly, the high repeatability of the proposed approach, as well as a considerable speed-up in planning.
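The agreement figures reported above are ordinary correlation coefficients between the manual and automatic measurement series. A minimal sketch (the five distances below are invented; the study used 170 femurs):

```python
import numpy as np

def pearson_r(manual, auto):
    """Pearson correlation between two measurement series."""
    m = np.asarray(manual, dtype=float)
    a = np.asarray(auto, dtype=float)
    m -= m.mean()
    a -= a.mean()
    return float((m @ a) / np.sqrt((m @ m) * (a @ a)))

# Hypothetical transepicondylar distances (mm), manual vs. automatic.
manual = [78.1, 82.4, 75.0, 80.3, 79.2]
auto = [78.3, 82.1, 75.2, 80.0, 79.5]
r = pearson_r(manual, auto)
```

A coefficient close to 1, as reported for measure A, indicates that the automatic tool reproduces the manual protocol almost exactly.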
Abstract:
High-throughput gene expression technologies such as microarrays have been used in a variety of scientific applications. Most of the work has focused on assessing univariate associations between gene expression and clinical outcome (variable selection) or on developing classification procedures with gene expression data (supervised learning). We consider a hybrid variable selection/classification approach based on linear combinations of the gene expression profiles that maximize an accuracy measure summarized by the receiver operating characteristic curve. Under a specific probability model, this leads to the consideration of linear discriminant functions. We incorporate an automated variable selection approach using the LASSO. An equivalence between LASSO estimation and support vector machines allows model fitting with standard software. We apply the proposed method to simulated data as well as data from a recently published prostate cancer study.
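The LASSO's variable selection stems from its soft-thresholding (shrinkage) operator, which zeroes small coefficients outright. A minimal sketch with invented effect sizes (the paper's actual fitting goes through the SVM equivalence and standard software):

```python
import numpy as np

def soft_threshold(z, lam):
    """LASSO shrinkage operator: shrink each coefficient toward zero
    by lam, and set coefficients smaller than lam exactly to zero."""
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

# Hypothetical univariate gene effects; lam controls how many survive.
effects = np.array([2.5, -0.3, 0.1, -1.8, 0.05])
selected = soft_threshold(effects, lam=0.5)  # only the two large effects remain
```

Genes whose shrunken coefficient is exactly zero drop out of the linear discriminant, which is what makes the approach a variable selection method as well as a classifier.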
Abstract:
The IDEs used in most Smalltalk dialects such as Pharo, Squeak or Cincom Smalltalk have not evolved significantly over the last years, if not decades. For other languages, for instance Java, the available IDEs have made tremendous progress, as Eclipse or NetBeans illustrate. While the Smalltalk IDE served as an exemplar for many years, other IDEs have caught up with or even overtaken the erstwhile leader in terms of feature richness, usability, and code navigation facilities. In this paper we first analyze the difficulty of software navigation in the Smalltalk IDE and second illustrate, with concrete examples, the features we added to the Smalltalk IDE to close the gap to modern IDEs and to provide novel, improved means to navigate source space. We show that, thanks to the agility and dynamics of Smalltalk, we are able to extend and enhance the Smalltalk IDE with reasonable effort to better support software navigation, program comprehension, and software maintenance in general. One such support is the integration of dynamic information into the familiar static source views. Other means include easing access to static information (for instance by better arranging important packages) or helping developers relocate artifacts of interest (for example with a categorization system such as smart groups).
Abstract:
We present the results of an investigation into the nature of the information needs of software developers who work on projects that are part of larger ecosystems. In an open-question survey we asked framework and library developers about their information needs with respect to both their upstream and downstream projects. We investigated what kind of information is required, why it is necessary, and how developers obtain it. The results show that the downstream needs fall into three categories roughly corresponding to the different stages of a project's relation with an upstream: selection, adoption, and co-evolution. The less numerous upstream needs fall into two categories: project statistics and code usage. The current-practices part of the study shows that, to satisfy many of these needs, developers use non-specific tools and ad hoc methods. We believe this is a largely unexplored area of research.
Abstract:
These guidelines were developed in the context of work block 3 of the DESIRE project. They address the facilitators at the 18 DESIRE study sites and support them in conducting stakeholder workshops aimed at selecting and deciding on the mitigation strategies to be implemented in each study site context. The decision-making process is supported by a multi-objective decision support system (MODSS) software called 'Facilitator'.