955 results for standard package software


Relevance:

30.00%

Publisher:

Abstract:

For the biometric studies, 75 samples were caught from 3 locations (Tajan River, Sefidrud and Shirud) using Salic, and the lengths (±1 mm) and weights (±5 g) of the samples were determined. One-way ANOVA in SPSS showed no significant difference between locations in length and fecundity (P ≥ 0.01), but there was a significant difference in weight between the Shirud and Tajan samples and those from Sefidrud (P ≤ 0.01). For the genetic variation studies, 210 fish were caught during 2008-2009 from 3 regions of the Iranian coastline (Khoshkrud, Tonekabon, Gorganrud) and 1 region in Azerbaijan (Caspian Sea waters close to the Kura River mouth). Genomic DNA was extracted from fin tissue using the phenol-chloroform method. The quantity and quality of the DNA were assessed by spectrophotometer and 1% agarose gel electrophoresis. PCR was carried out using 15 microsatellite primer pairs, and the products were separated on 8% polyacrylamide gels stained with silver nitrate. Molecular weights were calculated using UVTech software. The recorded microsatellite genotypes were used as input data for the GENALEX software (version 6) package to calculate allele and genotype frequencies and observed (Ho) and expected (He) heterozygosities, and to test for deviations from Hardy-Weinberg equilibrium. Genetic distance between populations was estimated from Nei's standard genetic distance and genetic similarity index (Nei, 1972). Genetic differentiation between populations was also evaluated by calculating pairwise estimates of Fst and Rst values. Of the 15 SSR markers used in this investigation, 9 were polymorphic. The average expected and observed heterozygosities were 0.54 and 0.49, respectively. Significant deviations from Hardy-Weinberg expectations were observed in all locations except the autumn Anzali lagoon sample at AF277576 and EF144125, Khoshkrud at EF144125, and Gorganrud and Kura at AF277576. Based on Fst and Rst, there were significant differences between locations (P ≤ 0.01). According to Fst, the highest population differentiation (Fst = 0.217) was between Gorganrud and Khoshkrud, which have the lowest Nm, and the lowest (Fst = 0.086) was between Gorganrud and Tonekabon, which have the highest Nm. Using Rst, the highest population differentiation (Rst = 0.271) was between Tonekabon and the spring Anzali lagoon sample, and the lowest (Rst = 0.026) was between Tonekabon and the autumn Anzali lagoon sample. The difference between the spring and autumn Anzali lagoon samples was also noticeable (Fst = 0.15). AMOVA, considering 2 sampling regions (Iran and Azerbaijan) and the sampling locations (Iran: Khoshkrud, Tonekabon, Gorganrud, spring Anzali lagoon and autumn Anzali lagoon; Azerbaijan: the Kura mouth), revealed that almost all of the variance in the data, namely 83% (P ≤ 0.01), was within locations; genetic variance among locations was 14% (P ≤ 0.01) and among regions 3% (P ≤ 0.01). The genetic distance was highest (0.646) between the Gorganrud and autumn Anzali lagoon populations, whereas the lowest distance (0.237) was between Gorganrud and Tonekabon. The results of the present study show that at least 2 different populations of Rutilus frisii kutum are found in the Caspian Sea, namely the Kura River population and the southern Caspian Sea samples, and it appears that there is more than one population in the southern Caspian Sea, which should be taken into account in artificial reproduction centres and stock rebuilding.
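For readers who want the calculation spelled out, the following is a minimal Python sketch (with made-up allele frequencies, not the study's data) of two of the quantities reported above: expected heterozygosity at a locus and Nei's (1972) standard genetic distance between two populations. In the study itself these were computed with the GENALEX 6 package.

```python
# Minimal sketch, not the GENALEX workflow: expected heterozygosity and
# Nei's (1972) standard genetic distance from allele frequencies at several
# microsatellite loci. Allele frequencies below are illustrative placeholders.
from math import log, sqrt

def expected_heterozygosity(freqs):
    """He = 1 - sum(p_i^2) for one locus."""
    return 1.0 - sum(p * p for p in freqs)

def nei_standard_distance(pop_x, pop_y):
    """Nei (1972): D = -ln(Jxy / sqrt(Jx * Jy)), with J terms averaged over loci.

    pop_x, pop_y: lists of per-locus allele-frequency lists
    (same loci and allele order in both populations).
    """
    jx = jy = jxy = 0.0
    for fx, fy in zip(pop_x, pop_y):
        jx += sum(p * p for p in fx)
        jy += sum(q * q for q in fy)
        jxy += sum(p * q for p, q in zip(fx, fy))
    n_loci = len(pop_x)
    jx, jy, jxy = jx / n_loci, jy / n_loci, jxy / n_loci
    return -log(jxy / sqrt(jx * jy))

# Hypothetical two-locus example for two populations
pop_a = [[0.6, 0.3, 0.1], [0.5, 0.5]]
pop_b = [[0.4, 0.4, 0.2], [0.7, 0.3]]
print([round(expected_heterozygosity(f), 3) for f in pop_a])
print(round(nei_standard_distance(pop_a, pop_b), 3))
```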

Relevance:

30.00%

Publisher:

Abstract:

BGCore is a software package for comprehensive computer simulation of nuclear reactor systems and their fuel cycles. BGCore interfaces the Monte Carlo particle transport code MCNP4C with SARAF, an independently developed module for calculating in-core fuel composition and spent-fuel emissions following discharge. In the BGCore system, the depletion coupling methodology is based on a multi-group approach that significantly reduces computation time and allows a large number of nuclides to be tracked during the calculations. In this study, the burnup calculation capabilities of the BGCore system were validated against well-established and verified computer codes for thermal and fast spectrum lattices. Very good agreement in the predicted k eigenvalue and nuclide densities was observed for all cases under consideration. In addition, the decay heat prediction capabilities of the BGCore system were benchmarked against the most recent edition of the ANS Standard methodology for UO2 fuel decay power prediction in LWRs. The difference between the ANS Standard data and the BGCore predictions was found not to exceed 5%.
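As a rough illustration of what a depletion (burnup) step involves, here is a generic Python sketch that advances nuclide densities over one time step with a matrix exponential. The toy chain, cross sections and flux level are hypothetical, and this is not BGCore's multi-group implementation.

```python
# Generic illustration (not BGCore's code): one depletion step solved as
# N(t) = expm(A * dt) @ N(0) for a toy three-nuclide chain with a
# one-group flux. All nuclides and data are hypothetical.
import numpy as np
from scipy.linalg import expm

phi = 3.0e14          # assumed one-group flux [n/cm^2/s]
barn = 1.0e-24        # cm^2

# Toy chain: parent --(capture)--> product --(decay)--> stable daughter
sigma_c_parent = 2.7 * barn       # capture cross section of the parent
lambda_product = 1.0e-6           # decay constant of the product [1/s]

# Transmutation matrix for dN/dt = A N
A = np.array([
    [-sigma_c_parent * phi, 0.0,             0.0],
    [ sigma_c_parent * phi, -lambda_product, 0.0],
    [ 0.0,                   lambda_product, 0.0],
])

N0 = np.array([1.0e22, 0.0, 0.0])   # initial number densities [1/cm^3]
dt = 30 * 24 * 3600.0               # 30-day burnup step [s]

N = expm(A * dt) @ N0
print(N)   # number densities after one step
```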

Relevance:

30.00%

Publisher:

Abstract:

The density and distribution of spatial samples heavily affect the precision and reliability of estimated population attributes. An optimization method based on Mean of Surface with Nonhomogeneity (MSN) theory has been developed into a computer package with the purpose of improving the accuracy of global estimates of spatial properties from a spatial sample distributed over a heterogeneous surface; conversely, for a given variance of estimation, the program can report both the optimal number of sample units needed and their appropriate distribution within a specified research area. (C) 2010 Elsevier Ltd. All rights reserved.
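The sketch below is a much-simplified, generic illustration of the sample-size side of this trade-off under plain simple random sampling (not the MSN method itself): given an estimate of the attribute's variance and a target variance for the estimated mean, it returns the number of sample units required.

```python
# Simplified illustration only: required sample size under simple random
# sampling, with an optional finite-population correction. Not MSN theory.
import math

def required_sample_size(s2, target_var, population_size=None):
    """s2: estimated population variance of the attribute of interest.
    target_var: maximum acceptable variance of the estimated mean.
    population_size: optional N for the finite-population correction."""
    if population_size is None:
        n = s2 / target_var
    else:
        n = s2 / (target_var + s2 / population_size)
    return math.ceil(n)

# Hypothetical numbers: attribute variance 25.0, target variance of the mean 0.5
print(required_sample_size(25.0, 0.5))         # infinite-population case -> 50
print(required_sample_size(25.0, 0.5, 2000))   # with correction, N = 2000 -> 49
```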

Relevance:

30.00%

Publisher:

Abstract:

Crosshole seismic tomography has been widely studied and applied in resource and engineering exploration because of its distinctive acquisition geometry and better resolution than conventional surface seismic surveys. This thesis presents the theory and methods of crosshole seismic tomography; building on previous studies, it investigates the initial velocity model and ray-tracing methods and develops three-dimensional tomography software. If the paths from transmitters to receivers are assumed straight, all cells that a given ray passes through are assigned the same velocity. The cells that each ray passes through are recorded and the rays passing through each cell are counted; the average velocity of the rays crossing a cell is taken as the cell velocity. An initial node velocity model is built analogously: the velocities of all cells belonging to a given node are summed, the number of those cells is counted, and the ratio of the velocity sum to the cell count is taken as the node velocity. The inversion result from the initial node velocity model is better than that from the average cell velocity model. Ray bending and the Shortest Path for Rays (SPR) method each have their own shortcomings and limitations. Using the bent rays obtained from SPR, rather than straight lines, as the starting point not only prevents ray bending from converging to a local minimum travel-time path but also resolves the non-smooth rays produced by SPR. The hybrid method is computationally expensive, costing roughly as much time as SPR itself. The Delphi development tool, based on the Object Pascal language, offers the advantages of object-oriented programming. TDTOM (Three-Dimensional Tomography) was developed in Delphi from the earlier DOS version, and the inversion component was improved, giving faster convergence. TDTOM performs velocity tomography from the first-arrival travel times of seismic waves and offers a friendly user interface and convenient operation. TDTOM is used to reconstruct the velocity image for a set of crosshole data from the Karamay Oil Field, and a geological interpretation is then given by comparing the inversion results of the different ray-tracing methods. High-velocity zones correspond to the cover of the oil reservoir, and low-velocity zones correspond to the reservoir or the steam-flooded layer.
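To make the straight-ray initial model concrete, here is a simplified 2-D Python sketch of the cell-averaging step described above (the thesis software TDTOM works in 3-D, and this is not its code): each ray is given velocity = straight source-receiver distance / travel time, and every cell the ray crosses averages the velocities of the rays passing through it. The geometry and travel times are invented for illustration.

```python
# Simplified 2-D sketch of the straight-ray initial cell velocity model.
import numpy as np

def initial_cell_velocity(sources, receivers, times, nx, nz, dx, dz, nsamp=200):
    vel_sum = np.zeros((nx, nz))
    ray_cnt = np.zeros((nx, nz))
    for (sx, sz), (rx, rz), t in zip(sources, receivers, times):
        v_ray = np.hypot(rx - sx, rz - sz) / t      # ray average velocity
        # cells crossed by the straight ray, found by dense sampling
        s = np.linspace(0.0, 1.0, nsamp)
        ix = np.clip(((sx + s * (rx - sx)) / dx).astype(int), 0, nx - 1)
        iz = np.clip(((sz + s * (rz - sz)) / dz).astype(int), 0, nz - 1)
        for i, k in set(zip(ix.tolist(), iz.tolist())):
            vel_sum[i, k] += v_ray
            ray_cnt[i, k] += 1
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(ray_cnt > 0, vel_sum / ray_cnt, np.nan)

# Hypothetical crosshole geometry: sources in one borehole, receivers in another
sources = [(0.0, z) for z in (5.0, 15.0, 25.0)]
receivers = [(30.0, z) for z in (5.0, 15.0, 25.0)]
pairs = [(s, r) for s in sources for r in receivers]
times = [np.hypot(r[0] - s[0], r[1] - s[1]) / 2500.0 for s, r in pairs]  # ~2500 m/s medium
v0 = initial_cell_velocity([s for s, _ in pairs], [r for _, r in pairs],
                           times, nx=6, nz=6, dx=5.0, dz=5.0)
print(np.round(v0, 1))
```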

Relevance:

30.00%

Publisher:

Abstract:

Program design is an area of programming that can benefit significantly from machine-mediated assistance. A proposed tool, called the Design Apprentice (DA), can assist a programmer in the detailed design of programs. The DA supports software reuse through a library of commonly used algorithmic fragments, or cliches, that codifies standard programming practice. The cliche library enables the programmer to describe the design of a program concisely. The DA can detect some kinds of inconsistencies and incompleteness in program descriptions. It automates detailed design by automatically selecting appropriate algorithms and data structures. It supports the evolution of program designs by keeping explicit dependencies between the design decisions made. These capabilities of the DA are underpinned by a model of programming, called programming by successive elaboration, which mimics the way programmers work. Programming by successive elaboration is characterized by the breadth-first exposition of layered program descriptions and the successive modification of those descriptions. A scenario is presented to illustrate the concept of the DA. Techniques for automating the detailed design process are described. A framework is given in which designs are incrementally augmented and modified by a succession of design steps. A library of cliches and a suite of design steps needed to support the scenario are presented.
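Purely as an illustration of the ideas of a cliche library and of explicit dependencies between design decisions (not the DA's actual representation), a minimal sketch might look like this:

```python
# Illustrative-only sketch: a tiny "cliche" library plus a design record that
# keeps explicit dependencies between design decisions, so that a revised
# decision can be traced to the decisions that relied on it.
CLICHE_LIBRARY = {
    "priority-queue": {"implementations": ["binary-heap", "sorted-list"]},
    "table":          {"implementations": ["hash-table", "association-list"]},
}

class Design:
    def __init__(self):
        self.decisions = {}     # decision name -> chosen implementation
        self.depends_on = {}    # decision name -> decisions it relies on

    def decide(self, name, choice, depends_on=()):
        self.decisions[name] = choice
        self.depends_on[name] = list(depends_on)

    def affected_by(self, changed):
        """Decisions that must be revisited when `changed` is revised."""
        return [d for d, deps in self.depends_on.items() if changed in deps]

d = Design()
d.decide("task-queue", CLICHE_LIBRARY["priority-queue"]["implementations"][0])
d.decide("scheduler-loop", "pop-min each tick", depends_on=["task-queue"])
print(d.affected_by("task-queue"))   # -> ['scheduler-loop']
```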

Relevance:

30.00%

Publisher:

Abstract:

As a management tool, simulation software deserves greater analysis from both an academic and an industrial viewpoint. A comparative study of three packages was carried out from a 'first time' use approach. This allowed the ease of use and package features to be assessed using a simple theoretical benchmark manufacturing process. To support the use of these packages, an objective survey on simulation use and package features was carried out within the manufacturing industry. This identified the use of simulation software, its applicability, and the perception of user requirements, thereby allowing an ideal package to be proposed.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Administrative or quality improvement registries may or may not contain the elements needed for investigations by trauma researchers. International Classification of Diseases Program for Injury Categorisation (ICDPIC), a statistical program available through Stata, is a powerful tool that can extract injury severity scores from ICD-9-CM codes. We conducted a validation study for use of the ICDPIC in trauma research. METHODS: We conducted a retrospective cohort validation study of 40,418 patients with injury using a large regional trauma registry. ICDPIC-generated AIS scores for each body region were compared with trauma registry AIS scores (gold standard) in adult and paediatric populations. A separate analysis was conducted among patients with traumatic brain injury (TBI) comparing the ICDPIC tool with ICD-9-CM embedded severity codes. Performance in characterising overall injury severity, by the ISS, was also assessed. RESULTS: The ICDPIC tool generated substantial correlations in thoracic and abdominal trauma (weighted κ 0.87-0.92), and in head and neck trauma (weighted κ 0.76-0.83). The ICDPIC tool captured TBI severity better than ICD-9-CM code embedded severity and offered the advantage of generating a severity value for every patient (rather than having missing data). Its ability to produce an accurate severity score was consistent within each body region as well as overall. CONCLUSIONS: The ICDPIC tool performs well in classifying injury severity and is superior to ICD-9-CM embedded severity for TBI. Use of ICDPIC demonstrates substantial efficiency and may be a preferred tool in determining injury severity for large trauma datasets, provided researchers understand its limitations and take caution when examining smaller trauma datasets.
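For context, the ISS against which both registry and ICDPIC-derived scores are assessed is computed from per-region AIS values using the standard definition sketched below (hypothetical patient data; this is not ICDPIC's internal code).

```python
# Standard ISS definition: sum of squares of the three highest AIS scores
# taken from different ISS body regions; any AIS of 6 sets ISS to 75.
def injury_severity_score(region_ais):
    """region_ais: dict mapping each of the six ISS body regions
    (head/neck, face, chest, abdomen, extremities, external)
    to its highest AIS score (0-6)."""
    if any(a == 6 for a in region_ais.values()):
        return 75                      # maximal score by convention
    top3 = sorted(region_ais.values(), reverse=True)[:3]
    return sum(a * a for a in top3)

# Hypothetical patient: severe chest and abdominal injuries, moderate head injury
print(injury_severity_score({"head_neck": 3, "face": 0, "chest": 4,
                             "abdomen": 4, "extremities": 2, "external": 1}))
# -> 4^2 + 4^2 + 3^2 = 41
```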

Relevance:

30.00%

Publisher:

Abstract:

Over recent years there has been an increase in the use of generic Computational Fluid Dynamics (CFD) software packages spread across various application fields. This has created the need for the integration of expertise into CFD software. Expertise can be integrated into CFD software in the form of an Intelligent Knowledge-Based System (IKBS). The advantages of integrating intelligence into generic engineering software are discussed with a special view to software engineering considerations. The software modelling cycle of a typical engineering problem is identified and the respective expertise and user control needed for each modelling phase is shown. The requirements of an IKBS for CFD software are discussed and compared to current practice. The blackboard software architecture is presented. This is shown to be appropriate for the integration of an IKBS into an engineering software package. This is demonstrated through the presentation of the prototype CFD software package FLOWES.
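As a generic illustration of the blackboard architecture referred to above (not the FLOWES implementation), the following Python sketch shows knowledge sources contributing to a shared blackboard under a simple controller; the CFD "expertise" encoded in the two example sources is hypothetical.

```python
# Minimal, generic blackboard pattern: knowledge sources inspect a shared
# blackboard and contribute when their trigger condition holds; a simple
# controller cycles until no source can contribute.
class KnowledgeSource:
    def __init__(self, name, condition, action):
        self.name, self.condition, self.action = name, condition, action

    def can_contribute(self, bb):
        return self.condition(bb)

    def contribute(self, bb):
        self.action(bb)

def run_blackboard(blackboard, sources):
    progressed = True
    while progressed:
        progressed = False
        for ks in sources:
            if ks.can_contribute(blackboard):
                ks.contribute(blackboard)
                progressed = True
    return blackboard

# Hypothetical CFD set-up expertise encoded as two knowledge sources
sources = [
    KnowledgeSource(
        "choose_turbulence_model",
        lambda bb: "reynolds" in bb and "turbulence_model" not in bb,
        lambda bb: bb.update(
            turbulence_model="k-epsilon" if bb["reynolds"] > 4000 else "laminar"),
    ),
    KnowledgeSource(
        "choose_mesh_density",
        lambda bb: "turbulence_model" in bb and "mesh" not in bb,
        lambda bb: bb.update(mesh="fine" if bb["turbulence_model"] != "laminar" else "coarse"),
    ),
]
print(run_blackboard({"reynolds": 1.0e5}, sources))
```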

Relevance:

30.00%

Publisher:

Abstract:

Predicting the reliability of newly designed products before manufacture is obviously highly desirable for many organisations. Understanding the impact of various design variables on reliability allows companies to optimise expenditure and release a package in minimum time. Reliability predictions originated in the early years of the electronics industry. These predictions were based on historical field data, which evolved into industrial databases and specifications such as the well-known MIL-HDBK-217 standard, among numerous others. Unfortunately, the accuracy of such techniques is highly questionable, especially for newly designed packages. This paper discusses the use of modelling to predict the reliability of high-density flip-chip and BGA components. A number of design parameters are investigated at the assembly stage, during testing, and in service.
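For contrast with the physics-of-failure modelling discussed in the paper, the handbook-style "parts count" prediction it questions essentially sums generic part failure rates, roughly as in the sketch below (the rates and quality factors are placeholders, not values taken from MIL-HDBK-217).

```python
# Much-simplified parts-count style prediction: equipment failure rate =
# sum over part types of quantity * generic failure rate * quality factor.
# All numbers below are illustrative placeholders.
parts = [
    # (quantity, generic failure rate [failures / 1e6 h], quality factor)
    (120, 0.0025, 1.0),   # chip resistors
    (80,  0.0040, 1.0),   # ceramic capacitors
    (4,   0.0200, 2.0),   # BGA packages
    (2,   0.0150, 2.0),   # flip-chip devices
]

lambda_equip = sum(n * lam * pi_q for n, lam, pi_q in parts)   # per 1e6 hours
mtbf_hours = 1.0e6 / lambda_equip
print(f"lambda = {lambda_equip:.4f} failures per million hours")
print(f"MTBF  ~= {mtbf_hours:,.0f} hours")
```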

Relevance:

30.00%

Publisher:

Abstract:

For sensitive optoelectronic components, traditional soldering techniques cannot be used because of the components' inherent sensitivity to thermal stresses. One such component is the optoelectronic butterfly package, which houses a laser diode chip aligned to a fibre-optic cable. Even sub-micron misalignment of the fibre optic and the laser diode chip can significantly reduce the performance of the device. The high cost of each unit requires that the number of components damaged by the laser soldering process be kept to a minimum. Mathematical modelling is undertaken to better understand the laser soldering process and to optimize operational parameters such as solder paste volume, copper pad dimensions, laser soldering time for each joint, laser intensity and absorption coefficient. Validation of the model against experimental data will be completed and will lead, through an iterative modelling cycle, to an optimization of the assembly process. This will ultimately reduce costs, shorten process development time and increase consistency in the laser soldering process.

Relevance:

30.00%

Publisher:

Abstract:

This article describes ongoing research on developing a portal framework, based on the OASIS Web Services for Remote Portlets (WSRP) standard, for the integration of Web-based educational content and services made available through a model for a European Networked University. We first identify the requirements for such a framework that supports integration at the presentation level and collaboration in developing and updating study programmes and course materials. We then outline the architecture design and report on the initial implementation and a preliminary evaluation.

Relevance:

30.00%

Publisher:

Abstract:

The newly formed Escape and Evacuation Naval Authority regulates the provision of abandonment equipment and procedures for all Ministry of Defence vessels. As such, it ensures that access routes on board are evaluated early in the design process to maximize their efficiency and to eliminate, as far as possible, any congestion that might occur during escape. This analysis can be undertaken using a computer-based simulation of given escape scenarios that replicates the layout of the vessel and the interactions between each individual and the ship structure. One software tool that facilitates this type of analysis is maritimeEXODUS. This tool, through large-scale testing and validation, emulates human shipboard behaviour during emergency scenarios; however, it is largely based around the behaviour of civilian passengers and the fixtures and fittings of merchant vessels. Hence there was a clear requirement to understand the behaviour of well-trained naval personnel, as opposed to civilian passengers, and to be able to model the fixtures and fittings that are exclusive to warships, thus allowing improvements to both maritimeEXODUS and other software products. Human factors trials using the Royal Navy training facilities at Whale Island, Portsmouth were recently undertaken to collect data that improves our understanding of the aforementioned differences. It is hoped that this data will form the basis of a long-term improvement package that will provide global validation of these simulation tools and assist in the development of specific escape and evacuation standards for warships. © 2005: Royal Institution of Naval Architects.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the use of a blackboard architecture for building a hybrid case-based reasoning (CBR) system. The Smartfire fire field modelling package has been built using this architecture and includes a CBR component. The architecture allows qualitative spatial reasoning knowledge from domain experts to be integrated into the system. The system can be used for the automatic set-up of fire field models, which enables fire safety practitioners who are not expert in modelling techniques to use a fire modelling tool. The paper discusses the integrating power of the architecture, which is based on a common knowledge representation comprising a metric diagram and place vocabulary, together with mechanisms for adaptation and conflict resolution built on the blackboard.
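As a generic sketch of the CBR retrieval step (illustrative only, not the Smartfire component), a query scenario can be matched to the most similar stored case by a weighted similarity over normalised features; the feature names, weights and cases below are hypothetical.

```python
# Generic CBR retrieval: pick the stored case most similar to the query.
# Features are assumed to be normalised to [0, 1].
def similarity(query, case, weights):
    score = 0.0
    for feat, w in weights.items():
        score += w * (1.0 - abs(query[feat] - case["features"][feat]))
    return score

def retrieve(query, case_base, weights):
    return max(case_base, key=lambda c: similarity(query, c, weights))

case_base = [
    {"features": {"room_volume": 0.2, "vent_area": 0.1}, "setup": "coarse mesh, simple vent model"},
    {"features": {"room_volume": 0.8, "vent_area": 0.6}, "setup": "fine mesh refined near vents"},
]
weights = {"room_volume": 0.7, "vent_area": 0.3}
best = retrieve({"room_volume": 0.75, "vent_area": 0.5}, case_base, weights)
print(best["setup"])   # the retrieved set-up would then be adapted to the query
```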

Relevance:

30.00%

Publisher:

Abstract:

[This abstract is based on the authors' abstract.] Three new standards to be applied when adopting commercial off-the-shelf (COTS) software solutions are discussed. The first standard is for a COTS software life cycle, the second for a software solution user-requirements life cycle, and the third is a checklist to help in completing the requirements. The standards are based on recent major COTS software solution implementations.

Relevance:

30.00%

Publisher:

Abstract:

In this chapter we look at JOSTLE, the multilevel graph-partitioning software package, and highlight some of the key research issues that it addresses. We first outline the core algorithms and place them in the context of the multilevel refinement paradigm. We then look at issues relating to its use as a tool for parallel processing and, in particular, partitioning in parallel. Since its first release in 1995, JOSTLE has been used for many mesh-based parallel scientific computing applications, and so we also outline some enhancements such as multiphase mesh-partitioning, heterogeneous mapping and partitioning to optimise subdomain shape.
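As an illustration of the coarsening phase at the heart of the multilevel paradigm (a generic sketch, not JOSTLE's code), the following Python snippet performs one level of heavy-edge matching and builds the coarser graph; in a full multilevel partitioner this step is repeated, the coarsest graph is partitioned, and the partition is refined as it is projected back through the levels.

```python
# One multilevel coarsening step via heavy-edge matching (generic sketch).
def heavy_edge_matching(adj):
    """adj: {vertex: {neighbour: edge_weight}}. Returns the merged groups."""
    matched, groups = set(), []
    for v in adj:                      # visit vertices in arbitrary order
        if v in matched:
            continue
        # pick the heaviest edge to an unmatched neighbour
        candidates = [(w, u) for u, w in adj[v].items() if u not in matched]
        if candidates:
            _, u = max(candidates)
            matched.update({v, u})
            groups.append((v, u))
        else:
            matched.add(v)
            groups.append((v,))
    return groups

def coarsen(adj, groups):
    """Collapse each matched group into one coarse vertex, summing parallel edges."""
    owner = {v: i for i, g in enumerate(groups) for v in g}
    coarse = {i: {} for i in range(len(groups))}
    for v, nbrs in adj.items():
        for u, w in nbrs.items():
            a, b = owner[v], owner[u]
            if a != b:
                coarse[a][b] = coarse[a].get(b, 0) + w
    return coarse

# Tiny example graph: a 6-cycle with unit edge weights
g = {i: {(i - 1) % 6: 1, (i + 1) % 6: 1} for i in range(6)}
groups = heavy_edge_matching(g)
print(groups)              # matched vertex pairs
print(coarsen(g, groups))  # coarser graph; recurse, partition, then refine
```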