974 results for Software packages selection


Relevance: 80.00%

Abstract:

Markov chain Monte Carlo (MCMC) is a methodology that is gaining widespread use in the phylogenetics community and is central to phylogenetic software packages such as MrBayes. An important issue for users of MCMC methods is how to select appropriate values for adjustable parameters such as the length of the Markov chain or chains, the sampling density, the proposal mechanism, and, if Metropolis-coupled MCMC is being used, the number of heated chains and their temperatures. Although some parameter settings have been examined in detail in the literature, others are frequently chosen with more regard to computational time or personal experience with other data sets. Such choices may lead to inadequate sampling of tree space or an inefficient use of computational resources. We performed a detailed study of convergence and mixing for 70 randomly selected, putatively orthologous protein sets with different sizes and taxonomic compositions. Replicated runs from multiple random starting points permit a more rigorous assessment of convergence, and we developed two novel statistics, delta and epsilon, for this purpose. Although likelihood values invariably stabilized quickly, adequate sampling of the posterior distribution of tree topologies took considerably longer. Our results suggest that multimodality is common for data sets with 30 or more taxa and that this results in slow convergence and mixing. However, we also found that the pragmatic approach of combining data from several short, replicated runs into a metachain to estimate bipartition posterior probabilities provided good approximations, and that such estimates were no worse in approximating a reference posterior distribution than those obtained using a single long run of the same length as the metachain. Precision appears to be best when heated Markov chains have low temperatures, whereas chains with high temperatures appear to sample trees with high posterior probabilities only rarely. 
[Bayesian phylogenetic inference; heating parameter; Markov chain Monte Carlo; replicated chains.]
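The Metropolis-coupled MCMC scheme the abstract refers to can be sketched compactly. The bimodal one-dimensional toy target below stands in for a multimodal tree-space posterior; the temperature increment, step size, and chain count are illustrative assumptions, not the settings examined in the study or MrBayes defaults:

```python
# Hypothetical sketch of Metropolis-coupled MCMC (MC^3) on a bimodal
# 1-D target, illustrating the role of heated chains.  All tuning
# values here are assumptions for illustration.
import math
import random

random.seed(1)

def log_target(x):
    # Bimodal density: mixture of two well-separated normals.
    return math.log(math.exp(-0.5 * (x + 3.0) ** 2) +
                    math.exp(-0.5 * (x - 3.0) ** 2))

def mc3(n_iter=20000, n_chains=4, dT=0.5, step=1.0):
    temps = [1.0 + i * dT for i in range(n_chains)]   # chain 0 is "cold"
    state = [0.0] * n_chains
    samples = []                                      # cold-chain draws only
    for _ in range(n_iter):
        for i, T in enumerate(temps):
            prop = state[i] + random.gauss(0.0, step)
            # Heated chains sample f(x)^(1/T): accept on (log-ratio)/T.
            if math.log(random.random()) < (log_target(prop) - log_target(state[i])) / T:
                state[i] = prop
        # Propose swapping the states of two adjacent chains.
        i = random.randrange(n_chains - 1)
        log_a = (log_target(state[i]) - log_target(state[i + 1])) * \
                (1.0 / temps[i + 1] - 1.0 / temps[i])
        if math.log(random.random()) < log_a:
            state[i], state[i + 1] = state[i + 1], state[i]
        samples.append(state[0])
    return samples

draws = mc3()
frac_right_mode = sum(1 for x in draws if x > 0) / len(draws)
```

Heated chains sample a flattened version of the target, so the state-swap moves are what let the cold chain jump between modes — the mixing behaviour whose failure the abstract documents for data sets of 30 or more taxa.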

Relevance: 80.00%

Abstract:

Most widely used computer software packages, such as word processors, spreadsheets and web browsers, incorporate comprehensive help systems, partly because the software is meant for those with little technical knowledge. This paper identifies four systematic philosophies or approaches to help system delivery: the documentation approach, based on written documents, either paper-based or online; the training approach, offered either before the user starts working with the software or on the job; intelligent help, that is, online context-sensitive help or help relying on software agents; and finally an approach based on minimalism, defined as providing help only when and where it is needed.

Relevance: 80.00%

Abstract:

The inclusion of high-level scripting functionality in state-of-the-art rendering APIs indicates a movement toward data-driven methodologies for structuring next-generation rendering pipelines. A similar theme can be seen in the use of composition languages to deploy component software through selection and configuration of collaborating component implementations. In this paper we introduce the Fluid framework, which places particular emphasis on the use of high-level data manipulations in order to develop component-based software that is flexible, extensible, and expressive. We introduce a data-driven, object-oriented programming methodology for component-based software development, and demonstrate how a rendering system with a similar focus on abstract manipulations can be incorporated in order to develop a visualization application for geospatial data. In particular we describe a novel SAS script integration layer that provides access to vertex and fragment programs, producing a very controllable, responsive rendering system. The proposed system is very similar to developments speculatively planned for DirectX 10, but uses open standards and has cross-platform applicability. © The Eurographics Association 2007.

Relevance: 80.00%

Abstract:

The widespread implementation of Manufacturing Resource Planning (MRPII) systems in the U.K. and abroad, and the reported dissatisfaction with their use, formed the initial basis of this piece of research, which concentrates on the fundamental theory and design of the Closed Loop MRPII system itself. The dissertation concentrates on two key aspects, namely: how Master Production Scheduling is carried out in differing business environments, and how well the 'closing of the loop' operates by checking the capacity requirements of the different levels of plans within an organisation. The main hypothesis tested is that in U.K. manufacturing industry, resource checks are either not being carried out satisfactorily or are not being fed back to the appropriate plan in a timely fashion. The research methodology involved initial detailed investigations into Master Scheduling and capacity planning in eight diverse manufacturing companies, followed by a nationwide survey of users in 349 companies, a survey of all the major suppliers of Production Management software in the U.K., and an analysis of the facilities offered by current software packages. The main conclusion drawn is that the hypothesis is proved in the majority of companies: only just over 50% of companies are attempting Resource and Capacity Planning, and only 20% are successfully feeding back CRP information to 'close the loop'. Various causative factors are put forward and remedies are suggested.

Relevance: 80.00%

Abstract:

Product design decisions can have a significant impact on the financial and operational performance of manufacturing companies, so good analysis of the financial impact of design decisions is required if the profitability of the business is to be maximised. The product design process can be viewed as a chain of decisions linking decisions about the concept to decisions about the detail. The idea of decision chains can be extended to include the design and operation of the 'downstream' business processes which manufacture and support the product. These chains of decisions are not independent but are interrelated in a complex manner. Dealing with the interdependencies requires a modelling approach which represents all the chains of decisions, to a level of detail not normally considered in the analysis of product design. The operational, control and financial elements of a manufacturing business constitute a dynamic system: these elements interact with each other and with external elements (i.e. customers and suppliers). Analysing the chain of decisions for such an environment requires the application of simulation techniques, not just to any one area of interest, but to the whole business, i.e. an enterprise simulation. To investigate the capability and viability of enterprise simulation, an experimental 'Whole Business Simulation' system has been developed. This system combines specialist simulation elements and standard operational applications software packages to create a model that incorporates all the key elements of a manufacturing business, including its customers and suppliers. By means of a series of experiments, the performance of this system was compared with a range of existing analysis tools (i.e. DFX, capacity calculation, a shop floor simulator, and a business planner driven by a shop floor simulator).

Relevance: 80.00%

Abstract:

Replacement of the traditional coil spring with one or more fibre-reinforced plastic sulcated springs is a future possibility. Designers of metallic coil springs have design formulae readily available, and software packages specific to coil spring design exist. However, the sulcated spring is at the prototype stage of development, so literature on these springs is very sparse. The thesis contains information on the market for sulcated springs and their advantages and disadvantages. Literature on other types of fibre-reinforced plastic springs has also been reviewed. Design software has been developed for the sulcated spring along similar lines to coil spring design software. In order to develop the software, a theoretical model was developed which formed its mathematical basis. The theoretical model is based on a choice of four methods for calculating the flexural rigidity: beam theory, plate theory, and lamination theory assuming isotropic or orthotropic material properties. Experimental results for strain and spring stiffness were compared with the theoretical model and were in good agreement. Included in the design software are the results of experimental work on fatigue, and design limiting factors to prevent or warn against impractical designs. Finite element analysis has been used to verify the theoretical model and to find the best approximation to the experimental results. Applications and types of assemblies for the sulcated spring are discussed. Sulcated spring designs for the automotive applications of a suspension, clutch and engine valve spring were found using the design software. These designs were within or close to the space of the existing coil spring and yielded the same performance.
Finally, the commercial feasibility of manufacturing the sulcated spring was assessed and compared with the coil spring, to evaluate the plausibility of the sulcated spring eventually replacing the coil spring.
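Two of the four flexural-rigidity models mentioned in the abstract differ only by a Poisson-ratio factor, which a short calculation makes concrete. The material properties and leaf thickness below are assumed values for a generic glass-fibre composite, not data from the thesis:

```python
# Illustrative comparison of two of the flexural-rigidity models named
# in the abstract (beam theory vs. plate theory), per unit width of
# spring leaf.  All property values are assumptions for illustration.

E = 40e9      # longitudinal Young's modulus, Pa (assumed)
nu = 0.28     # major Poisson's ratio (assumed)
t = 4e-3      # leaf thickness, m (assumed)

# Beam theory: D = E * t^3 / 12  (per unit width)
D_beam = E * t**3 / 12.0

# Plate theory: D = E * t^3 / (12 * (1 - nu^2)) -- stiffer, because the
# plate is constrained against anticlastic curvature.
D_plate = E * t**3 / (12.0 * (1.0 - nu**2))

print(D_beam, D_plate, D_plate / D_beam)
```

Plate theory predicts a rigidity higher by the factor 1/(1 − ν²); lamination theory would instead build effective values of E and ν from the ply stack, in the isotropic or orthotropic variants the thesis describes.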

Relevance: 80.00%

Abstract:

The paper describes the first supercomputer cluster project in Ukraine: its hardware, software and characteristics. It presents the performance results obtained on the systems that were built, and briefly describes software packages developed by cluster users that have already returned the investment in the cluster project.

Relevance: 80.00%

Abstract:

The paper describes three software packages, the main components of a software system for processing and web presentation of Bulgarian language resources: parallel corpora and bilingual dictionaries. The author briefly presents the current versions of the core components “Dictionary” and “Corpus”, as well as the recently developed component “Connection” that links “Dictionary” and “Corpus”. The components’ main functionalities are also described, and examples of the use of the system’s web applications are included.

Relevance: 80.00%

Abstract:

With the latest developments in computer science, multivariate data analysis methods have become increasingly popular among economists. Pattern recognition in complex economic data and empirical model construction can be more straightforward with proper application of modern software. However, despite the appealing simplicity of some popular software packages, the interpretation of data analysis results requires strong theoretical knowledge. This book aims at combining the development of both theoretical and application-related data analysis knowledge. The text is designed for advanced-level studies and assumes acquaintance with elementary statistical terms. After a brief introduction to selected mathematical concepts, the highlighting of selected model features is followed by a practice-oriented introduction to the interpretation of SPSS outputs for the described data analysis methods. Learning data analysis is usually time-consuming and requires effort, but with tenacity the learning process can bring about a significant improvement in individual data analysis skills.
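As a flavour of the multivariate methods the book interprets through SPSS output, here is a library-free sketch of principal component analysis on two standardized variables; the income/consumption figures are invented for illustration:

```python
# Minimal PCA on two standardized variables.  For a 2x2 correlation
# matrix the eigenvalues are simply 1 + r and 1 - r, so no numerical
# eigensolver is needed.  The toy data are invented.
import math

income      = [10, 12, 14, 16, 18, 20]
consumption = [ 8, 10, 11, 13, 15, 16]

def standardize(xs):
    m = sum(xs) / len(xs)
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
    return [(x - m) / s for x in xs]

zi, zc = standardize(income), standardize(consumption)
r = sum(a * b for a, b in zip(zi, zc)) / (len(zi) - 1)   # correlation

# The first principal component explains (1 + r) / 2 of the variance.
explained = (1 + r) / 2
print(round(r, 3), round(explained, 3))
```

With strongly correlated variables, one component captures nearly all the variance, which is the kind of pattern-recognition result whose SPSS output the book teaches readers to interpret.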

Relevance: 80.00%

Abstract:

The civil jury has been under attack in recent years for being unreliable and incompetent. Considering the myriad causes of poor civil juror decision-making, the current investigation explores both procedural and evidentiary issues that impact jurors' decisions. Specifically, the first phase of this dissertation examines how jurors (mis)use evidence pertaining to the litigants when determining liability and awarding damages. After investigating how jurors utilize evidence, the focus shifts to exploring the utility of procedural reforms designed to improve decision-making (specifically, revising the instructions on the laws in the case and bifurcating the liability and damage phases of the trial). Using the results from the first two phases of the research, the final study involves manipulating pieces of evidence related to the litigants while exploring the effects that revising the judicial instructions has on the utilization of evidence in particular and on decision-making in general. This dissertation was run online, allowing participants to access the study materials at their convenience. After giving consent, participants read the scenario of a fictitious product liability case with the litigant manipulations incorporated into the summary. Participants answered several attitudinal, case-specific, and comprehension questions, and were instructed to find in favor of one side and award any damages they felt warranted. Exploratory factor analyses, probit and linear regressions, and path analyses were used to analyze the data (Mplus and SPSS were the software packages used to conduct the analyses). Results indicated that misuse of evidence was fairly frequent, though the mock jurors also utilized evidence appropriately. Although the results did not support bifurcation as a viable procedural reform, revising the judicial instructions did significantly increase comprehension rates. Trends in the data suggested that better decision-making occurred when the revised instructions were used, thus providing empirical support for this procedural reform as a means of improving civil jury decision-making. Implications for actual trials and attorneys are discussed.

Relevance: 80.00%

Abstract:

During the past three decades, the use of roundabouts has increased throughout the world due to their greater benefits in comparison with intersections controlled by traditional means. Roundabouts are often chosen because they are widely associated with low accident rates, lower construction and operating costs, and reasonable capacities and delay. In the planning and design of roundabouts, special attention should be given to the movement of pedestrians and bicycles. As a result, there are several guidelines for the design of pedestrian and bicycle treatments at roundabouts that increase the safety of both pedestrians and bicyclists at existing and proposed roundabout locations. Different design guidelines have differing criteria for handling pedestrians and bicyclists at roundabout locations. Although all of the investigated guidelines provide better safety (depending on the traffic conditions at a specific location), their effects on the performance of the roundabout have not yet been examined. Existing roundabout analysis software packages provide estimates of capacity and performance characteristics such as delay, queue lengths, stop rates, effects of heavy vehicles, crash frequencies, and geometric delays, as well as fuel consumption, pollutant emissions and operating costs. None of these software packages, however, is capable of determining the effect of various pedestrian crossing locations, or of different bicycle treatments, on the performance of roundabouts. The objective of this research is to develop simulation models capable of determining the effect of various pedestrian and bicycle treatments at single-lane roundabouts. To achieve this, four models were developed. The first model simulates a single-lane roundabout without bicycle and pedestrian traffic. The second model simulates a single-lane roundabout with a pedestrian crossing and mixed-flow bicyclists. The third model simulates a single-lane roundabout with a combined pedestrian and bicycle crossing, while the fourth model simulates a single-lane roundabout with a pedestrian crossing and a bicycle lane at the outer perimeter of the roundabout. Traffic data was collected at a modern roundabout in Boca Raton, Florida. The results of this effort show that installing a pedestrian crossing on the roundabout approach has a negative impact on the entry flow, while the downstream approach benefits from the gaps created by pedestrians. It was also concluded that a bicycle lane configuration is more beneficial for all users of the roundabout than mixed flow or a combined crossing. Installing the pedestrian crossing at one car length from the roundabout is more beneficial for pedestrians than at two or three car lengths. Finally, it was concluded that the effect of the pedestrian crossing on the vehicle queues diminishes as the distance between the crossing and the roundabout increases.
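The entry mechanism such simulation models rest on is gap acceptance, which can be sketched in a few lines. The critical gap, follow-up time, and Poisson headway assumption below are generic illustrative choices, not values calibrated from the Boca Raton data:

```python
# Stripped-down sketch of gap acceptance at a single-lane roundabout
# entry.  tc (critical gap) and tf (follow-up time) are illustrative
# assumptions, not calibrated parameters from the study.
import random

random.seed(42)

def entry_capacity(circ_flow_vph, tc=4.5, tf=2.5, horizon_s=3600):
    """Count vehicles that can enter against a Poisson circulating stream."""
    lam = circ_flow_vph / 3600.0           # circulating vehicles per second
    t, entered = 0.0, 0
    while t < horizon_s:
        gap = random.expovariate(lam)      # headway in circulating stream
        if gap >= tc:
            # One entry on the critical gap, plus one per follow-up time.
            entered += 1 + int((gap - tc) // tf)
        t += gap
    return entered

cap_low, cap_high = entry_capacity(400), entry_capacity(800)
print(cap_low, cap_high)
```

Heavier circulating flow leaves fewer usable gaps and so lowers entry capacity; in the full models, a pedestrian crossing effectively blocks intervals on one approach while creating usable gaps downstream, which is how the negative upstream and positive downstream effects reported above arise.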

Relevance: 80.00%

Abstract:

The objective of this work is to build a rapid and automated numerical design method that makes optimal design of robots possible. Two classes of optimal robot design problems were specifically addressed: (1) when the objective is to optimize a pre-designed robot, and (2) when the goal is to design an optimal robot from scratch. In the first case, some of the critical dimensions or other measures to optimize (the design parameters) are varied within an established range, the stress is calculated as a function of the design parameter(s), and the design parameter value that optimizes a pre-determined performance index provides the optimum design. In the second case, this work focuses on the development of an automated procedure for the optimal design of robotic systems. For this purpose, the Pro/ENGINEER and MATLAB software packages are integrated to draw the robot parts, optimize them, and then re-draw the optimal system parts.
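The first design mode described above — vary a design parameter, evaluate stress, and keep the value that optimizes a performance index — can be illustrated with a one-parameter example. The cantilever-link geometry, load, and material limits below are hypothetical, and this sketch stands in for the Pro/ENGINEER–MATLAB loop rather than reproducing it:

```python
# Hypothetical one-parameter design sweep: vary the side length b of a
# square link cross-section, evaluate bending stress, and keep the
# lightest feasible design.  Loads and material values are assumed.

L = 0.5              # link length, m (assumed)
F = 200.0            # tip load, N (assumed)
rho = 2700.0         # aluminium density, kg/m^3
sigma_allow = 60e6   # allowable bending stress, Pa (assumed)

best = None
b = 0.010
while b <= 0.050:
    # Cantilever bending: sigma = M c / I with M = F L, c = b/2, I = b^4/12,
    # which simplifies to sigma = 6 F L / b^3.
    sigma = (F * L) * (b / 2) / (b**4 / 12.0)
    mass = rho * b * b * L
    if sigma <= sigma_allow and (best is None or mass < best[1]):
        best = (b, mass, sigma)
    b = round(b + 0.001, 3)

print(best)   # (b, mass, stress) of the optimum design
```

Here the performance index is minimum mass subject to an allowable-stress constraint, so the smallest feasible cross-section wins; in the integrated system described above, the stress evaluation would come from the CAD/analysis model rather than a closed-form formula.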

Relevance: 80.00%

Abstract:

Computed tomography (CT) is a valuable technology to the healthcare enterprise as evidenced by the more than 70 million CT exams performed every year. As a result, CT has become the largest contributor to population doses amongst all medical imaging modalities that utilize man-made ionizing radiation. Acknowledging the fact that ionizing radiation poses a health risk, there exists the need to strike a balance between diagnostic benefit and radiation dose. Thus, to ensure that CT scanners are optimally used in the clinic, an understanding and characterization of image quality and radiation dose are essential.

The state of the art in both image quality characterization and radiation dose estimation in CT depends on phantom-based measurements reflective of systems and protocols. For image quality characterization, measurements are performed on inserts embedded in static phantoms and the results are ascribed to clinical CT images. However, the key objective for image quality assessment should be its quantification in clinical images; that is the only characterization of image quality that clinically matters, as it is most directly related to the actual quality of clinical images. Moreover, for dose estimation, phantom-based dose metrics, such as the CT dose index (CTDI) and size-specific dose estimates (SSDE), are measured by the scanner and referenced as indicators of radiation exposure. However, CTDI and SSDE are surrogates for dose, rather than dose per se.

Currently there are several software packages that track the CTDI and SSDE associated with individual CT examinations. This is primarily the result of two pressures: regulatory bodies and governments pressing clinics and hospitals to monitor the radiation exposure of individuals, and the personal concerns of patients who are curious about the health risks associated with the ionizing radiation they receive during diagnostic procedures.

An idea that resonates with clinical imaging physicists is that patients come to the clinic to acquire quality images so they can receive a proper diagnosis, not to be exposed to ionizing radiation. Thus, while it is important to monitor the dose to patients undergoing CT examinations, it is equally, if not more important to monitor the image quality of the clinical images generated by the CT scanners throughout the hospital.

The purposes of the work presented in this thesis are threefold: (1) to develop and validate a fully automated technique to measure spatial resolution in clinical CT images, (2) to develop and validate a fully automated technique to measure image contrast in clinical CT images, and (3) to develop a fully automated technique to estimate radiation dose (not surrogates for dose) from a variety of clinical CT protocols.
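The distinction drawn above between CTDIvol and patient dose is easiest to see in how the size-specific dose estimate is formed: the scanner-reported CTDIvol is rescaled by a size-dependent conversion factor. The exponential fit below only approximates the AAPM Report 204 table for the 32 cm body phantom, so treat the coefficients as illustrative:

```python
# Sketch of why CTDIvol is a surrogate rather than patient dose: SSDE
# rescales the reported CTDIvol by a patient-size-dependent factor.
# The fit coefficients approximate the AAPM Report 204 values for the
# 32 cm body phantom and are illustrative, not authoritative.
import math

def ssde(ctdi_vol_mgy, ap_cm, lat_cm):
    eff_diameter = math.sqrt(ap_cm * lat_cm)          # cm
    f = 3.704 * math.exp(-0.03671 * eff_diameter)     # conversion factor
    return ctdi_vol_mgy * f

# Same reported CTDIvol, very different estimated doses:
small_adult = ssde(10.0, ap_cm=20.0, lat_cm=25.0)
large_adult = ssde(10.0, ap_cm=30.0, lat_cm=40.0)
print(round(small_adult, 1), round(large_adult, 1))
```

The same reported CTDIvol of 10 mGy corresponds to a substantially higher estimated dose in the smaller patient, which is exactly why the thesis treats CTDI-type metrics as surrogates and aims to estimate dose itself.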

Relevance: 80.00%

Abstract:

© 2016 Springer Science+Business Media New York. Researchers studying mammalian dentitions from functional and adaptive perspectives have increasingly moved towards dental topography measures that can be estimated from 3D surface scans, which do not require identification of specific homologous landmarks. Here we present molaR, a new R package designed to assist researchers in calculating four commonly used topographic measures from surface scans of teeth: Dirichlet Normal Energy (DNE), Relief Index (RFI), Orientation Patch Count (OPC), and Orientation Patch Count Rotated (OPCR), enabling a unified application of these informative new metrics. In addition to providing topographic measuring tools, molaR has complementary plotting functions enabling highly customizable visualization of results. This article gives a detailed description of the DNE measure, walks researchers through installing, operating, and troubleshooting molaR and its functions, and gives an example of a simple comparison that measured teeth of the primates Alouatta and Pithecia in molaR and other available software packages. molaR is a free and open-source software extension, which can be found at doi:10.13140/RG.2.1.3563.4961 (molaR v. 2.0) as well as in the Internet repository CRAN, which stores R packages.

Relevance: 80.00%

Abstract:

Laser scanning is a terrestrial laser-imaging system that creates highly accurate three-dimensional images of objects for use in standard computer-aided design software packages. This report describes the results of a pilot study investigating the use of laser scanning for transportation applications in Iowa. After an initial training period on the use of the scanner and the Cyclone software, pilot tests were performed on the following projects: an intersection and railroad bridge, for training purposes; a section of highway, to determine elevation accuracy, and a pair of bridges, to determine the level of detail that can be captured; new concrete pavement, to determine smoothness; bridge beams, to determine camber for deck-loading calculations; a stockpile, to determine volume; and a borrow pit, to determine volume. Results show that it is possible to obtain the 2-6 mm precision claimed by the manufacturer, compared with approximately one-inch precision for aerial photogrammetry using a helicopter. A cost comparison showed that laser scanning was approximately 30 percent more expensive than helicopter photogrammetry, depending on assumptions. Laser scanning can become more competitive with helicopter photogrammetry by elevating the scanner on a boom truck and capturing both sides of a divided roadway at the same time. Two- and three-dimensional drawings were created in MicroStation for one of the scanned highway bridges, demonstrating that it is possible to create such drawings within the accuracy of this technology. It was discovered that a significant amount of time is necessary to convert point cloud images into drawings; as the technology matures, this task should become less time-consuming. It appears that laser scanning technology does indeed have a place in the Iowa Department of Transportation's design and construction toolbox.
Based on results from this study, laser scanning can be used cost-effectively for preliminary surveys to develop TIN meshes of roadway surfaces. It also appears that this technique can be used quite effectively to measure bridge beam camber in a safer and quicker fashion than conventional approaches. Volume calculations are also possible using laser scanning; measuring quantities of rock could be an area where this technology is quite beneficial, since accuracy matters more for this material than for soil. Other applications could include developing as-built drawings of historical structures such as the bridges of Madison County. The technology could also be useful where safety is a concern, such as accurately measuring the surface of a highway active with traffic or scanning the underside of a bridge damaged by a truck. It is recommended that the Iowa Department of Transportation initially rent the scanner when it is needed and purchase the software. With time, it may become cost-justifiable to purchase the scanner as well. Laser scanning consultants can also be hired, but at a higher cost.
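The stockpile and borrow-pit volume computations mentioned above reduce, once the point cloud has been gridded or triangulated, to summing prisms above a base plane. The toy elevation grid below is a hypothetical stand-in for scan data:

```python
# Toy version of the stockpile volume calculation: after the scanned
# point cloud is reduced to a regular elevation grid, the volume above
# a base plane is just cell area times height, summed.  The grid
# values are invented stand-ins for real scan data.

cell = 0.5                      # grid spacing, m (assumed)
base = 100.0                    # base-plane elevation, m (assumed)

# Elevation grid (m) as it might come from gridding a point cloud:
grid = [
    [100.0, 100.4, 100.6, 100.3],
    [100.2, 101.1, 101.5, 100.7],
    [100.1, 100.9, 101.2, 100.5],
    [100.0, 100.2, 100.4, 100.1],
]

volume = sum((z - base) * cell * cell
             for row in grid for z in row if z > base)
print(round(volume, 3))         # cubic metres above the base plane
```

A TIN-based computation replaces the square cells with triangles but follows the same prism-summation idea, which is why gridded or triangulated surfaces are the natural end product of the point-cloud processing the report describes.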