894 results for Testing and Debugging
Abstract:
J.A. Ferreira Neto, E.C. Santos Junior, U. Fra Paleo, D. Miranda Barros, and M.C.O. Moreira. 2011. Optimal subdivision of land in agrarian reform projects: an analysis using genetic algorithms. Cien. Inv. Agr. 38(2): 169-178. The objective of this manuscript is to develop a new procedure to achieve optimal land subdivision using genetic algorithms (GA). The genetic algorithm was tested in the rural settlement of Veredas, located in Minas Gerais, Brazil. The implementation was based on land aptitude and its productivity index. The sequence of tests was carried out in two areas with eight different agricultural aptitude classes: one area of 391.88 ha subdivided into 12 lots and another of 404.1763 ha subdivided into 14 lots. The effectiveness of the method was measured using the standard deviation of the parceled lots' productivity indices. To evaluate each parameter, a sequence of 15 runs was performed to record the average fitness of the best individual (MMI) found for each parameter variation. The best parameter combination found in testing, and used to generate the new parceling with the GA, was the following: 320 generations, a population of 40 individuals, a mutation rate of 0.8, and a renewal rate of 0.3. The solution generated rather homogeneous lots in terms of productive capacity.
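For illustration only, a minimal sketch of the kind of GA the abstract describes, using the reported parameter values (320 generations, population of 40, mutation rate 0.8, renewal rate 0.3). The land data, the fitness definition (standard deviation of lot productivity totals), and the way the rates are applied are assumptions, not the authors' implementation.

```python
import random
import statistics

# Toy setup (hypothetical): 100 land units, each with an aptitude-based
# productivity index, to be grouped into 12 lots.
random.seed(1)
N_UNITS, N_LOTS = 100, 12
productivity = [random.uniform(0.5, 1.5) for _ in range(N_UNITS)]

# GA parameters as reported in the abstract.
GENERATIONS, POP_SIZE = 320, 40
MUTATION_RATE, RENEWAL_RATE = 0.8, 0.3

def random_individual():
    # An individual assigns every land unit to one of the lots.
    return [random.randrange(N_LOTS) for _ in range(N_UNITS)]

def fitness(ind):
    # Lower is better: standard deviation of the lots' total productivity,
    # i.e. a proxy for how homogeneous the resulting lots are.
    totals = [0.0] * N_LOTS
    for unit, lot in enumerate(ind):
        totals[lot] += productivity[unit]
    return statistics.pstdev(totals)

def mutate(ind):
    # Assumed interpretation: with the reported probability, reassign one unit.
    if random.random() < MUTATION_RATE:
        ind[random.randrange(N_UNITS)] = random.randrange(N_LOTS)
    return ind

def crossover(a, b):
    cut = random.randrange(1, N_UNITS)
    return a[:cut] + b[cut:]

population = [random_individual() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness)
    keep = int(POP_SIZE * (1 - RENEWAL_RATE))   # survivors; the rest are renewed
    parents = population[:keep]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - keep)]
    population = parents + children

best = min(population, key=fitness)
print("spread of lot productivity:", round(fitness(best), 4))
```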
Abstract:
In this paper, we discuss inferential aspects of the Grubbs model when the unknown quantity x (latent response) follows a skew-normal distribution, extending earlier results given in Arellano-Valle et al. (J Multivar Anal 96:265-281, 2005b). Maximum likelihood parameter estimates are computed via the EM algorithm. Wald and likelihood ratio type statistics are used for hypothesis testing, and we explain the apparent failure of the Wald statistic in detecting skewness via the profile likelihood function. The results and methods developed in this paper are illustrated with a numerical example.
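For reference, the likelihood ratio statistic mentioned here has the standard generic form below (a textbook statement, not the paper's specific derivation), where the null hypothesis, e.g. of no skewness, imposes q restrictions:

```latex
% Generic likelihood ratio statistic; \hat\theta is the unrestricted MLE,
% \tilde\theta_0 the MLE under the null hypothesis, q the number of
% restrictions being tested.
\[
  \mathrm{LR} \;=\; 2\bigl\{\ell(\hat{\theta}) - \ell(\tilde{\theta}_{0})\bigr\}
  \;\xrightarrow{d}\; \chi^{2}_{q} \quad \text{under } H_{0}.
\]
```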
Abstract:
The purpose of this study was to examine the extent to which incontinence aids used in local authority/municipal nursing homes were adapted to the residents' urine-leakage volume, and to find out how nurses perceived the current situation concerning individual testing of incontinence aids in municipal nursing homes. The study was a quantitative empirical study carried out in two phases. The first phase was a weighing test, carried out in three nursing homes, in which the incontinence aids used by 25 residents during a 48-hour period were weighed. The second phase was the completion of a questionnaire by the municipal nurses working in the same local authority. The questionnaire covered the division of responsibilities, routines for testing incontinence aids, and the level of knowledge concerning individual incontinence aid testing. Only 22 % of the pads used during the observation were properly adapted to the patients' urinary leakage volume, while 76 % of the incontinence aids were larger than necessary in relation to the real volume of urinary leakage. The municipal nurses, who have a key role and formal responsibility for individual incontinence aid testing, considered that there was insufficient knowledge within the organisation concerning individual incontinence aid testing, and that the division of responsibilities in this area was unclear. There were great variations in the extent of the nurses' involvement in individual incontinence aid testing, and the nurses stated that increased involvement was dependent on more time being made available for this task. Only a minority of the nurses thought that the requisitioning of incontinence aids was preceded by individual testing of the pads within the organisation; the majority considered that this was not the case or were unsure of the situation.
Abstract:
In this project, two broad facets in the design of a methodology for performance optimization of indexable carbide inserts were examined: physical destructive testing and software simulation. For the physical testing, statistical research techniques were used to design the methodology. A five-step method, beginning with problem definition and continuing through system identification, statistical model formulation, data collection, and statistical analysis of results, was elaborated upon in depth. The set-up and execution of an experiment with a compression machine were examined, together with roadblocks to quality data collection and possible solutions to them. The 2^k factorial design was illustrated and recommended for process improvement. Instances of first-order and second-order response surface analyses were encountered. In the case of curvature, a test for curvature significance using center-point analysis was recommended. Process optimization with the method of steepest ascent and central composite designs, or process robustness studies based on response surface analyses, were also recommended. For the simulation test, the AdvantEdge program was identified as the most widely used software for tool development. Challenges to the efficient application of this software were identified and possible solutions proposed. In conclusion, both software simulation and physical testing were recommended to meet the objective of the project.
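As a rough illustration of the recommended 2^k factorial analysis, here is a minimal sketch for a 2^2 design with two hypothetical factors and made-up responses (not data from the thesis), showing how main effects and the interaction are estimated from coded factor levels.

```python
from itertools import product

# Hypothetical 2^2 factorial design: two factors (say, feed rate and cutting
# speed) coded at low (-1) and high (+1) levels, one response per run.
runs = list(product([-1, 1], repeat=2))             # all (A, B) settings
response = {(-1, -1): 54.3, (1, -1): 61.7,           # invented response values
            (-1, 1): 57.2, (1, 1): 68.9}

def main_effect(factor_index):
    # Main effect: mean response at the high level minus mean at the low level.
    high = [response[r] for r in runs if r[factor_index] == 1]
    low = [response[r] for r in runs if r[factor_index] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

def interaction():
    # AB interaction: contrast weighted by the product of the coded levels,
    # divided by 2^(k-1) for k = 2 factors with a single replicate.
    return sum(a * b * response[(a, b)] for a, b in runs) / 2

print("main effect A :", main_effect(0))
print("main effect B :", main_effect(1))
print("interaction AB:", interaction())
```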
Abstract:
Over the last decade, the problem of surface inspection has received great attention from the scientific community; quality control and the maintenance of products are key concerns in several industrial applications. Railway associations spend considerable money checking the railway infrastructure, a field in which periodic surface inspection can help the operator prevent critical situations, and the maintenance and monitoring of this infrastructure is an important concern for them. Surface inspection therefore also matters to the railroad authority for investigating track components, identifying problems and finding out how to solve them. In the railway industry, problems are usually found in sleepers, overhead lines, fasteners, rail heads, switches and crossings, and in the ballast section. In this thesis work, I have reviewed research papers on AI techniques combined with NDT techniques, which can collect data from the test object without causing any damage. The reviewed work demonstrates that, by adopting AI-based systems, it is possible to address nearly all of these problems, and that such systems are reliable and efficient for diagnosing problems in this transportation domain. I have also reviewed solutions, products and white papers provided by different companies based on AI techniques. AI-based techniques such as machine vision, stereo vision, laser-based techniques and neural networks are used in most cases to solve problems that are otherwise handled by railway engineers. These problems are addressed through the NDT approach, a very broad, interdisciplinary field that plays a critical role in assuring that structural components and systems perform their function in a reliable and cost-effective fashion. The NDT approach ensures the uniformity, quality and serviceability of materials without causing any damage to the material being tested. Its testing methods include visual and optical testing, radiography, magnetic particle testing, ultrasonic testing, penetrant testing, electromechanical testing and acoustic emission testing. The inspection procedure is carried out periodically for better maintenance and is performed manually by railway engineers with the aid of AI-based techniques. The main idea of this thesis work is to demonstrate how problems in this transportation area can be reduced, based on the work done by different researchers and companies; I also provide ideas and comments on that work and propose better inspection methods where they are needed. The scope of this thesis is the automatic interpretation of data from NDT, with the goal of detecting flaws accurately and efficiently; AI techniques such as neural networks, machine vision, knowledge-based systems and fuzzy logic have been applied to a wide spectrum of problems in this area. A further aim is to provide insight into possible research methods concerning railway sleeper, fastener, ballast and overhead inspection through automatic interpretation of data. In this thesis work, I discuss problems that arise in railway sleepers, fasteners, overhead lines and ballasted track.
For this reason, I have reviewed research papers related to these areas and demonstrated how the proposed systems work and what results they achieve, followed by a discussion of the advantages of using AI techniques compared with the manual systems that existed previously. This work thus summarizes the findings of a large number of research papers deploying artificial intelligence (AI) techniques for the automatic interpretation of data from nondestructive testing (NDT). Problems in the rail transport domain are the main focus, and the work is devoted to the inspection of railway sleepers, fasteners, ballast and overhead lines.
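As a toy illustration of the AI-based interpretation of NDT data discussed above, the sketch below trains a small neural network on hypothetical feature vectors labelled intact/defective. It is not any reviewed company's system; the features and data are invented.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Hypothetical NDT feature vectors (e.g. ultrasonic echo amplitude, signal
# energy, peak frequency) labelled as intact (0) or defective (1) sleepers.
rng = np.random.default_rng(0)
intact = rng.normal(loc=[1.0, 0.5, 2.0], scale=0.2, size=(200, 3))
defect = rng.normal(loc=[0.6, 0.9, 1.4], scale=0.2, size=(200, 3))
X = np.vstack([intact, defect])
y = np.array([0] * 200 + [1] * 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# A small feed-forward neural network, the kind of AI model the reviewed
# papers apply to automatic interpretation of NDT data.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```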
Abstract:
The purpose of this thesis is to show how to use vulnerability testing to identify and search for security flaws in computer networks. The goal is partly to give a general description of different types of vulnerability-testing methods and partly to present the method and results of a vulnerability test. A document containing the results of the vulnerability test, together with solutions to the high-risk vulnerabilities found, will be handed over. The goal is also to carry out and present this work in the form of a scholarly work. The problem was to show how to perform vulnerability tests and identify vulnerabilities in the organization's network and systems. Programs would be run under controlled circumstances, in a way that did not burden the network. Vulnerability tests were conducted sequentially, since data from the survey was needed to continue the scanning. A survey of the network was carried out, and data such as operating systems, among other things, were collected in tables. A number of systems were selected from the tables and scanned with Nessus. The result was a table describing the network and a table of the vulnerabilities found. The table of vulnerabilities has helped the organization to prevent these vulnerabilities by updating the affected computers. In addition, a wireless network using WEP encryption, which is insecure, was detected and decrypted.
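Purely as an illustration of the survey phase described above (not the actual Nessus-based procedure used in the thesis), a minimal sketch that checks a few common TCP ports on a host you are authorized to test; the port list and host are placeholders.

```python
import socket

# Common TCP ports to probe; a real survey tool covers far more services.
COMMON_PORTS = {21: "ftp", 22: "ssh", 80: "http", 443: "https", 3389: "rdp"}

def survey(host: str, timeout: float = 0.5) -> dict:
    """Return the subset of COMMON_PORTS that accept a TCP connection."""
    open_ports = {}
    for port, service in COMMON_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:   # 0 means the connect succeeded
                open_ports[port] = service
    return open_ports

if __name__ == "__main__":
    # Only scan hosts you own or have explicit permission to test.
    print(survey("127.0.0.1"))
```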
Abstract:
At the beginning of 2003, the four-year research project REBUS on education, research, development and demonstration of competitive solar combisystems was launched. Research groups in Norway, Denmark, Sweden and Latvia are working together with partners from industry on innovative solutions for solar heating in the Nordic countries. Existing system concepts have been analyzed and, based on the results, new system designs have been developed. The proposed solutions have to fulfill country-specific technical, sociological and cost requirements. Due to the similar demands on the systems in Denmark and Sweden, it was decided to develop a common system concept for both countries, which increases the market potential for the manufacturer. The focus of the development is on systems for the large number of rather well insulated existing single-family houses. In close collaboration with the industrial partners, a system concept has been developed that is characterized by its high compactness and flexibility. It allows the use of different types of boilers and heating distribution systems as well as variable store and collector sizes. Two prototypes have been built: one for the Danish market with a gas boiler, and one for the Swedish market with a pellet boiler as auxiliary heater. After intensive testing and any further improvements, at least two systems will be installed and monitored in demonstration houses. The systems have been modeled in TRNSYS, and the simulation results will be used to further improve the system and evaluate the system performance.
Abstract:
The report examines the factors that may contribute to the problems that arise when ferritic stainless steels are eddy-current tested in a warm condition. The work was carried out at Fagersta Stainless AB in Fagersta, which manufactures stainless steel wire. In the rolling mill there is eddy current equipment for detecting surface defects on the wire. The ferritic stainless steels produce noise during testing, and this noise complicates the detection of defects. Because of this, a study was made of how the noise relates to factors such as steel grade, temperature, size and velocity. By observing the signal, and with the possibility of changing the equipment settings, the capability of a signal filter to reduce the noise level was evaluated. Theories about the material's physical properties have also been included, mainly the magnetic properties, the electrical conductivity and the material's tendency to oxidize. Results from the tests show that a number of factors do not affect the inductive test significantly, and using a filter to reduce the noise level does not seem to be a viable option. The noise level does not relate to the presence of superficial particles in the form of oxides. The ferritic stainless steels showed some differences in noise level, and the observed noise levels matched well with each steel's likelihood of precipitating a second phase; precipitation of austenite may in this case contribute to noise when using an eddy current instrument. The noise is probably due to some physical material property that varies within the wire.
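For context, the kind of signal filtering whose usefulness was evaluated (and found limited) might look roughly like the sketch below: a band-pass filter applied to an eddy current trace. The sampling rate, pass band and synthetic signal are assumptions, not the mill's actual settings.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# Hypothetical eddy current trace: a defect-like pulse buried in broadband
# material noise, sampled at 100 kHz.
fs = 100_000
t = np.arange(0, 0.02, 1 / fs)
rng = np.random.default_rng(3)
clean = np.exp(-((t - 0.01) / 0.0003) ** 2) * np.sin(2 * np.pi * 5_000 * t)
noisy = clean + 0.4 * rng.standard_normal(t.size)

# Band-pass filter around an assumed defect-response band of 2-10 kHz.
sos = butter(4, [2_000, 10_000], btype="bandpass", fs=fs, output="sos")
filtered = sosfiltfilt(sos, noisy)

print("signal std before filtering:", round(float(noisy.std()), 3),
      "after:", round(float(filtered.std()), 3))
```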
Abstract:
This thesis consists of four manuscripts in the area of nonlinear time series econometrics, on topics of testing, modeling and forecasting nonlinear common features. The aim of this thesis is to develop new econometric contributions for hypothesis testing and forecasting in these areas. Both stationary and nonstationary time series are considered, and a definition of common features is proposed in a way appropriate to each class. Based on this definition, a vector nonlinear time series model with common features is set up for testing for common features; once well specified, the proposed models are also available for forecasting. The first paper addresses a testing procedure for nonstationary time series. A class of nonlinear cointegration, smooth-transition (ST) cointegration, is examined. ST cointegration nests the previously developed linear and threshold cointegration. An F-type test for examining ST cointegration is derived when stationary transition variables are imposed rather than nonstationary ones. The latter make the test standard, while the former make the test nonstandard. This has important implications for empirical work: it is crucial to distinguish between the cases with stationary and nonstationary transition variables so that the correct test can be used. The second and fourth papers develop testing approaches for stationary time series. In particular, the vector ST autoregressive (VSTAR) model is extended to allow for common nonlinear features (CNFs). These two papers propose a modeling procedure and derive tests for the presence of CNFs. Building on model specification using the testing contributions above, the third paper considers forecasting with vector nonlinear time series models and extends the procedures available for univariate nonlinear models. The VSTAR model with CNFs and the ST cointegration model from the previous papers are exemplified in detail and thereafter illustrated with two corresponding macroeconomic data sets.
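For reference, a standard logistic smooth-transition (STAR-type) specification of the kind extended in the thesis, written in generic textbook notation rather than the thesis's own:

```latex
% Generic logistic STAR form: x_t collects lags of y_t (and possibly exogenous
% regressors), s_t is the transition variable, gamma > 0 the transition speed,
% c the location parameter. Linear and threshold models arise as limiting
% cases (gamma -> 0 and gamma -> infinity, respectively).
\[
  y_t = \phi_1' x_t \,\bigl(1 - G(s_t;\gamma,c)\bigr)
        + \phi_2' x_t \, G(s_t;\gamma,c) + \varepsilon_t,
  \qquad
  G(s_t;\gamma,c) = \bigl(1 + \exp\{-\gamma (s_t - c)\}\bigr)^{-1}.
\]
```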