986 results for modeling tools


Relevance:

30.00%

Publisher:

Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that must be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data, but not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both the univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase came along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, the data partition was optimized in order to cope with the stationarity conditions of geostatistical models. Common spatial modeling methods such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-up approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) showed themselves to be better adapted to modeling high-threshold categorization and to automation. Support vector machines (SVM), on the contrary, performed well under balanced category conditions.
In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definitions. Simulations should be the basis, while other methods can provide complementary information to support efficient decision making on indoor radon.
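
As a hedged illustration of the clustering measure underlying the proposed QMI, the sketch below computes the classical Morisita index over a quadrat grid; the thesis's QMI extends this idea to clustering as a function of the radon level. The function name and the synthetic coordinates are assumptions for illustration, not the thesis's code or data.

```python
import numpy as np

def morisita_index(x, y, n_quadrats):
    """Classical Morisita index of dispersion on a regular quadrat grid.
    I = 1 for a random pattern, I > 1 for clustering, I < 1 for regularity."""
    counts, _, _ = np.histogram2d(x, y, bins=n_quadrats)
    n = counts.ravel()
    N, Q = n.sum(), n.size
    return Q * np.sum(n * (n - 1)) / (N * (N - 1))

# Synthetic example: a random pattern should give an index near 1. Repeating
# the computation on the subset of high-radon sites would reveal whether
# extreme values cluster more strongly than the sampling network itself.
rng = np.random.default_rng(0)
x, y = rng.uniform(0, 100, 500), rng.uniform(0, 100, 500)
print(morisita_index(x, y, n_quadrats=10))
```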

Relevance:

30.00%

Publisher:

Abstract:

The paper presents some contemporary approaches to spatial environmental data analysis. The main topics concentrate on the decision-oriented problems of environmental spatial data mining and modeling: valorization and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residuals sequential simulations (MLRSS). The models are based on multilayer perceptron and support vector regression ML algorithms used for modeling long-range spatial trends, combined with sequential simulations of the residuals. ML algorithms deliver non-linear solutions for spatially non-stationary problems, which are difficult for the geostatistical approach. Geostatistical tools (variography) are used to characterize the performance of the ML algorithms by analyzing the quality and quantity of the spatially structured information extracted from the data. Sequential simulations provide an efficient assessment of uncertainty and spatial variability. A case study on the Chernobyl fallout illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of data-driven ML and model-based geostatistical approaches, can be efficiently used in the decision-making process. (C) 2003 Elsevier Ltd. All rights reserved.
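
A minimal sketch of the two-step idea behind MLRSS, assuming scikit-learn for the ML part: an MLP models the long-range trend, and an empirical semivariogram of its residuals is used, as the paper describes, to check how much spatially structured information the ML algorithm extracted. The sequential simulation of the residuals is omitted and all data are synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def empirical_semivariogram(coords, values, lags, tol):
    """gamma(h) = 0.5 * mean((z_i - z_j)^2) over point pairs whose separation
    distance falls within tol of each lag h."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    dz2 = (values[:, None] - values[None, :]) ** 2
    upper = np.triu(np.ones_like(d, dtype=bool), 1)   # count each pair once
    return np.array([0.5 * dz2[(d > h - tol) & (d <= h + tol) & upper].mean()
                     for h in lags])

rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, (300, 2))
z = np.sin(coords[:, 0]) + 0.3 * rng.normal(size=300)     # synthetic field

# Step 1: non-linear long-range trend fitted by a multilayer perceptron.
trend = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=1)
residuals = z - trend.fit(coords, z).predict(coords)

# Step 2: variography of the residuals; a flat (pure nugget) curve suggests
# the ML model captured the spatially structured part of the variability.
print(empirical_semivariogram(coords, residuals,
                              lags=np.arange(0.5, 5, 0.5), tol=0.25))
```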

Relevance:

30.00%

Publisher:

Abstract:

The software development industry is constantly evolving. The rise of the agile methodologies in the late 1990s, and new development tools and technologies, require growing attention from everybody working within this industry. Organizations have, however, had a mixture of various processes and different process languages, since a standard software development process language has not been available. A promising process meta-model called the Software & Systems Process Engineering Meta-Model (SPEM) 2.0 has been released recently. It is applied by tools such as Eclipse Process Framework Composer, which is designed for implementing and maintaining processes and method content, and aims to support a broad variety of project types and development styles. This thesis presents the concepts of software processes, models, traditional and agile approaches, method engineering, and software process improvement. Some of the best-known methodologies (RUP, OpenUP, OpenMethod, XP and Scrum) are also introduced, with a comparison provided between them. The main focus is on the Eclipse Process Framework and SPEM 2.0: their capabilities, usage and modeling. As a proof of concept, I present a case study of modeling OpenMethod with EPF Composer and SPEM 2.0. The results show that the new meta-model and tool have made it possible to easily manage method content, publish versions with customized content, and connect project tools (such as MS Project) with the process content. The software process modeling also acts as a process improvement activity.

Relevance:

30.00%

Publisher:

Abstract:

The main objective of the study is to assess whether four software alternatives are adequate tools for production scheduling, and which of the tools suits the commissioning company. The sub-objectives are to describe the current and target states of production scheduling through process modeling, to determine the user needs for the tool, and to define prioritized selection criteria for the tool. The theoretical part of the study examines the logic and challenges of production scheduling. The selection of scheduling software is examined in parallel with process modeling, and the scheduling software alternatives and the methods for determining user needs are reviewed. The empirical part establishes the relation of the study to the commissioning company's strategy. User needs are determined through interviews and analyzed with a QFD matrix. The current and target-state processes of the commissioning company's production scheduling are modeled so that the suitability of each software package as a tool supporting the scheduling process can be assessed. The results of the study are prioritized selection criteria for the scheduling tool, i.e. the most important functional properties derived from the user needs, an assessment of the system vendors, and recommendations for further actions and further research.
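
As a hedged sketch of the QFD step described above, the snippet below propagates user-need weights from interviews through a QFD relationship matrix to rank candidate selection criteria; all weights and relationship strengths are invented for illustration.

```python
import numpy as np

# User-need weights (e.g. elicited in interviews), one per need.
need_weights = np.array([5, 3, 4])
# Relationship matrix: rows = user needs, columns = technical selection
# criteria; conventional QFD strengths: 9 strong, 3 moderate, 1 weak, 0 none.
relationships = np.array([
    [9, 3, 0, 1],
    [3, 9, 1, 0],
    [1, 0, 9, 3],
])
scores = need_weights @ relationships        # importance of each criterion
print(scores, np.argsort(scores)[::-1])      # prioritized selection criteria
```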

Relevance:

30.00%

Publisher:

Abstract:

A change in paradigm is needed in the prevention of toxic effects on the nervous system, moving from its present reliance solely on data from animal testing to a prediction model based mostly on in vitro toxicity testing and in silico modeling. According to the report published by the National Research Council (NRC) of the US National Academies of Science, high-throughput in vitro tests will provide evidence for alterations in "toxicity pathways" as the best possible method of large-scale toxicity prediction. The challenges of implementing this proposal are enormous and provide much room for debate. While many efforts address the technical aspects of implementing the vision, many questions around it also need to be addressed. Is the overall strategy the only one to be pursued? How can we move from the current paradigm to future ones? Will we ever be able to reliably model chronic and developmental neurotoxicity in vitro? This paper summarizes four presentations from a symposium at the International Neurotoxicology Conference held in Xi'an, China, in June 2011. A. Li reviewed the current guidelines for neurotoxicity and developmental neurotoxicity testing, and discussed the major challenges to realizing the NRC vision for toxicity testing. J. Llorens reviewed the biology of mammalian toxic avoidance in view of present knowledge on the physiology and molecular biology of the chemical senses, taste and smell. This background supports the hypothesis that relating in vivo toxicity to chemical epitope descriptors that mimic the chemical encoding performed by the olfactory system may provide a path toward a long-term future of complete in silico toxicity prediction. S. Ceccatelli reviewed the implementation of rodent and human neural stem cells (NSCs) as models for in vitro toxicity testing that measures parameters such as cell proliferation, differentiation and migration. These appear to be sensitive endpoints that can identify substances with developmental neurotoxic potential. C. Suñol reviewed the use of primary neuronal cultures in testing for the neurotoxicity of environmental pollutants, including the study of the effects of persistent exposures and/or effects in differentiating cells, which allow the recording of effects that can be extrapolated to human developmental neurotoxicity.

Relevance:

30.00%

Publisher:

Abstract:

Software development tools use information from the source code produced by the developer. This information is exploited in different phases of a software project and for different purposes. In modern software projects, the amount of information used can grow very large. Software tools have their own information models and access mechanisms. The amount of information, together with the separate tool information models, makes it very hard to build a flexible tool environment, especially for a domain-specific software development process. In this work, basic information metamodels of the Unified Modeling Language, the Python programming language and the C++ programming language are analyzed. The level of meta-information is restricted to the structural level; executable structures are left out. The ModelBase metamodel is composed from the analyzed existing metamodels. This metamodel can be used in the future for the development of software tools.
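
A speculative sketch, in Python dataclasses, of what a purely structural metamodel combining UML, Python and C++ elements could look like; the class names are assumptions for illustration, not the actual ModelBase definition.

```python
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    type_name: str

@dataclass
class Operation:          # signature only: executable structures are excluded
    name: str
    parameters: list[str] = field(default_factory=list)

@dataclass
class Classifier:         # one structural element usable across languages
    name: str
    language: str         # "UML", "Python" or "C++"
    attributes: list[Attribute] = field(default_factory=list)
    operations: list[Operation] = field(default_factory=list)
    bases: list["Classifier"] = field(default_factory=list)

# The same structure can describe a C++ class, a Python class or a UML class:
point = Classifier("Point", "C++",
                   attributes=[Attribute("x", "double"), Attribute("y", "double")],
                   operations=[Operation("norm")])
```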

Relevance:

30.00%

Publisher:

Abstract:

Severe combined immunodeficiency (SCID) and other severe non-SCID primary immunodeficiencies (non-SCID PID) can be treated by allogeneic hematopoietic stem cell (HSC) transplantation, but when histocompatibility leukocyte antigen-matched donors are lacking, this can be a high-risk procedure. Correcting the patient's own HSCs with gene therapy offers an attractive alternative. Gene therapies currently being used in clinical settings insert a functional copy of the entire gene by means of a viral vector. With this treatment, severe complications may result due to integration within oncogenes. A promising alternative is the use of endonucleases such as ZFNs, TALENs, and CRISPR/Cas9 to introduce a double-stranded break in the DNA and thus induce homology-directed repair. With these genome-editing tools a correct copy can be inserted in a precisely targeted "safe harbor." They can also be used to correct pathogenic mutations in situ and to develop cellular or animal models needed to study the pathogenic effects of specific genetic defects found in immunodeficient patients. This review discusses the advantages and disadvantages of these endonucleases in gene correction and modeling with an emphasis on CRISPR/Cas9, which offers the most promise due to its efficacy and versatility.

Relevance:

30.00%

Publisher:

Abstract:

Over the last decades, calibration techniques have been widely used to improve the accuracy of robots and machine tools, since they only involve software modification instead of changes to the design and manufacture of the hardware. Traditionally, four steps are required for a calibration: error modeling, measurement, parameter identification and compensation. The objective of this thesis is to propose a method for the kinematics analysis and error modeling of a newly developed hybrid redundant robot, the IWR (Intersector Welding Robot), which possesses ten degrees of freedom (DOF): 6 DOF in parallel and an additional 4 DOF in serial. In this work, the problems of kinematics modeling and error modeling of the proposed IWR robot are discussed. Based on the vector arithmetic method, the kinematics model and the sensitivity model of the end-effector with respect to the structural parameters are derived and analyzed. The relations between the pose (position and orientation) accuracy and the manufacturing tolerances, actuation errors and connection errors are formulated. Computer simulation is performed to examine the validity and effectiveness of the proposed method.
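
A hedged sketch of the sensitivity analysis described above, with a toy planar kinematics function standing in for the IWR model (the thesis derives the real model with the vector arithmetic method): a numerical Jacobian maps errors in the structural parameters to first-order pose errors.

```python
import numpy as np

def end_effector_pose(params):
    """Toy 2-link forward kinematics: structural parameters (link lengths l1,
    l2 and joint variables q1, q2) to a pose vector (x, y, orientation)."""
    l1, l2, q1, q2 = params
    return np.array([l1 * np.cos(q1) + l2 * np.cos(q1 + q2),
                     l1 * np.sin(q1) + l2 * np.sin(q1 + q2),
                     q1 + q2])

def sensitivity(params, eps=1e-6):
    """Numerical Jacobian d(pose)/d(params); column i shows the pose deviation
    per unit error in structural parameter i."""
    p0 = end_effector_pose(params)
    J = np.empty((p0.size, len(params)))
    for i in range(len(params)):
        dp = np.array(params, dtype=float)
        dp[i] += eps
        J[:, i] = (end_effector_pose(dp) - p0) / eps
    return J

J = sensitivity([0.5, 0.4, 0.3, 0.6])
tolerances = np.array([1e-3, 1e-3, 2e-3, 2e-3])  # manufacturing/actuation errors
print(J @ tolerances)                            # first-order pose error estimate
```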

Relevance:

30.00%

Publisher:

Abstract:

The paper presents a study aimed at building a knowledge model for a case company, the business incubator "Ingria" (St. Petersburg, Russia). The business incubator is a one-of-its-kind organization in St. Petersburg, and one of the few in Russia, providing services for innovative entrepreneurial companies at an international level. The impact of business incubation is researched in depth from the point of view of knowledge engineering. The paper also provides a broad analysis of the various knowledge engineering tools used for the visualization of knowledge, as well as of knowledge modeling techniques.

Relevance:

30.00%

Publisher:

Abstract:

Linear programming models are effective tools to support the initial or periodic planning of agricultural enterprises; they require, however, technical coefficients that can be determined using computer simulation models. This paper, presented in two parts, deals with the development, application and testing of a methodology and of a computational modeling tool to support the planning of irrigated agriculture activities. Part I aimed at the development and application, including sensitivity analysis, of a multiyear linear programming model to optimize the financial return and water use at farm level for the Jaíba irrigation scheme, Minas Gerais State, Brazil, using data on crop irrigation requirements and yield obtained from previous simulation with the MCID model. The linear programming model output a crop pattern for which a maximum total net present value of R$ 372,723.00 was obtained for the four-year period. Constraints on monthly water availability, labor, land and production were critical in the optimal solution. Regarding the optimization of water use, it was verified that expressive reductions in the irrigation requirements may be achieved with small reductions in the maximum total net present value.
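
A minimal single-season sketch of such a model using scipy.optimize.linprog; all coefficients are invented for illustration, whereas the thesis model is multiyear, with monthly water, labor, land and production constraints and MCID-simulated crop data.

```python
import numpy as np
from scipy.optimize import linprog

net_return = np.array([3000.0, 4500.0, 2500.0])   # R$/ha for three crops
water_use  = np.array([4000.0, 6500.0, 3000.0])   # m3/ha per season

# linprog minimizes, so the returns are negated to maximize net return.
res = linprog(c=-net_return,
              A_ub=[water_use, np.ones(3)],       # seasonal water, total land
              b_ub=[500_000.0, 100.0],            # m3 available, ha available
              bounds=[(0.0, 60.0)] * 3)           # per-crop area limits (ha)
print(res.x, -res.fun)                            # crop areas, total net return
```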

Relevance:

30.00%

Publisher:

Abstract:

The interaction between the soil and the tillage tool can be examined using different parameters for the soil and for the tool. Among the soil parameters are the shear stress, cohesion, internal friction angle of the soil and the pre-compression stress. The tool parameters are mainly the tool geometry and the depth of operation. For the soils of Rio Grande do Sul, there are hardly any studies or evaluations of the parameters that matter in the use of mathematical models to predict tractive loads. The objective was to obtain parameters related to the soils of Rio Grande do Sul which are used in soil-tool analysis, more specifically in mathematical models that allow the calculation of the tractive effort for symmetric and narrow tools. Two of the main soils of Rio Grande do Sul, an Albaqualf and a Paleudult, were studied. Equations that relate the cohesion, internal friction angle of the soil, adhesion, soil-tool friction angle and pre-compression stress as functions of the water content in the soil were obtained, providing important information for the use of mathematical models for tractive effort calculation.
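
As an illustration of the kind of fitted relation reported above, the sketch below fits cohesion as a linear function of water content; the data points are synthetic, not the measured Albaqualf or Paleudult values.

```python
import numpy as np

water_content = np.array([0.10, 0.15, 0.20, 0.25, 0.30])   # kg/kg
cohesion      = np.array([38.0, 31.0, 22.0, 15.0, 11.0])   # kPa (synthetic)

slope, intercept = np.polyfit(water_content, cohesion, 1)  # c(w) ~ a + b*w
print(f"cohesion(w) = {intercept:.1f} + {slope:.1f} * w  [kPa]")

# The fitted relation supplies the cohesion input of a tractive-effort model
# at whatever water content is observed in the field:
print(intercept + slope * 0.18)
```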

Relevance:

30.00%

Publisher:

Abstract:

Simulation programs are important tools to analyze different energy alternatives, including the use of renewable energy. The objective of this study was to comparatively analyze the different computer tools available for the modeling of solar water heaters. Among the main simulation software packages for solar thermal systems are RETScreen International, EnergyPlus, TRNSYS, SolDesigner, SolarPro and T*SOL. Among the tools mentioned, only EnergyPlus and RETScreen International are free, and they allow interesting results to be obtained when applied together: the first has a detailed module for the energy analysis of solar water heaters, while the second provides a detailed economic feasibility study and an assessment of greenhouse gas emissions. The RETScreen International and EnergyPlus programs are aimed at a diverse audience, including designers, researchers and energy planners.
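
A back-of-the-envelope sketch of the combined analysis the two free tools enable: an annual energy estimate (the role EnergyPlus plays) feeding an economic and emissions assessment (the role RETScreen International plays). Every figure here is an illustrative assumption.

```python
collector_area = 4.0          # m2
annual_irradiation = 1800.0   # kWh/m2/year on the collector plane
system_efficiency = 0.40      # fraction delivered as useful hot water
system_cost = 6000.0          # R$, installed
electricity_price = 0.80      # R$/kWh displaced from an electric heater
emission_factor = 0.10        # kg CO2 per kWh of grid electricity

energy_saved = collector_area * annual_irradiation * system_efficiency
annual_savings = energy_saved * electricity_price
print(f"{energy_saved:.0f} kWh/yr, R$ {annual_savings:.0f}/yr, "
      f"{energy_saved * emission_factor:.0f} kg CO2/yr avoided, "
      f"payback {system_cost / annual_savings:.1f} yr")
```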

Relevance:

30.00%

Publisher:

Abstract:

The last decade has shown that the global paper industry needs new processes and products in order to reassert its position in the industry. As the paper markets in Western Europe and North America have stabilized, competition has tightened. Along with the development of more cost-effective processes and products, new process design methods are also required to break the old molds and create new ideas. This thesis discusses the development of a process design methodology based on simulation and optimization methods. A bi-level optimization problem and a solution procedure for it are formulated and illustrated. Computational models and simulation are used to describe the phenomena inside a real process, and mathematical optimization is exploited to find the best process structures and control principles for the process. Dynamic process models are used inside the bi-level optimization problem, which is assumed to be dynamic and multiobjective owing to the nature of papermaking processes. The numerical experiments show that the bi-level optimization approach is useful for different kinds of problems related to process design and optimization. Here, the design methodology is applied to a constrained process area of a papermaking line. However, the same methodology is applicable to all types of industrial processes, e.g., the design of biorefineries, because the methodology is fully generalized and can easily be modified.
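
A minimal sketch of the bi-level structure, with a toy cost function standing in for the dynamic, multiobjective papermaking models used in the thesis: the outer level enumerates discrete process structures and the inner level optimizes the continuous controls of each.

```python
import numpy as np
from scipy.optimize import minimize

def process_cost(controls, structure):
    """Stand-in for a simulated dynamic process: operating cost of a given
    structure under the given control settings."""
    offset = {"A": 1.0, "B": 0.3}[structure]
    return np.sum((controls - offset) ** 2) + 2.0 * offset

def inner_level(structure):
    """Best achievable cost for one structure (continuous control problem)."""
    res = minimize(process_cost, x0=np.zeros(3), args=(structure,),
                   bounds=[(-1.0, 1.0)] * 3)
    return res.fun, res.x

# Outer level: choose the structure whose optimally controlled cost is lowest.
best = min(("A", "B"), key=lambda s: inner_level(s)[0])
print(best, inner_level(best))
```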

Relevance:

30.00%

Publisher:

Abstract:

Formal software development processes and well-defined development methodologies are nowadays seen as the definite way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need to manage this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created the profession of software process engineers. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. These are used to define processes in a way that allows easy management of processes, for example process dissemination, process tailoring and process enactment. The process modeling languages are usually used as a tool for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles which represent the dissertation research done on process modeling over an approximately five-year period. The research follows the classical engineering research discipline, where the current situation is analyzed, a potentially better solution is developed and, finally, its implications are analyzed. The research applies a variety of research techniques, ranging from literature surveys to qualitative studies done amongst software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between the process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, like lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique which software development teams can use to quickly analyze their work practices in a more objective manner. It also shows how process modeling can be used to compare different software development situations more easily and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study done amongst Finnish software practitioners verifies the conclusions of the other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work.
However, the potential of these techniques intrigues the practitioners. In conclusion, the dissertation shows that process modeling techniques, most commonly used as tools by process engineers, can also be used as tools for organizing daily software development work. This work presents theoretical solutions for bringing process modeling closer to ground-level software development activities. These theories are shown to be feasible through several case studies where the modeling techniques are used, e.g., to find differences in the work methods of the members of a software team and to share process knowledge with a wider audience.

Relevance:

30.00%

Publisher:

Abstract:

This thesis investigated building information modeling (BIM) from a material supplier's point of view. The objective was to gain an understanding of how a building material supplier could benefit from the growing use of BIM in the AEC (architectural, engineering and construction) industry. An increasing number of inquiries related to BIM from customers and other interest groups had awakened the target company's interest in BIM; this thesis acts as a pre-study for the target company on the potential of BIM. First of all, BIM and its meaning from a material supplier's point of view were defined based on a literature review. To reveal the potential benefits of BIM for a material supplier, a questionnaire survey and a total of 11 interviews were conducted. Based on the literature review and the analyzed results, it became clear that BIM offers benefits for material suppliers as well. Product libraries and material databases for BIM tools can act as an important marketing channel for material suppliers. Material suppliers could also utilize the information in BIM models to schedule their deliveries more precisely, and potentially even to schedule their own production. All this requires deeper cooperation between material suppliers, contractors and other stakeholders in the AEC industry. Based on the results, first steps for the target company to utilize the growing use of BIM were also defined.