945 results for Mathematical Techniques - Integration


Relevance:

80.00%

Publisher:

Abstract:

There is now a wide variety of panoramic lenses on the market, some of which exhibit astonishing characteristics. Belonging to this latter category, Panomorph lenses are anamorphic panoramic lenses with a strongly non-uniform distortion profile, which produces zones of increased magnification within the field of view. In a mobile robotics context, these particularities can be exploited in stereoscopic systems for the 3D reconstruction of objects of interest, providing both good knowledge of the environment and access to finer details thanks to the zones of increased magnification. However, because of their complexity, these lenses are difficult to calibrate and, to our knowledge, no study has really addressed this issue. The main objective of this thesis is the design, development, and performance evaluation of Panomorph stereoscopic systems. Calibration was carried out with an established technique using planar targets and a widely used calibration toolbox. In addition, new mathematical techniques aimed at restoring rotational symmetry in the image (circle) and at making the focal length uniform (uniform circle) were developed to see whether they could make calibration easier. First, the field of view was divided into zones within which the instantaneous focal length varies little, and calibration was performed for each zone. Then, a general calibration of the systems was also carried out over the whole field of view simultaneously. The results showed that zone-based calibration produces no significant gain in the quality of 3D reconstructions of objects of interest compared with general calibration. However, the study of this new approach made it possible to evaluate the performance of Panomorph stereoscopic systems over the whole field of view and to show that quality 3D reconstructions can be obtained in every zone. Moreover, the circle mathematical technique produced 3D reconstruction results generally equivalent to using the original coordinates. Since some calibration tools, unlike the one used in this work, offer only a single degree of freedom for the focal length, this technique could make it possible to calibrate Panomorph lenses with them. Finally, some conclusions could be drawn regarding the decisive factors influencing the quality of 3D reconstruction with Panomorph stereoscopic systems and the characteristics to favour when choosing the lenses. The difficulty of calibrating Panomorph optics in the laboratory led to the development of a virtual calibration technique using optical design software and a calibration toolbox. This approach made it possible to run simulations on the impact of operating conditions on the calibration parameters and on the effect of calibration conditions on reconstruction quality. Experiments of this kind are practically impossible to carry out in the laboratory but are of definite interest to users.
The virtual calibration of a traditional lens also showed that the mean reprojection error, commonly used as a way of assessing the quality of a calibration, is not necessarily a reliable indicator of 3D reconstruction quality. Additional data are therefore needed to judge the quality of a calibration adequately.
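
As a rough illustration of the symmetry-restoring "circle" technique mentioned above, the sketch below maps points from an elliptical (anamorphic) image footprint onto a circular one by rescaling each axis independently. The function name, the axis-aligned-ellipse assumption, and all numbers are illustrative, not taken from the thesis.

```python
import numpy as np

def to_circle(points, cx, cy, a, b, radius=1.0):
    """Map image points from an elliptical (anamorphic) footprint onto a
    circular one by rescaling each axis independently.

    points : (N, 2) array of pixel coordinates
    cx, cy : distortion centre of the lens image (pixels)
    a, b   : semi-axes of the elliptical image footprint (pixels)
    radius : radius of the target circular footprint
    """
    pts = np.asarray(points, dtype=float)
    x = (pts[:, 0] - cx) / a * radius
    y = (pts[:, 1] - cy) / b * radius
    return np.column_stack([x, y])

# Example: an elliptical footprint (a=800, b=600) centred at (960, 540).
pts = np.array([[1760.0, 540.0], [960.0, 1140.0]])
print(to_circle(pts, 960, 540, 800, 600))  # both land on the unit circle
```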

Relevance:

80.00%

Publisher:

Abstract:

Increasingly, the main objectives in industry are production at low cost, with maximum quality, and with the shortest possible manufacturing time. To achieve this goal, industry frequently resorts to computer numerical control (CNC) machines, since this technology makes it possible to achieve high precision and shorter processing times. CNC machine tools can be applied in different machining processes, such as turning, milling, and drilling, among others. Of all these processes, milling is the most widely used because of its versatility. This process is normally used to machine metallic materials such as steel and cast iron. In this work, the effects of varying four parameters of the milling process (cutting speed, feed rate, radial depth of cut, and axial depth of cut), individually and through the interaction between some of them, on the roughness of a hardened steel (steel 12738) are analysed. Two optimisation methods are used for this analysis: the Taguchi method and the response surface method. The first, known as the Taguchi method, was used to reduce the number of possible combinations and, consequently, the number of tests to be performed. The response surface methodology (RSM) was used in order to compare the results obtained with the Taguchi method; according to some works in the specialised literature, RSM converges more quickly to an optimum value. The Taguchi method is well known in the industrial sector, where it is used for quality control. It offers interesting concepts, such as robustness and quality loss, and is very useful for identifying variations in the production system during the industrial process, quantifying the variation and allowing undesirable factors to be eliminated. With this method, an L16 orthogonal array was built, two levels were defined for each parameter, and sixteen tests were performed. After each test, the surface roughness of the part was measured. Based on the roughness measurements, a statistical treatment of the data was carried out through analysis of variance (ANOVA) in order to determine the influence of each parameter on the surface roughness. The minimum measured roughness was 1.05 μm. This study also determined the contribution of each machining parameter and their interaction. The analysis of the F-ratio values (ANOVA) reveals that the most important factors for minimising surface roughness are the radial depth of cut and the interaction between the radial and axial depths of cut, with contributions of about 30% and 24%, respectively. In a second stage, the same study was carried out using the response surface method, in order to compare the results of the two methods and determine the better optimisation method for minimising roughness. Response surface methodology is based on a set of mathematical and statistical techniques useful for modelling and analysing problems in which the response of interest is influenced by several variables and whose objective is to optimise that response. For this method, only five tests were performed, unlike Taguchi, since in just five tests it achieves roughness values lower than the average roughness of the Taguchi method.
The lowest value obtained by this method was 1.03 μm. It is therefore concluded that RSM is a more suitable optimisation method than Taguchi for the tests performed. Better results were obtained in a smaller number of tests, which implies less tool wear, less processing time, and a significant reduction in the material used.
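
A minimal sketch of the kind of analysis described above, assuming a full two-level factorial in four milling parameters (16 runs, the L16 layout) and percentage contributions computed from factor-level sums of squares, as in ANOVA. The factor names and roughness responses are synthetic stand-ins, not the thesis data.

```python
import itertools
import numpy as np

# Full two-level factorial in four milling parameters (16 runs, the L16 layout).
factors = ["cutting speed", "feed rate", "radial depth", "axial depth"]
design = np.array(list(itertools.product([0, 1], repeat=4)))   # 16 x 4 matrix

# Illustrative surface-roughness responses Ra (um), one per run -- not real data.
rng = np.random.default_rng(0)
ra = (1.05 + 0.30 * design[:, 2] + 0.20 * design[:, 2] * design[:, 3]
      + 0.05 * rng.standard_normal(16))

grand = ra.mean()
ss_total = ((ra - grand) ** 2).sum()
for j, name in enumerate(factors):
    # ANOVA sum of squares for a two-level factor:
    # sum over levels of n_level * (level mean - grand mean)^2
    ss = sum((design[:, j] == lvl).sum() * (ra[design[:, j] == lvl].mean() - grand) ** 2
             for lvl in (0, 1))
    print(f"{name:13s} contribution: {100 * ss / ss_total:5.1f} %")
```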

Relevance:

40.00%

Publisher:

Abstract:

Previous work by Professor John Frazer on Evolutionary Architecture provides a basis for the development of a system evolving architectural envelopes in a generic and abstract manner. Recent research by the authors has focused on the implementation of a virtual environment for the automatic generation and exploration of complex forms and architectural envelopes, based on solid modelling techniques and the integration of evolutionary algorithms with enhanced computational and mathematical models. Abstract data types are introduced for the genotypes in a genetic algorithm in order to develop complex models using generative and evolutionary computing techniques. Multi-objective optimisation techniques are employed to define the fitness function in the evaluation process.
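
A minimal sketch of the kind of generative loop the abstract describes: a genetic algorithm over an abstract genotype with a weighted-sum stand-in for multi-objective fitness. The encoding, operators, and objectives are illustrative assumptions, not the authors' system.

```python
import random

# Minimal GA skeleton: an abstract genotype (list of floats standing in for
# envelope parameters) evolved under a weighted-sum multi-objective fitness.
GENES, POP, GENERATIONS = 8, 30, 50

def fitness(g):
    # Hypothetical trade-off: reward "enclosed volume", penalise "surface area".
    volume = sum(g)
    surface = sum(x * x for x in g)
    return 0.7 * volume - 0.3 * surface

def crossover(a, b):
    cut = random.randrange(1, GENES)          # one-point crossover
    return a[:cut] + b[cut:]

def mutate(g, rate=0.1):
    return [x + random.gauss(0, 0.1) if random.random() < rate else x for x in g]

pop = [[random.uniform(0, 1) for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: POP // 2]                 # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    pop = parents + children

print("best fitness:", round(fitness(max(pop, key=fitness)), 3))
```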

Relevance:

40.00%

Publisher:

Abstract:

Inverse problems based on using experimental data to estimate unknown parameters of a system often arise in biological and chaotic systems. In this paper, we consider parameter estimation in systems biology involving linear and non-linear complex dynamical models, including the Michaelis–Menten enzyme kinetic system, a dynamical model of competence induction in Bacillus subtilis bacteria and a model of feedback bypass in B. subtilis bacteria. We propose some novel techniques for inverse problems. Firstly, we establish an approximation of a non-linear differential algebraic equation that corresponds to the given biological systems. Secondly, we use the Picard contraction mapping, collage methods and numerical integration techniques to convert the parameter estimation into a minimization problem of the parameters. We propose two optimization techniques: a grid approximation method and a modified hybrid Nelder–Mead simplex search and particle swarm optimization (MH-NMSS-PSO) for non-linear parameter estimation. The two techniques are used for parameter estimation in a model of competence induction in B. subtilis bacteria with noisy data. The MH-NMSS-PSO scheme is applied to a dynamical model of competence induction in B. subtilis bacteria based on experimental data and the model for feedback bypass. Numerical results demonstrate the effectiveness of our approach.
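
A hedged sketch of the minimization step for the Michaelis-Menten system named above, using a plain Nelder-Mead simplex search (via scipy.optimize.minimize) on a least-squares objective. The full collage-based setup and the MH-NMSS-PSO hybrid are not reproduced, and the data are synthetic.

```python
import numpy as np
from scipy.optimize import minimize

# Michaelis-Menten rate law: v = Vmax * S / (Km + S). Synthetic noisy data.
S = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
true_vmax, true_km = 2.0, 3.0
rng = np.random.default_rng(1)
v_obs = true_vmax * S / (true_km + S) + 0.02 * rng.standard_normal(S.size)

def sse(theta):
    vmax, km = theta
    if vmax <= 0 or km <= 0:
        return 1e12                    # crude penalty keeps the search feasible
    return np.sum((v_obs - vmax * S / (km + S)) ** 2)

res = minimize(sse, x0=[1.0, 1.0], method="Nelder-Mead")
print("estimated (Vmax, Km):", res.x)  # should land near (2.0, 3.0)
```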

Relevance:

40.00%

Publisher:

Abstract:

In Australia, railway systems play a vital role in transporting the sugarcane crop from farms to mills. The sugarcane transport system is very complex and uses daily schedules, consisting of a set of locomotive runs, to satisfy the requirements of the mill and the harvesters. The total cost of sugarcane transport operations is very high; over 35% of the total cost of sugarcane production in Australia is incurred in cane transport. Efficient schedules for sugarcane transport can reduce this cost and limit the negative effects that the transport system can have on the raw sugar production system. There are several benefits to formulating the train scheduling problem as a blocking parallel-machine job shop scheduling (BPMJSS) problem: it prevents two trains from occupying one track section at the same time; it keeps the train activities (operations) in sequence during each run (trip) through precedence constraints; it passes the trains through a section in the correct order (priorities of passing trains) through disjunctive constraints; and it resolves rail conflicts between passing trains through blocking constraints and parallel machine scheduling. The sugarcane rail operations are therefore formulated as a BPMJSS problem. Mixed integer programming and constraint programming approaches are used to describe the BPMJSS problem, and the model is solved by integrating constraint programming, mixed integer programming, and search techniques. Optimality performance is tested with the Optimization Programming Language (OPL) and CPLEX software on small and large instances according to specific criteria, and a real-life problem is used to verify and validate the approach. Constructive heuristics and new metaheuristics, including simulated annealing and tabu search, are proposed to solve this complex and NP-hard scheduling problem and produce a more efficient scheduling system. Innovative hybrid and hyper-metaheuristic techniques are developed and coded in C# to improve solution quality and CPU time. Hybrid techniques integrate heuristic and metaheuristic techniques consecutively, while hyper techniques are the complete integration of different metaheuristic techniques, heuristic techniques, or both.
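
A minimal simulated-annealing skeleton of the sort the abstract proposes, applied to a toy permutation-scheduling objective. The cost function is a placeholder; a real BPMJSS evaluator would enforce the blocking, precedence, and disjunctive constraints described above. (The thesis codes its techniques in C#; Python is used here purely for illustration.)

```python
import math
import random

def cost(order, durations):
    # Placeholder objective: total weighted completion time of runs in order.
    # A real BPMJSS evaluator would also check blocking and disjunctive
    # (no two trains in one section) constraints.
    t, total = 0.0, 0.0
    for job in order:
        t += durations[job]
        total += t
    return total

def simulated_annealing(durations, t0=100.0, cooling=0.995, steps=5000):
    order = list(range(len(durations)))
    random.shuffle(order)
    cur = cost(order, durations)
    best, best_cost, temp = order[:], cur, t0
    for _ in range(steps):
        i, j = random.sample(range(len(order)), 2)
        order[i], order[j] = order[j], order[i]         # swap-neighbourhood move
        cand = cost(order, durations)
        if cand <= cur or random.random() < math.exp((cur - cand) / temp):
            cur = cand                                  # accept the move
            if cand < best_cost:
                best, best_cost = order[:], cand
        else:
            order[i], order[j] = order[j], order[i]     # undo rejected move
        temp *= cooling
    return best, best_cost

print(simulated_annealing([3.0, 1.0, 4.0, 1.5, 2.0]))
```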

Relevance:

40.00%

Publisher:

Abstract:

A number of mathematical models investigating certain aspects of the complicated process of wound healing have been reported in the literature in recent years. However, effective numerical methods, and supporting error analysis, for the fractional equations that describe the process of wound healing are still limited. In this paper, we consider the numerical simulation of a fractional mathematical model of epidermal wound healing (FMM-EWH), which is based on coupled advection-diffusion equations for cell and chemical concentration in a polar coordinate system. The space fractional derivatives are defined in the left and right Riemann-Liouville sense, and the fractional orders in the advection and diffusion terms belong to the intervals (0, 1) and (1, 2], respectively. Several numerical techniques are used. Firstly, the coupled advection-diffusion equations are decoupled into a single space-fractional advection-diffusion equation in a polar coordinate system. Secondly, we propose a new implicit difference method for simulating this equation, using the equivalence of the Riemann-Liouville and Grünwald-Letnikov fractional derivative definitions. Thirdly, the stability and convergence of the method are discussed. Finally, some numerical results are given to demonstrate the theoretical analysis.
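
A sketch of the Grünwald-Letnikov machinery such implicit methods rely on: the standard weight recurrence w_0 = 1, w_k = w_{k-1}(1 - (alpha + 1)/k), and one implicit Euler step for a 1D left-sided space-fractional diffusion term with the shifted stencil. This is a simplified Cartesian stand-in, not the paper's coupled polar-coordinate scheme.

```python
import numpy as np

def gl_weights(alpha, n):
    """Grunwald-Letnikov weights w_k = (-1)^k * C(alpha, k), computed with
    the standard recurrence w_0 = 1, w_k = w_{k-1} * (1 - (alpha + 1) / k)."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return w

def implicit_step(u, alpha, dx, dt, d=1.0):
    """One implicit Euler step of u_t = d * D^alpha u (left Riemann-Liouville,
    alpha in (1, 2]) on a uniform grid with zero boundary values, using the
    shifted GL approximation D^alpha u(x_i) ~ dx^-alpha * sum_k w_k u_{i-k+1}."""
    n = u.size
    w = gl_weights(alpha, n + 1)
    mu = d * dt / dx ** alpha
    A = np.zeros((n, n))
    for i in range(n):
        for k in range(i + 2):           # shifted stencil touches u_{i-k+1}
            j = i - k + 1
            if 0 <= j < n:
                A[i, j] += w[k]
    return np.linalg.solve(np.eye(n) - mu * A, u)

u = np.exp(-((np.linspace(0, 1, 51) - 0.5) ** 2) / 0.01)   # initial pulse
u = implicit_step(u, alpha=1.8, dx=0.02, dt=1e-3)
print(u.max())
```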

Relevance:

40.00%

Publisher:

Abstract:

Over the last 30 years, numerous research groups have attempted to provide mathematical descriptions of the skin wound healing process. The development of theoretical models of the interlinked processes that underlie the healing mechanism has yielded considerable insight into aspects of this critical phenomenon that remain difficult to investigate empirically. In particular, the mathematical modeling of angiogenesis, i.e., capillary sprout growth, has offered new paradigms for the understanding of this highly complex and crucial step in the healing pathway. With the recent advances in imaging and cell tracking, the time is now ripe for an appraisal of the utility and importance of mathematical modeling in wound healing angiogenesis research. The purpose of this review is to pedagogically elucidate the conceptual principles that have underpinned the development of mathematical descriptions of wound healing angiogenesis, specifically those that have utilized a continuum reaction-transport framework, and highlight the contribution that such models have made toward the advancement of research in this field. We aim to draw attention to the common assumptions made when developing models of this nature, thereby bringing into focus the advantages and limitations of this approach. A deeper integration of mathematical modeling techniques into the practice of wound healing angiogenesis research promises new perspectives for advancing our knowledge in this area. To this end we detail several open problems related to the understanding of wound healing angiogenesis, and outline how these issues could be addressed through closer cross-disciplinary collaboration.
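
For readers unfamiliar with the continuum reaction-transport framework the review discusses, here is a deliberately minimal 1D sketch: capillary tip density diffuses and proliferates where an (assumed, fixed) growth-factor profile is high. All parameter values and the specific kinetics are illustrative, not drawn from any particular published model.

```python
import numpy as np

# Minimal explicit finite-difference sketch of a 1D continuum reaction-transport
# model: tip density n diffuses (random motility) and grows logistically where
# a fixed, illustrative growth-factor profile c is high.
L, N, T = 1.0, 101, 2000
dx, dt = L / (N - 1), 1e-4
D, s = 1e-3, 5.0                       # tip motility, proliferation rate

x = np.linspace(0.0, L, N)
c = x                                  # growth factor rises across the wound
n = np.where(x < 0.1, 1.0, 0.0)       # tips initially at the wound edge

for _ in range(T):
    lap = (np.roll(n, -1) - 2 * n + np.roll(n, 1)) / dx ** 2
    n = n + dt * (D * lap + s * c * n * (1.0 - n))   # diffusion + growth
    n[0], n[-1] = n[1], n[-2]          # zero-flux boundaries

print("front position:", x[np.argmax(n < 0.5)])
```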

Relevance:

40.00%

Publisher:

Abstract:

In this paper, the pattern classification problem in tool wear monitoring is solved using nature-inspired techniques: Genetic Programming (GP) and Ant-Miner (AM). The main advantage of GP and AM is their ability to learn the underlying data relationships and express them in the form of a mathematical equation or simple rules. The knowledge extracted from the training data set using GP and AM takes the form of a Genetic Programming Classifier Expression (GPCE) and rules, respectively. The GPCE and the AM-extracted rules are then applied to the data in the testing/validation set to obtain the classification accuracy. A major attraction of GP-evolved GPCEs and AM-based classification is the possibility of obtaining expert-system-like rules that the user can subsequently apply directly in his or her own application. The classification performance of GP and AM is as good as the classification accuracy obtained in the earlier study.
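
A toy illustration of how a GP-evolved classifier expression (GPCE) and an Ant-Miner style rule are applied to test data to obtain classification accuracy. The expression, the rule, the two features, and the data are hypothetical stand-ins for tool-wear monitoring signals.

```python
import numpy as np

# Hypothetical test set: two monitoring features per sample, label 1 = worn tool.
rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(200, 2))           # e.g. [vibration RMS, feed force]
y = (0.8 * X[:, 0] + 0.3 * X[:, 1] > 0.6).astype(int)

def gpce(x):
    # A GPCE is an arithmetic expression over features; class = threshold test.
    return 0.8 * x[0] + 0.3 * x[1] - 0.6 > 0

def am_rule(x):
    # Ant-Miner yields readable IF-THEN rules over (discretised) attributes.
    return x[0] > 0.55 or (x[0] > 0.4 and x[1] > 0.7)

for name, clf in [("GPCE", gpce), ("Ant-Miner rule", am_rule)]:
    pred = np.array([clf(x) for x in X]).astype(int)
    print(f"{name}: accuracy = {(pred == y).mean():.2%}")
```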

Relevance:

40.00%

Publisher:

Abstract:

This paper presents innovative work in the development of policy-based autonomic computing. The core of the work is a powerful and flexible policy-expression language, AGILE, which facilitates run-time adaptable policy configuration of autonomic systems. AGILE also serves as an integrating platform for other self-management technologies, including signal processing, automated trend analysis, and utility functions. Each of these technologies has specific advantages and applicability to different types of dynamic adaptation. The AGILE platform enables the different technologies to interoperate seamlessly, each performing various aspects of self-management within a single application. The technologies are implemented as object components, and self-management behaviour is specified using the policy language semantics to bind the components together as required. Since the policy semantics support run-time reconfiguration, the self-management architecture is dynamically composable. Additional benefits include the standardisation of the application programming interface, terminology, and semantics, and the fact that only a single point of embedding is required.
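
AGILE itself is not reproduced here; the sketch below only illustrates the general idea of policy-driven composition, with self-management components registered by name and bound together at run time by a declarative policy that can be swapped without a restart. All names and the policy format are invented for the example.

```python
# Tiny illustration of policy-driven composition: self-management components
# are registered by name and bound to a decision rule at run time by a
# declarative policy that can be swapped without restarting (not AGILE itself).
components = {
    "trend": lambda history: (history[-1] - history[0]) / len(history),
    "utility": lambda history: 1.0 - min(history[-1], 1.0),
}

policy = {"rule": "scale_up_if", "input": "trend", "threshold": 0.05}

def evaluate(policy, history):
    signal = components[policy["input"]](history)   # late-bound component lookup
    if policy["rule"] == "scale_up_if" and signal > policy["threshold"]:
        return "scale up"
    return "hold"

print(evaluate(policy, [0.2, 0.3, 0.5]))            # trend = 0.1 -> "scale up"

# Run-time reconfiguration: rebind the policy to another component, no restart.
policy["input"], policy["threshold"] = "utility", 0.5
print(evaluate(policy, [0.9]))                      # utility = 0.1 -> "hold"
```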

Relevance:

40.00%

Publisher:

Abstract:

Adenoviral vectors are currently the most widely used gene therapy vectors, but their inability to integrate into host chromosomal DNA shortens their transgene expression and limits their use in clinical trials. In this project, we initially planned to develop a technique to test the effect of the early region 1 (E1) on adenovirus integration by comparing the integration efficiencies of an E1-deleted adenoviral vector (SubE1) and an E1-containing vector (SubE3). However, we did not harvest any SubE3 virus, even though we repeated the transfection and successfully rescued the SubE1 virus (2/4 transfections generated viruses) and the positive control virus (6/6). The failure to rescue SubE3 could be caused by the instability of the genomic plasmid pFG173, which showed frequent internal deletions while we were purifying it. We therefore developed techniques to test the effect of E1 on homologous recombination (HR), since the literature suggests that adenovirus integration is initiated by HR. We attempted to silence E1 in 293 cells by transfecting E1A/B-specific small interfering RNA (siRNA). However, no silenced phenotype was observed, even though we varied the concentration of E1A/B siRNA (from 30 nM to 270 nM) and checked the silencing effects at different time points (48, 72, 96 h). One possible explanation is that the E1A/B siRNA sequences are not potent enough to induce the silenced phenotype. To evaluate HR efficiencies, an HR assay system based on bacterial transformation was designed. We constructed two plasmids (designated pUC19-dl1 and pUC19-dl2) containing different defective lacZα cassettes (forming white colonies after transformation) that can generate a functional lacZα cassette (forming blue colonies) through HR after transfection into 293 cells. The HR efficiencies would be expressed as the percentage of blue colonies among all colonies. Unfortunately, after transformation with plasmid isolated from 293 cells, no colony was found, even at a transformation efficiency of 1.8x10^ colonies/μg pUC19, suggesting that the sensitivity of this system was low. To enhance the sensitivity, PCR was used. We designed a set of primers that can only amplify the recombinant plasmid formed through HR. The HR efficiencies among different treatments can therefore be evaluated from the amplification results, and this system could be used to test the effect of the E1 region on adenovirus integration. In addition, to our knowledge no previous study has used PCR or real-time PCR to evaluate HR efficiency, so this system also provides a PCR-based method for carrying out HR assays.

Relevance:

40.00%

Publisher:

Abstract:

Current mathematical models in building research have been limited in most studies to linear dynamic systems. A literature review of past studies investigating chaos theory approaches in building simulation models suggests that the chaos model is valid as a basis and can handle the increasing complexity of building systems that have dynamic interactions among all the distributed and hierarchical systems on the one hand, and the environment and occupants on the other. The review also identifies the paucity of literature and the need for a suitable methodology for linking chaos theory to mathematical models in building design and management studies. This study is broadly divided into two parts and presented in two companion papers. Part (I) reviews the current state of chaos theory models as a starting point for establishing theories that can be effectively applied to building simulation models. Part (II) develops conceptual frameworks that approach current model methodologies from the theoretical perspective provided by chaos theory, with a focus on the key concepts and their potential to help better understand the nonlinear dynamic nature of built environment systems. Case studies are also presented which demonstrate the potential usefulness of chaos theory driven models in a wide variety of leading areas of building research. This study distills the fundamental properties and the most relevant characteristics of chaos theory essential to building simulation scientists, initiates a dialogue and builds bridges between scientists and engineers, and stimulates future research about a wide range of issues on building environmental systems.
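
The key chaos-theory property both companion papers build on, sensitive dependence on initial conditions, can be seen in the canonical logistic map; the sketch below (not a building model) shows two trajectories that start 10^-6 apart diverging within a few dozen iterations.

```python
# The logistic map x_{t+1} = r * x_t * (1 - x_t) is the canonical example of
# chaotic behaviour: in the chaotic regime (r = 4), trajectories from nearly
# identical initial states diverge rapidly.
r = 4.0
a, b = 0.500000, 0.500001          # initial states differing by 1e-6
for t in range(30):
    a, b = r * a * (1 - a), r * b * (1 - b)
    if t % 5 == 4:
        print(f"t={t + 1:2d}  |a - b| = {abs(a - b):.6f}")
```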

Relevance:

40.00%

Publisher:

Abstract:

Current mathematical models in building research have been limited in most studies to linear dynamic systems. A literature review of past studies investigating chaos theory approaches in building simulation models suggests that the chaos model is valid as a basis and can handle the increasing complexity of building systems that have dynamic interactions among all the distributed and hierarchical systems on the one hand, and the environment and occupants on the other. The review also identifies the paucity of literature and the need for a suitable methodology for linking chaos theory to mathematical models in building design and management studies. This study is broadly divided into two parts and presented in two companion papers. Part (I), published in the previous issue, reviews the current state of chaos theory models as a starting point for establishing theories that can be effectively applied to building simulation models. Part (II) develops conceptual frameworks that approach current model methodologies from the theoretical perspective provided by chaos theory, with a focus on the key concepts and their potential to help better understand the nonlinear dynamic nature of built environment systems. Case studies are also presented which demonstrate the potential usefulness of chaos theory driven models in a wide variety of leading areas of building research. This study distills the fundamental properties and the most relevant characteristics of chaos theory essential to (1) building simulation scientists and designers, (2) initiating a dialogue between scientists and engineers, and (3) stimulating future research on a wide range of issues involved in designing and managing building environmental systems.

Relevance:

40.00%

Publisher:

Abstract:

A myriad of methods are available for the virtual screening of small organic compound databases. In this study we successfully applied a quantitative model of consensus measurements, using a combination of 3D similarity searches (ROCS and EON), Hologram Quantitative Structure-Activity Relationships (HQSAR), and docking (FRED, FlexX, Glide and AutoDock Vina), to retrieve cruzain inhibitors from collected databases. All methods were assessed individually and then combined in Ligand-Based Virtual Screening (LBVS) and Target-Based Virtual Screening (TBVS) consensus scoring, using Receiver Operating Characteristic (ROC) curves to evaluate their performance. Three consensus strategies were used: scaled-rank-by-number, rank-by-rank, and rank-by-vote, of which the scaled-rank-by-number strategy proved the most successful, its steep ROC curve indicating higher enrichment power at early retrieval of active compounds from the database. The ligand-based work produced a robust and predictive HQSAR model that showed superior discrimination between active and inactive compounds, performing better than the ROCS and EON procedures. Overall, the integration of fast computational techniques based on ligand and target structures resulted in a more efficient retrieval of cruzain inhibitors with desired pharmacological profiles, which may be useful in advancing the discovery of new trypanocidal agents.
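
A sketch of one plausible reading of the scaled-rank-by-number consensus: each method's scores are min-max scaled to [0, 1] and averaged, with ROC AUC measuring retrieval of actives. The scores and activity labels are synthetic, not the cruzain data, and the scaling choice is an assumption.

```python
import numpy as np

# Synthetic scores from three hypothetical methods (e.g. a shape search, a
# QSAR model, a docking program); actives are labelled 1, decoys 0.
rng = np.random.default_rng(3)
n_active, n_decoy = 20, 180
labels = np.r_[np.ones(n_active), np.zeros(n_decoy)]

def method_scores(skill):
    # Higher "skill" -> actives score higher on average (illustrative only).
    return np.r_[rng.normal(skill, 1, n_active), rng.normal(0, 1, n_decoy)]

methods = [method_scores(s) for s in (1.5, 1.0, 0.8)]
scaled = [(m - m.min()) / (m.max() - m.min()) for m in methods]  # min-max scale
consensus = np.mean(scaled, axis=0)

def roc_auc(scores, labels):
    # Mann-Whitney formulation of the ROC AUC from ascending score ranks.
    ranks = scores.argsort().argsort() + 1
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

print("consensus ROC AUC:", round(roc_auc(consensus, labels), 3))
```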