993 results for Fused deposition modeling


Relevance: 20.00%

Publisher:

Abstract:

This project studied the commissioning of a commercial ALD system for obtaining nanometre-scale alumina thin films using water vapour and TMA as precursors. To verify the soundness of the experimental recipes supplied by the manufacturer, and to check certain aspects of ALD theory, a series of samples was produced by varying the main experimental parameters: deposition temperature, number of cycles, cycle duration and substrate type. Ellipsometry, one of the few non-destructive techniques able to measure with great precision layer or interface thicknesses of a few ångströms or nanometres, was used to determine the nanometric thicknesses of the films and hence their growth rates. In a first stage, the experimental values supplied by the manufacturer of the ALD system were used to determine the growth rate as a function of deposition temperature and of the number of cycles, in both cases on several substrates. The growth rate was shown to increase slightly with deposition temperature, although the variation is small, on the order of 12% for a 70 °C change in deposition temperature. The linearity of thickness with the number of cycles was also demonstrated, although an exact proportionality is not observed. In a second stage, the experimental parameters were optimized, essentially the purge times between pulses, in order to considerably reduce the duration of experiments performed at relatively low temperatures. In this case, the growth rates were found to be maintained within 3.6%, 4.8% and 5.5% when optimizing cycles of 6.65 h, 8.31 h and 8.33 h, respectively. Moreover, for one of these conditions the high conformality of the alumina films was shown to be maintained. In addition, a study of the thickness homogeneity of the films over the entire deposition zone of the ALD reactor was carried out. The thickness variation of films deposited at 120 °C was shown to be at most 6.2% over a surface of 110 cm², confirming the exceptional thickness control of the ALD technique.
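The linearity of thickness with the number of cycles can be checked with a simple least-squares fit of thickness against cycle count; the growth per cycle (GPC) is the slope. The data below are hypothetical, purely to illustrate the calculation:

```python
import numpy as np

# Hypothetical ellipsometry data: film thickness (nm) vs. number of ALD cycles.
cycles = np.array([100, 200, 400, 800])
thickness_nm = np.array([11.2, 22.0, 43.5, 86.1])

# Linear fit: thickness ~ gpc * cycles + offset.
gpc, offset = np.polyfit(cycles, thickness_nm, 1)
print(f"growth per cycle = {gpc:.3f} nm/cycle, offset = {offset:.2f} nm")
```

A non-zero offset is one way the fit can reveal that thickness grows linearly but not exactly proportionally with the number of cycles, as reported in the abstract.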


Dengue fever is currently the most important arthropod-borne viral disease in Brazil. Mathematical modeling of disease dynamics is a very useful tool for the evaluation of control measures. To be used in decision-making, however, a mathematical model must be carefully parameterized and validated with epidemiological and entomological data. In this work, we developed a simple dengue model to answer three questions: (i) which parameters are worth pursuing in the field in order to develop a dengue transmission model for Brazilian cities; (ii) how vector density spatial heterogeneity influences control efforts; (iii) what, given the uncertainty in parameter values, is the invasion potential of dengue virus type 4 (DEN-4) in the city of Rio de Janeiro. Our model consists of an expression for the basic reproductive number (R0) that incorporates the spatial heterogeneity of vector density. To deal with the uncertainty regarding parameter values, we parameterized the model using a priori probability density functions covering a range of plausible values for each parameter, and generated parameter values with the Latin Hypercube Sampling procedure. We conclude that, even in the presence of vector spatial heterogeneity, the two most important entomological parameters to be estimated in the field are the mortality rate and the extrinsic incubation period. The spatial heterogeneity of the vector population increases the risk of epidemics and makes control strategies more complex. We also conclude that Rio de Janeiro is at risk of a DEN-4 invasion. Finally, we stress that epidemiologists, mathematicians and entomologists need to interact more closely to find better approaches to measuring and interpreting the transmission dynamics of arthropod-borne diseases.
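Latin Hypercube Sampling, as used above, stratifies each parameter range into n equal-probability intervals and draws one value per interval, so the whole plausible range is covered with relatively few samples. A minimal sketch follows; the R0 expression is a classical Ross-Macdonald form without spatial heterogeneity, and all parameter ranges are hypothetical, not the values used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n, ranges):
    """Draw n samples per parameter, one from each of n equal-probability strata."""
    out = {}
    for name, (lo, hi) in ranges.items():
        strata = (np.arange(n) + rng.random(n)) / n  # one point per stratum
        rng.shuffle(strata)                          # decorrelate dimensions
        out[name] = lo + strata * (hi - lo)
    return out

# Hypothetical ranges. m: vectors per person, a: biting rate/day, b, c: transmission
# probabilities, mu: vector mortality/day, tau: extrinsic incubation period (days),
# r: human recovery rate/day.
ranges = {"m": (0.5, 5.0), "a": (0.3, 1.0), "b": (0.1, 0.75), "c": (0.1, 0.75),
          "mu": (0.05, 0.2), "tau": (7.0, 14.0), "r": (0.1, 0.2)}
p = latin_hypercube(1000, ranges)

# Ross-Macdonald-type basic reproductive number for each parameter draw.
R0 = (p["m"] * p["a"]**2 * p["b"] * p["c"] * np.exp(-p["mu"] * p["tau"])) / (p["r"] * p["mu"])
print(f"R0 median {np.median(R0):.2f}, P(R0 > 1) = {np.mean(R0 > 1):.2f}")
```

The vector mortality rate mu and the incubation period tau enter both the exponential survival term and the denominator, which is one intuition for why they dominate the uncertainty in R0.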


Tractography is a class of algorithms that aim to map in vivo the major neuronal pathways of the white matter from diffusion magnetic resonance imaging (MRI) data. These techniques offer a powerful tool to noninvasively investigate, at the macroscopic scale, the architecture of the neuronal connections of the brain. Unfortunately, the reconstructions recovered with existing tractography algorithms are not truly quantitative, even though diffusion MRI is a quantitative modality by nature. In fact, several techniques have been proposed in recent years to estimate, at the voxel level, intrinsic microstructural features of the tissue, such as axonal density and diameter, by using multicompartment models. In this paper, we present a novel framework to re-establish the link between tractography and tissue microstructure. Starting from an input set of candidate fiber tracts, estimated from the data using standard fiber-tracking techniques, we model the diffusion MRI signal in each voxel of the image as a linear combination of the restricted and hindered contributions generated in every location of the brain by these candidate tracts. We then seek the global weight of each tract, i.e., its effective contribution or volume, such that together they best fit the measured signal. We demonstrate that these weights can be recovered by solving a global convex optimization problem with efficient algorithms. The effectiveness of our approach has been evaluated both on a realistic phantom with known ground truth and on in vivo brain data. Results clearly demonstrate the benefits of the proposed formulation, opening new perspectives for a more quantitative and biologically plausible assessment of the structural connectivity of the brain.
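The global fit described above is, in essence, a non-negative linear inverse problem: each candidate tract contributes a column to a dictionary matrix, and the sought weights are the non-negative coefficients that best reproduce the measured signal. A toy sketch of this convex problem, with a random dictionary standing in for the actual restricted/hindered signal models:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(42)

# Toy problem: 200 "voxel measurements", 20 candidate tracts.
# A[i, j] = signal contribution of candidate tract j to measurement i (hypothetical).
A = rng.random((200, 20))
w_true = np.zeros(20)
w_true[[2, 7, 13]] = [1.5, 0.8, 2.0]   # only a few candidate tracts truly contribute
y = A @ w_true                          # noiseless measured signal

# Global convex fit: minimize ||A w - y||_2 subject to w >= 0.
w_est, residual = nnls(A, y)
print("recovered support:", np.flatnonzero(w_est > 1e-6))
```

With noiseless data and a full-rank dictionary the true support is recovered exactly; the real framework works on noisy signals and far larger dictionaries, where efficient solvers matter.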


Recent studies have pointed out a similarity between tectonic structures and those induced by slope tectonics. Numerous studies have demonstrated that structures and fabrics previously interpreted as of purely geodynamic origin are instead the result of large slope deformations, which has led to erroneous interpretations in the past. Nevertheless, the boundary between the two is not clearly defined and appears to be transitional. Some studies point out a continuity between failures developing at the surface and movements of the upper crust. In this contribution, the main studies examining the link between rock structures and slope movements are reviewed. Aspects regarding the model and scale of observation are discussed, together with the role of pre-existing weaknesses in the rock mass. As slope failures can develop through progressive failure, structures and their changes in time and space can be recognized. Furthermore, recognizing the origin of these structures can help avoid misinterpretations of regional geology. This also suggests the importance of integrating the different slope-movement classifications, based on the distribution and pattern of deformation, with the application of structural geology techniques. A structural geology approach in the landslide community is a tool that can greatly support the quantification of hazard and related risks, because most of the physical parameters used for landslide modeling are derived from geotechnical tests or from emerging geophysical approaches.


Natural selection is typically exerted at specific life stages. If natural selection takes place before a trait can be measured, using conventional models can lead to wrong inferences about population parameters. When the missing-data process is related to the trait of interest, valid inference requires explicit modeling of the missing process. We propose a joint modeling approach, a shared parameter model, to account for nonrandom missing data. It consists of an animal model for the phenotypic data and a logistic model for the missing process, linked by the additive genetic effects. A Bayesian approach is taken and inference is made using integrated nested Laplace approximations. From a simulation study we find that wrongly assuming that data are missing at random can result in severely biased estimates of additive genetic variance. Using real data from a wild population of Swiss barn owls (Tyto alba), our model indicates that the missing individuals would have displayed large black spots, and we conclude that genes affecting this trait are already under selection before it is expressed. Our model is a tool for correctly estimating the magnitude of both natural selection and additive genetic variance.
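The core claim, that wrongly assuming data are missing at random biases variance estimates, can be illustrated with a minimal simulation: no animal model, just a single trait whose probability of being observed depends on its own value, with a hypothetical selection strength:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
z = rng.normal(0.0, 1.0, n)                 # phenotype, true variance = 1

# Missing-not-at-random: individuals with larger trait values are more likely
# to survive to measurement; the logistic steepness (2.0) is hypothetical.
p_observed = 1.0 / (1.0 + np.exp(-2.0 * z))
observed = z[rng.random(n) < p_observed]

# A naive analysis of the survivors underestimates the variance.
print(f"true var = 1.00, naive var of observed = {observed.var():.2f}")
```

Because observation probability and trait share the same latent value here, only a model of the missing process, as in the shared parameter approach above, can recover the pre-selection variance.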


Observations in daily practice are sometimes registered as positive values larger than a given threshold α. The sample space in this case is the interval (α, +∞), α > 0, which can be structured as a real Euclidean space in different ways. This fact opens the door to alternative statistical models that depend not only on the assumed distribution function, but also on the metric considered appropriate, i.e. the way differences, and thus variability, are measured.
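One concrete way to structure (α, +∞) as a Euclidean space is to map it onto the real line via x ↦ log(x − α), so that differences are measured relative to the threshold rather than absolutely. A small sketch, with a hypothetical threshold, of how the choice of metric changes which observations count as far apart:

```python
import math

alpha = 5.0  # hypothetical detection threshold; observations live in (alpha, +inf)

def d_euclidean(x1, x2):
    # Ordinary absolute difference on the interval.
    return abs(x1 - x2)

def d_log(x1, x2):
    # Distance after mapping (alpha, +inf) onto the real line via log(x - alpha):
    # differences are relative to the threshold.
    return abs(math.log(x1 - alpha) - math.log(x2 - alpha))

# The pairs (6, 7) and (105, 106) are equally far apart in the Euclidean
# metric, but very unequal in the log metric.
print(d_euclidean(6, 7), d_euclidean(105, 106))
print(d_log(6, 7), d_log(105, 106))
```

Under the log metric, a one-unit difference just above the threshold is far larger than the same difference far from it, which changes how variability is quantified.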


This paper is a first draft of the principle of statistical modelling on coordinates. Several causes, which would take too long to detail, have led to this situation close to the deadline for submitting papers to CODAWORK'03. The main one is the fast development of the approach over the last few months, which has made previous drafts obsolete. The present paper contains the essential parts of the state of the art of this approach from my point of view. I would like to acknowledge many clarifying discussions with the group of people working in this field in Girona, Barcelona, Carrick Castle, Firenze, Berlin, Göttingen, and Freiberg. They have given a lot of suggestions and ideas. Nevertheless, there might still be errors or unclear aspects, which are exclusively my fault. I hope this contribution serves as a basis for further discussions and new developments.


In the context of investigating the use of automated fingerprint identification systems (AFIS) for the evaluation of fingerprint evidence, the current study examines the variability of scores from an AFIS system when fingermarks from a known donor are compared to fingerprints that are not from the same source. The ultimate goal is to propose a model, based on likelihood ratios (LRs), that allows the evaluation of mark-to-print comparisons. Through its use of AFIS technology, this model benefits from the availability of a large amount of data as well as from an already built-in proximity measure, the AFIS score. More precisely, the numerator of the LR is obtained from scores issued from comparisons between impressions from the same source showing the same minutia configuration. The denominator of the LR is obtained by extracting scores from comparisons of the questioned mark with a database of non-matching sources. This paper focuses solely on the assignment of the denominator of the LR, which we refer to by the generic term between-finger variability. The issues addressed in relation to between-finger variability are the required sample size, the influence of finger number and general pattern, and the influence of the number and configuration of the minutiae included on a given finger. Results show that reliable estimation of between-finger variability is feasible with 10,000 scores. These scores should come from the appropriate finger number/general pattern combination as defined by the mark. Furthermore, strategies for obtaining between-finger variability when these elements cannot be conclusively determined from the mark (or, for finger number, from its position with respect to other marks) are presented. These results immediately allow case-by-case estimation of between-finger variability in an operational setting.
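One common way to turn two sets of scores into an LR is to fit a density to each and evaluate both at the score of the disputed comparison; the denominator density is exactly the between-finger variability discussed above. The sketch below uses kernel density estimates on synthetic Gaussian scores; real AFIS score distributions are not Gaussian, and all numbers here are purely illustrative:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)

# Synthetic AFIS scores (hypothetical scale):
same_source = rng.normal(80, 15, 10_000)     # mark vs. prints from the true donor
between_finger = rng.normal(40, 12, 10_000)  # mark vs. 10,000 non-matching fingers

# Kernel density estimates of the two score distributions.
num = gaussian_kde(same_source)
den = gaussian_kde(between_finger)

score = 65.0                                 # score of the disputed comparison
lr = num(score)[0] / den(score)[0]
print(f"LR at score {score}: {lr:.1f}")
```

The 10,000 between-finger scores mirror the sample size the study found sufficient; in practice the denominator sample must match the finger number and general pattern indicated by the mark.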


This research work deals with the problem of modeling and designing a low-level speed controller for the mobile robot PRIM. The main objective is to develop an effective educational tool. On the one hand, the interest in using the open mobile platform PRIM lies in integrating, in an educational context, several subjects closely related to automatic control theory, embracing communications, signal processing, sensor fusion and hardware design, among others. On the other hand, the idea is to implement useful navigation strategies so that the robot can serve as a mobile multimedia information point. It is in this context, with navigation strategies oriented towards goal achievement, that a local model predictive control approach is adopted. Such studies are thus presented as a very interesting control strategy for developing the future capabilities of the system.
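The idea of local model predictive control for goal-oriented navigation can be sketched in a few lines: at every time step, simulate a handful of candidate controls over a short horizon, apply the best one, and replan. The unicycle model and the control grid below are simplifications for illustration, not the actual PRIM platform model or controller:

```python
import itertools
import math

def step(state, v, w, dt=0.1):
    # Simplified unicycle kinematics: forward speed v, turn rate w.
    x, y, th = state
    return (x + v * math.cos(th) * dt, y + v * math.sin(th) * dt, th + w * dt)

def mpc_control(state, goal, horizon=10):
    """One receding-horizon step: pick the (v, w) pair whose predicted
    trajectory ends closest to the goal."""
    best, best_cost = (0.0, 0.0), float("inf")
    for v, w in itertools.product([0.0, 0.2, 0.5, 1.0], [-1.0, -0.5, 0.0, 0.5, 1.0]):
        s = state
        for _ in range(horizon):
            s = step(s, v, w)
        cost = math.hypot(s[0] - goal[0], s[1] - goal[1])
        if cost < best_cost:
            best, best_cost = (v, w), cost
    return best

# Drive toward the goal, replanning at every step (receding horizon).
state, goal = (0.0, 0.0, 0.0), (2.0, 1.0)
for _ in range(60):
    v, w = mpc_control(state, goal)
    state = step(state, v, w)
print(f"final position: ({state[0]:.2f}, {state[1]:.2f})")
```

Because only the first control of each optimized horizon is applied before replanning, the scheme reacts to model error at every step, which is the property that makes local MPC attractive for this kind of navigation.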


MOTIVATION: Understanding gene regulation in biological processes and modeling the robustness of the underlying regulatory networks is an important problem currently being addressed by computational systems biologists. Lately, there has been a renewed interest in Boolean modeling techniques for gene regulatory networks (GRNs). However, due to their deterministic nature, it is often difficult to identify whether these modeling approaches are robust to the addition of the stochastic noise that is widespread in gene regulatory processes. Stochasticity in Boolean models of GRNs has been addressed relatively sparingly in the past, mainly by flipping the expression of genes between different expression levels with a predefined probability. This stochasticity-in-nodes (SIN) model leads to an over-representation of noise in GRNs and hence to a lack of correspondence with biological observations. RESULTS: In this article, we introduce the stochasticity-in-functions (SIF) model for simulating stochasticity in Boolean models of GRNs. By providing the biological motivation behind the SIF model and applying it to the T-helper and T-cell activation networks, we show that the SIF model provides more biologically robust results than the existing SIN model. AVAILABILITY: The algorithms are made available in our Boolean modeling toolbox, GenYsis. The software binaries can be downloaded from http://si2.epfl.ch/~garg/genysis.html.
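The difference between the two noise models can be made concrete on a toy network. In the SIN model every node can flip independently after each synchronous update, so the expected number of spurious flips grows with network size; a function-level model perturbs at most one update per step. The sketch below is a simplified rendering with hypothetical wiring, not the published SIF formulation or the T-helper network:

```python
import random

random.seed(0)

# Toy 3-gene Boolean network (hypothetical wiring):
# A* = not C, B* = A, C* = A and B
def update(state):
    a, b, c = state
    return (not c, a, a and b)

def sin_step(state, p=0.01):
    # Stochasticity-in-nodes: after the deterministic update, every gene's
    # state is flipped independently with probability p.
    return tuple((not g) if random.random() < p else g for g in update(state))

def sif_step(state, p=0.01):
    # Function-level noise (simplified sketch): with probability p a single,
    # randomly chosen update function misfires and its output is flipped;
    # the other genes update deterministically.
    nxt = list(update(state))
    if random.random() < p:
        i = random.randrange(len(nxt))
        nxt[i] = not nxt[i]
    return tuple(nxt)

state = (True, False, True)
print("deterministic:", update(state))
print("SIN:", sin_step(state), " SIF-like:", sif_step(state))
```

With N genes, SIN introduces about N·p spurious flips per step versus at most p for the function-level model, which is one way to see why SIN over-represents noise as networks grow.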


Aim: The aim of the study was to investigate the influence of dietary intake of commercial hydrolyzed collagen (Gelatine Royal®) on bone remodeling in pre-pubertal children. Methods: A randomized double-blind study was carried out in 60 children (9.42 ± 1.31 years) divided into three groups according to the amount of partially hydrolyzed collagen taken daily for 4 months: placebo (G-I, n = 18), collagen (G-II, n = 20) and collagen + calcium (G-III, n = 22). The following biochemical markers were analyzed: total and bone alkaline phosphatase (tALP and bALP), osteocalcin, tartrate-resistant acid phosphatase (TRAP), type I collagen carboxy-terminal telopeptide, lipids, calcium, 25-hydroxyvitamin D, insulin-like growth factor 1 (IGF-1), thyroid-stimulating hormone, free thyroxine and intact parathormone. Results: There was a significantly greater increase in serum IGF-1 in G-III than in G-II (p < 0.01) or G-I (p < 0.05) during the study period, and a significantly greater increase in plasma tALP in G-III than in G-I (p < 0.05). Serum bALP behavior differed significantly (p < 0.05) between G-II (increase) and G-I (decrease). Plasma TRAP behavior differed significantly between G-II and G-I (p < 0.01) and between G-III and G-II (p < 0.05). Conclusion: Daily dietary intake of hydrolyzed collagen seems to have a potential role in enhancing bone remodeling at key stages of growth and development.


The in situ deposition of zinc oxide on gold nanoparticles in aqueous solution has been successfully applied here to fingermark detection on various non-porous surfaces. In this article, we present an improvement of multimetal deposition, an existing technique limited until now to non-luminescent results, by obtaining luminescent fingermarks with very good contrast and detail. This is seen as a major improvement in the field in terms of selectivity and sensitivity of detection, especially on black surfaces.