874 results for Filmic approach methods


Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

In this work, sufficient conditions are proposed for the existence of switching laws that stabilize switched TS fuzzy systems via a fuzzy Lyapunov function. The conditions are obtained by exploring properties of the membership functions and are formulated in terms of linear matrix inequalities (LMIs). Stabilizing switching conditions with bounds on the decay rate of the solution and H∞ performance are also obtained. Numerical examples illustrate the effectiveness of the proposed design methods.
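
As a rough illustration of this class of conditions, the sketch below checks the classic LMI sufficient condition for stability of a TS fuzzy system: a common quadratic Lyapunov matrix P ≻ 0 with AᵢᵀP + PAᵢ ≺ 0 for every local model Aᵢ. This is the plain common-P test, not the paper's switched fuzzy Lyapunov conditions; the local models, the cvxpy modeling layer, and the SCS solver are assumptions for the example.

```python
import cvxpy as cp
import numpy as np

# local models of a hypothetical two-rule TS fuzzy system
A1 = np.array([[-2.0, 1.0], [0.0, -1.0]])
A2 = np.array([[-1.5, 0.5], [0.3, -2.0]])

n = A1.shape[0]
P = cp.Variable((n, n), symmetric=True)
eps = 1e-6

# LMIs: P positive definite, A_i^T P + P A_i negative definite for each rule
constraints = [P >> eps * np.eye(n)]
for A in (A1, A2):
    constraints.append(A.T @ P + P @ A << -eps * np.eye(n))

# feasibility problem: any solution certifies quadratic stability
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print(prob.status)   # "optimal" means a common Lyapunov matrix exists
```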

Relevance:

30.00%

Publisher:

Abstract:

One approach to verify the adequacy of methods for estimating reference evapotranspiration (ET0) is comparison with the Penman-Monteith method, recommended by the Food and Agriculture Organization of the United Nations (FAO) as the standard method for estimating ET0. This study aimed to compare the Makkink (MK), Hargreaves (HG) and Solar Radiation (RS) estimation methods with Penman-Monteith (PM). For this purpose, we used daily data of global solar radiation, air temperature, relative humidity and wind speed for the year 2010, obtained from the automatic meteorological station of the National Institute of Meteorology (latitude 18.9166° S, longitude 48.2505° W, altitude 869 m), situated on the campus of the Federal University of Uberlandia, MG, Brazil. Results for the period were analyzed on a daily basis, using regression analysis and considering the linear model y = ax, where the dependent variable was the Penman-Monteith estimate and the independent variable was the ET0 estimate of each evaluated method. A methodology was applied to check the influence of the standard deviation of daily ET0 on the comparison of methods. The evaluation indicated that the Solar Radiation and Penman-Monteith methods cannot be compared, while the Hargreaves method showed the best fit for estimating ET0.
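
For illustration, the sketch below computes the Hargreaves ET0 estimate and the through-origin regression slope of the comparison model y = ax. The Hargreaves form ET0 = 0.0023·(Tmean + 17.8)·√(Tmax − Tmin)·Ra, with Ra the extraterrestrial radiation expressed in mm/day, is the standard one; the daily values are hypothetical.

```python
import numpy as np

def hargreaves_et0(t_mean, t_max, t_min, ra):
    # standard Hargreaves equation; ra is extraterrestrial radiation in mm/day
    return 0.0023 * (t_mean + 17.8) * np.sqrt(t_max - t_min) * ra

def slope_through_origin(x, y):
    # least-squares slope of the no-intercept model y = a*x used in the comparison
    return np.sum(x * y) / np.sum(x * x)

# hypothetical daily series (degrees C and mm/day)
t_max = np.array([29.0, 31.2, 30.5, 28.4])
t_min = np.array([17.5, 18.9, 18.0, 16.8])
ra = np.array([15.2, 15.1, 15.0, 14.9])
et0_hg = hargreaves_et0((t_max + t_min) / 2, t_max, t_min, ra)

# hypothetical Penman-Monteith values play the role of the dependent variable
et0_pm = np.array([4.1, 4.6, 4.4, 3.9])
print("HG:", et0_hg.round(2), "slope a =", round(slope_through_origin(et0_hg, et0_pm), 3))
```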

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

The purpose of the Dental Sculpture and Anatomy discipline is to introduce undergraduate students to the anatomic and morphological characteristics of permanent and primary human dentition through classes, books, and cognitive and psychomotor activities. This discipline supports the teaching of specific knowledge necessary for a broader education, involving interdisciplinarity as a means of knowledge exchange among several areas of dentistry, to achieve comprehensive professional education. Students must recognize dental morphology from samples of preserved teeth and reproduce it through three-dimensional models made of stone or wax blocks. In this article, the authors describe the process for producing teeth collars and macro dental models made of stone, their importance, and the benefits of their use. The purpose of the study was to encourage the teaching of Dental Sculpture and Anatomy to undergraduate students of the Bauru School of Dentistry, University of Sao Paulo, through activities that associate theory, practice and the development of manual skills.

Relevance:

30.00%

Publisher:

Abstract:

Modeling is a preliminary step in performing a finite element analysis, and different methods of model construction are reported in the literature, such as Bio-CAD modeling. The purpose of this study was to evaluate and apply two methods of Bio-CAD modeling of a human edentulous hemi-mandible in finite element analysis. A stereolithographic model was reconstructed from CT scans of a dried human skull. Two modeling methods were performed: an STL conversion approach associated with STL simplification (Model 1), and a reverse engineering approach (Model 2). For the finite element analysis, the action of the lateral pterygoid muscle was used as the loading condition to assess total displacement (D), equivalent von Mises stress (VM) and maximum principal stress (MP). The two models differed in geometry regarding the number of surfaces (1834 for Model 1; 282 for Model 2), and differences were observed in the finite element mesh regarding the number of elements (30428 nodes/16683 elements for Model 1; 15801 nodes/8410 elements for Model 2). The D, VM and MP stress areas presented similar distributions in the two models, but the maximum and minimum values differed: D ranged from 0 to 0.511 mm (Model 1) and 0 to 0.544 mm (Model 2), VM stress from 6.36E-04 to 11.4 MPa (Model 1) and 2.15E-04 to 14.7 MPa (Model 2), and MP stress from -1.43 to 9.14 MPa (Model 1) and -1.2 to 11.6 MPa (Model 2). Of the two Bio-CAD modeling methods, reverse engineering presented a better anatomical representation than the STL conversion approach. The models presented differences in the finite element mesh, total displacement and stress distribution.
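
The two stress measures reported above are standard post-processing quantities, recoverable from the stress tensor at any node. A minimal numpy sketch with a hypothetical stress state: the maximum principal stress is the largest eigenvalue of the symmetric Cauchy stress tensor, and the equivalent von Mises stress follows from the principal values.

```python
import numpy as np

def principal_stresses(sigma):
    # eigenvalues of the symmetric 3x3 Cauchy stress tensor, sorted descending
    return np.sort(np.linalg.eigvalsh(sigma))[::-1]

def von_mises(sigma):
    # equivalent von Mises stress from the principal values
    s1, s2, s3 = principal_stresses(sigma)
    return np.sqrt(0.5 * ((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2))

# hypothetical stress tensor in MPa
sigma = np.array([[5.0, 1.2, 0.0],
                  [1.2, 2.0, 0.4],
                  [0.0, 0.4, -1.0]])
print("MP =", principal_stresses(sigma)[0].round(2), "MPa; VM =", von_mises(sigma).round(2), "MPa")
```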

Relevance:

30.00%

Publisher:

Abstract:

Research on image processing has shown that combining segmentation methods may lead to a solid approach for extracting semantic information from different sorts of images. Within this context, the Normalized Cut (NCut) is usually used as a final partitioning tool for graphs modeled by some chosen method. This work explores the Watershed Transform as a modeling tool, using different criteria of the hierarchical Watershed to convert an image into an adjacency graph. The Watershed is combined with an unsupervised distance learning step that redistributes the graph weights and redefines the similarity matrix before the final segmentation step using NCut. Using the Berkeley Segmentation Data Set and Benchmark, our goal is to compare the results obtained by this method with previous work to validate its performance.
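
A minimal sketch of the watershed-to-NCut pipeline with scikit-image, omitting the paper's unsupervised distance learning step; the sample image and parameter values are assumptions, and in releases before 0.20 the graph utilities live in skimage.future.graph rather than skimage.graph:

```python
from skimage import color, data, filters, graph, segmentation

img = data.coffee()
gradient = filters.sobel(color.rgb2gray(img))

# oversegment with a compact watershed to obtain the graph's regions
labels = segmentation.watershed(gradient, markers=400, compactness=0.001)

# region adjacency graph weighted by mean-color similarity
rag = graph.rag_mean_color(img, labels, mode='similarity')

# final partitioning with the Normalized Cut
labels_ncut = graph.cut_normalized(labels, rag)
out = color.label2rgb(labels_ncut, img, kind='avg')
```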

Relevance:

30.00%

Publisher:

Abstract:

We propose a novel method to calculate the electronic Density of States (DOS) of a two-dimensional disordered binary alloy. The method is highly reliable and numerically efficient, and Short Range Order (SRO) correlations can be included at no extra computational cost. The approach devised rests on one-dimensional calculations and is applied to very long stripes of finite width, the bulk regime being achieved with a relatively small number of chains in the disordered case. Our approach is exact for the pure case and predicts the correct DOS structure in important limits, such as the segregated, random, and ordered alloy regimes. We also suggest important extensions of the present work.
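
A brute-force counterpart of such a calculation is to diagonalize a tight-binding Hamiltonian for a finite stripe with random binary on-site energies and histogram the eigenvalues. The sketch below does exactly that; it is not the paper's efficient one-dimensional scheme, includes no SRO correlations, and all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def stripe_dos(width=8, length=200, eps_a=-1.0, eps_b=1.0, conc=0.5, t=1.0, bins=200):
    # random A/B on-site energies on a width x length square-lattice stripe
    n = width * length
    onsite = rng.choice([eps_a, eps_b], size=n, p=[conc, 1 - conc])
    H = np.diag(onsite)
    idx = lambda x, y: x * width + y
    for x in range(length):
        for y in range(width):
            if y + 1 < width:   # rung hopping
                H[idx(x, y), idx(x, y + 1)] = H[idx(x, y + 1), idx(x, y)] = -t
            if x + 1 < length:  # hopping along the stripe
                H[idx(x, y), idx(x + 1, y)] = H[idx(x + 1, y), idx(x, y)] = -t
    energies = np.linalg.eigvalsh(H)
    dos, edges = np.histogram(energies, bins=bins, density=True)
    return 0.5 * (edges[1:] + edges[:-1]), dos

e, dos = stripe_dos()
print("DOS maximum near E =", round(e[np.argmax(dos)], 2))
```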

Relevance:

30.00%

Publisher:

Abstract:

Biocompatible inorganic nano- and microcarriers can be suitable candidates for protein delivery. This study demonstrates facile methods of functionalization by using nanoscale linker molecules to change the protein adsorption capacity of hydroxyapatite (HA) powder. The adsorption capacity of bovine serum albumin as a model protein has been studied with respect to the surface modifications. The selected linker molecules (lysine, arginine, and phosphoserine) can influence the adsorption capacity by changing the electrostatic nature of the HA surface. Qualitative and quantitative analyses of linker-molecule interactions with the HA surface have been performed by using NMR spectroscopy, zeta-potential measurements, X-ray photoelectron spectroscopy, and thermogravimetric analyses. Additionally, correlations to theoretical isotherm models have been calculated with respect to Langmuir and Freundlich isotherms. Lysine and arginine increased the protein adsorption, whereas phosphoserine reduced the protein adsorption. The results show that the adsorption capacity can be controlled with different functionalization, depending on the protein-carrier selections under consideration. The scientific knowledge acquired from this study can be applied in various biotechnological applications that involve biomolecule-inorganic material interfaces.
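
The Langmuir and Freundlich correlations mentioned above are routinely obtained by nonlinear least squares; a minimal scipy sketch, using the standard functional forms q = q_max·K·c/(1 + K·c) and q = K_f·c^(1/n) and hypothetical adsorption data:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, K):
    return q_max * K * c / (1.0 + K * c)

def freundlich(c, K_f, n):
    return K_f * c ** (1.0 / n)

# hypothetical equilibrium concentrations (mg/mL) and adsorbed amounts (mg/g)
c = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 4.0])
q = np.array([12.0, 24.0, 38.0, 52.0, 63.0, 70.0])

popt_l, _ = curve_fit(langmuir, c, q, p0=[80.0, 1.0])
popt_f, _ = curve_fit(freundlich, c, q, p0=[50.0, 3.0])
print("Langmuir  q_max=%.1f, K=%.2f" % tuple(popt_l))
print("Freundlich K_f=%.1f, n=%.2f" % tuple(popt_f))
```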

Relevance:

30.00%

Publisher:

Abstract:

The objective of this paper is to present a benefit-cost ranking of 127 civil transport aircraft. The ranking was determined using a new data envelopment analysis (DEA) approach, called the triple index, which combines three assessment methods: 1) the standard frontier; 2) the inverted index; and 3) the cross-multiplicative index. The analysis used the following inputs: a) market price; b) direct operating costs; and the following outputs: a) payload; b) cruise speed; c) maximum rate of climb with a single engine. To ensure the homogeneity of the units, the aircraft were divided according to propulsion system (jet and turboprop) and size (regional, narrow-body and wide-body); they were also evaluated over different ranges in order to identify the aircraft with the best cost-benefit relationship for each option.
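
Of the three methods, the standard frontier is the classic multiplier-form CCR efficiency, computable as one linear program per unit, and the inverted index can be sketched by swapping the roles of inputs and outputs. A minimal scipy sketch with hypothetical aircraft data; the paper's triple-index combination itself is not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    # multiplier-form CCR DEA: for each unit k, maximize u.y_k subject to
    # v.x_k = 1 and u.y_j - v.x_j <= 0 for all units j, with u, v >= 0
    n, m = X.shape
    s = Y.shape[1]
    A_ub = np.hstack([Y, -X])                     # decision vector z = [u, v]
    eff = np.empty(n)
    for k in range(n):
        c = np.concatenate([-Y[k], np.zeros(m)])  # linprog minimizes, so negate
        A_eq = np.concatenate([np.zeros(s), X[k]])[None, :]
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (s + m), method="highs")
        eff[k] = -res.fun
    return eff

# hypothetical aircraft: inputs [price, operating cost], outputs [payload, speed, climb rate]
X = np.array([[50.0, 3.2], [78.0, 4.1], [95.0, 5.0]])
Y = np.array([[18.0, 780.0, 2.1], [32.0, 820.0, 2.4], [35.0, 830.0, 2.3]])
print("standard frontier:", ccr_efficiency(X, Y).round(3))
print("inverted frontier:", ccr_efficiency(Y, X).round(3))
```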

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

A demographic model is developed based on interbirth intervals and is applied to estimate the population growth rate of humpback whales (Megaptera novaeangliae) in the Gulf of Maine. Fecundity rates in this model are based on the probabilities of giving birth at time t after a previous birth and on the probabilities of giving birth first at age x. Maximum likelihood methods are used to estimate these probabilities using sighting data collected for individually identified whales. Female survival rates are estimated from these same sighting data using a modified Jolly–Seber method. The youngest age at first parturition is 5 yr, the estimated mean birth interval is 2.38 yr (SE = 0.10 yr), the estimated noncalf survival rate is 0.960 (SE = 0.008), and the estimated calf survival rate is 0.875 (SE = 0.047). The population growth rate (λ) is estimated to be 1.065; its standard error is estimated as 0.012 using a Monte Carlo approach, which simulated sampling from a hypothetical population of whales. The simulation is also used to investigate the bias in estimating birth intervals by previous methods. The approach developed here is applicable to studies of other populations for which individual interbirth intervals can be measured.
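
As a rough cross-check (not the paper's interbirth-interval likelihood), a stage-structured projection matrix built from the reported point estimates yields a dominant eigenvalue near the published growth rate. The 1:1 calf sex ratio and the constant annual calving probability of 1/2.38 are simplifying assumptions:

```python
import numpy as np

# reported point estimates: calf survival 0.875, noncalf survival 0.960,
# mean birth interval 2.38 yr, first parturition at age 5
s_calf, s_adult, interval = 0.875, 0.960, 2.38
f = 0.5 / interval   # female calves per adult female per year (assumed 1:1 sex ratio)

# stages: calf (first year), four juvenile ages, reproducing adult
A = np.zeros((6, 6))
A[0, 5] = f                  # adults produce calves
A[1, 0] = s_calf             # calf survives its first year
for i in range(2, 5):
    A[i, i - 1] = s_adult    # juvenile transitions
A[5, 4] = s_adult            # recruitment into the adult stage
A[5, 5] = s_adult            # adult survival

lam = np.linalg.eigvals(A).real.max()
print(f"lambda = {lam:.3f}")   # about 1.07, near the published 1.065
```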

Relevance:

30.00%

Publisher:

Abstract:

We consider a fully model-based approach for the analysis of distance sampling data. Distance sampling has been widely used to estimate abundance (or density) of animals or plants in a spatially explicit study area. There is, however, no readily available method of making statistical inference on the relationships between abundance and environmental covariates. Spatial Poisson process likelihoods can be used to simultaneously estimate detection and intensity parameters by modeling distance sampling data as a thinned spatial point process. A model-based spatial approach to distance sampling data has three main benefits: it allows complex and opportunistic transect designs to be employed, it allows estimation of abundance in small subregions, and it provides a framework to assess the effects of habitat or experimental manipulation on density. We demonstrate the model-based methodology with a small simulation study and an analysis of the Dubbo weed data set, and we also propose a simple ad hoc method for handling overdispersion. The simulation study showed that the model-based approach compared favorably to conventional distance sampling methods for abundance estimation. In addition, the overdispersion correction performed adequately when the number of transects was high. Analysis of the Dubbo data set indicated a transect effect on abundance via Akaike’s information criterion model selection. Further goodness-of-fit analysis, however, indicated some potential confounding of intensity with the detection function.
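
For orientation, the sketch below runs the conventional (non-spatial) calculation that the model-based approach generalizes: simulate uniformly placed animals, thin them with a half-normal detection function, fit the scale by maximum likelihood, and convert the result to a density estimate. All parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.special import erf

rng = np.random.default_rng(1)

# homogeneous Poisson animals, thinned by g(x) = exp(-x^2 / (2 sigma^2))
sigma_true, w, L, D_true = 1.0, 3.0, 500.0, 2.0   # truncation w, total line length L
n_pot = rng.poisson(D_true * 2 * w * L)
x = rng.uniform(0, w, n_pot)
detected = x[rng.random(n_pot) < np.exp(-x**2 / (2 * sigma_true**2))]

def esw(sigma):
    # effective strip half-width: integral of g from 0 to w
    return sigma * np.sqrt(np.pi / 2) * erf(w / (sigma * np.sqrt(2)))

def negloglik(sigma):
    # density of an observed distance is g(x) / esw(sigma)
    return -np.sum(-detected**2 / (2 * sigma**2) - np.log(esw(sigma)))

sigma_hat = minimize_scalar(negloglik, bounds=(0.1, 10.0), method="bounded").x
D_hat = detected.size / (2 * esw(sigma_hat) * L)
print(f"sigma_hat={sigma_hat:.2f}, density estimate={D_hat:.2f} (true {D_true})")
```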

Relevance:

30.00%

Publisher:

Abstract:

Double-observer line transect methods are becoming increasingly widespread, especially for the estimation of marine mammal abundance from aerial and shipboard surveys when detection of animals on the line is uncertain. The resulting data supplement conventional distance sampling data with two-sample mark–recapture data. Like conventional mark–recapture data, these have inherent problems for estimating abundance in the presence of heterogeneity. Unlike conventional mark–recapture methods, line transect methods use knowledge of the distribution of a covariate, which affects detection probability (namely, distance from the transect line) in inference. This knowledge can be used to diagnose unmodeled heterogeneity in the mark–recapture component of the data. By modeling the covariance in detection probabilities with distance, we show how the estimation problem can be formulated in terms of different levels of independence. At one extreme, full independence is assumed, as in the Petersen estimator (which does not use distance data); at the other extreme, independence only occurs in the limit as detection probability tends to one. Between the two extremes, there is a range of models, including those currently in common use, which have intermediate levels of independence. We show how this framework can be used to provide more reliable analysis of double-observer line transect data. We test the methods by simulation, and by analysis of a dataset for which true abundance is known. We illustrate the approach through analysis of minke whale sightings data from the North Sea and adjacent waters.
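
At the full-independence extreme described above, the mark-recapture component reduces to the Petersen estimator, which ignores the distance data entirely. A minimal sketch with hypothetical double-observer counts:

```python
# hypothetical sighting totals: n1 by observer 1, n2 by observer 2, m by both
n1, n2, m = 142, 118, 67

p1, p2 = m / n2, m / n1                          # detection probabilities under independence
N_petersen = n1 * n2 / m                         # full-independence abundance estimate
N_chapman = (n1 + 1) * (n2 + 1) / (m + 1) - 1    # bias-corrected variant
print(f"p1={p1:.2f}, p2={p2:.2f}, Petersen={N_petersen:.0f}, Chapman={N_chapman:.0f}")
```

Unmodeled heterogeneity in detection probabilities biases such estimates low; that is exactly the problem the intermediate-independence models described above are designed to diagnose and address.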

Relevance:

30.00%

Publisher:

Abstract:

Static analysis tools report software defects that may or may not be detected by other verification methods. Two challenges complicating the adoption of these tools are spurious false positive warnings and legitimate warnings that are not acted on. This paper reports automated support to help address these challenges using logistic regression models that predict the foregoing types of warnings from signals in the warnings and implicated code. Because examining many potential signaling factors in large software development settings can be expensive, we use a screening methodology to quickly discard factors with low predictive power and cost-effectively build predictive models. Our empirical evaluation indicates that these models can achieve high accuracy in predicting accurate and actionable static analysis warnings, and suggests that the models are competitive with alternative models built without screening.
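
A minimal sketch of the screen-then-model idea with scikit-learn; the synthetic dataset and the choice of a univariate F-test for screening are assumptions, standing in for the paper's actual warning factors and screening methodology:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# synthetic stand-in: rows are warnings, columns are candidate signaling
# factors (code churn, file size, warning priority, ...); label = acted on?
X = rng.normal(size=(500, 40))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 0).astype(int)

# screening: keep the k factors with the strongest univariate signal,
# then fit the logistic regression on the survivors only
model = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=8),
    LogisticRegression(max_iter=1000),
)
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=5).mean().round(3))
```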