871 results for Rejection-sampling Algorithm
Abstract:
Considering that the sampling intensity of soil attributes is a determining factor for applying precision agriculture concepts, this study aims to determine the spatial distribution pattern of soil attributes and corn yield at four soil sampling intensities and to verify how sampling intensity affects the cause-effect relationship between soil attributes and corn yield. A grid of 100 referenced sample points was imposed on the experimental site, so that each sampling cell covered an area of 45 m² and comprised five 10-m long crop rows, with the referenced point at the center of the cell. Samples were taken at depths of 0 to 0.1 m and 0.1 to 0.2 m. Soil chemical attributes and clay content were evaluated. Sampling intensities were established from the initial 100-point sampling, resulting in data sets of 100, 75, 50 and 25 points. The data were subjected to descriptive statistical and geostatistical analyses. The best sampling intensity for characterizing the spatial distribution pattern depended on the soil attribute studied. P and K+ contents showed higher spatial variability, while clay content, Ca2+, Mg2+ and base saturation (V) showed lower spatial variability. The spatial distribution patterns of clay content and V at the 100-point sampling were the ones that best explained the spatial distribution pattern of corn yield.
Abstract:
This work presents software, developed in the Delphi programming language, to compute a reservoir's annual regulated active storage based on the sequent-peak algorithm. Mathematical models used for this purpose generally require long hydrological series, whose analysis is usually performed with spreadsheets or graphical representations; on that basis, software was developed to calculate reservoir active capacity. An example calculation is shown using 30 years (1977 to 2009) of historical monthly mean flow data from the Corrente River, in the São Francisco River Basin, Brazil. As an additional tool, an interface was developed to support water resources management, helping to manipulate data and to highlight information of interest to the user. Moreover, with that interface, irrigation districts where water consumption is higher can be analyzed as a function of specific seasonal water demand scenarios. A practical application shows that the program performs the calculation originally proposed. It was designed to keep information organized and retrievable at any time, and to simulate seasonal water demands throughout the year, contributing to studies concerning reservoir projects. With this functionality, the program is an important tool for decision making in water resources management.
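The sequent-peak computation at the core of the software described above can be sketched in a few lines. The sketch below is in Python rather than Delphi, and all function and variable names are illustrative assumptions, not taken from the program itself:

```python
def sequent_peak(inflows, demand):
    """Estimate required active storage with the sequent-peak algorithm.

    inflows: sequence of monthly inflow volumes; demand: a scalar or a
    per-month sequence in the same units (names are illustrative).
    The required storage is the largest cumulative deficit of inflow
    relative to demand, with the deficit reset whenever it would go negative.
    """
    n = len(inflows)
    demands = [demand] * n if isinstance(demand, (int, float)) else list(demand)
    deficit = 0.0      # running cumulative (demand - inflow), floored at zero
    required = 0.0     # largest deficit seen so far = active storage
    for q, d in zip(inflows, demands):
        deficit = max(0.0, deficit + d - q)
        required = max(required, deficit)
    return required
```

Classic formulations often run the series over two consecutive cycles of the record so that a critical period spanning the record boundary is not missed; the single pass above is the minimal form.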
Abstract:
Several types of equipment and methodologies have been developed to make precision agriculture feasible, especially considering the high cost of its implementation and sampling. An interesting possibility is to divide production areas into smaller management zones that can be treated differently, serving as a basis for recommendation and analysis. Thus, this trial used soil physical and chemical properties and yield to generate management zones and to identify whether they can be used for recommendation and analysis. Management zones were generated with the Fuzzy C-Means algorithm and evaluated by calculating the reduction of variance and performing means tests. The division of the area into two management zones was considered appropriate, since the averages of most soil properties and yield were distinct between zones. The methodology used allowed the generation of management zones that can serve as a basis for recommendation and soil analysis; although the relative efficiency showed reduced variance for all attributes when the area was divided into three sub-regions, the ANOVA did not show significant differences among the management zones.
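A minimal sketch of the Fuzzy C-Means algorithm used above to generate management zones. The defaults (fuzziness exponent m = 2, random initialization) and all names are illustrative assumptions, not the study's actual configuration:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, tol=1e-5, seed=0):
    """Minimal Fuzzy C-Means sketch (parameter names are illustrative).

    X: (n, d) array of attribute values per sample point; c: number of
    management zones; m: fuzziness exponent (> 1).
    Returns (membership matrix U of shape (c, n), zone centers).
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)                        # memberships sum to 1 per point
    for _ in range(iters):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # distance of every point to every center, shape (c, n)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2)
        d = np.fmax(d, 1e-12)                 # guard against zero distance
        new_U = 1.0 / (d ** (2.0 / (m - 1.0)))
        new_U /= new_U.sum(axis=0)            # standard FCM membership update
        if np.abs(new_U - U).max() < tol:
            U = new_U
            break
        U = new_U
    return U, centers
```

Hardening the memberships with `U.argmax(axis=0)` assigns each point to a zone, after which the study's variance-reduction and means-test evaluation can be applied per zone.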
Abstract:
This study aimed to compare thematic maps of soybean yield for different sampling grids, using geostatistical methods (semivariance function and kriging). The analysis was performed with soybean yield data in t ha-1 from a commercial area, on regular grids with distances between points of 25x25 m, 50x50 m, 75x75 m and 100x100 m (549, 188, 66 and 44 sampling points, respectively), and with data obtained by yield monitors. Optimized sampling schemes were also generated with the Simulated Annealing algorithm, using maximization of the overall accuracy measure as the optimization criterion. The results showed that sample size and sample density influenced the description of the spatial distribution of soybean yield. Increasing the sample size increased the efficiency of the thematic maps used to describe the spatial variability of soybean yield (higher accuracy indices and lower sums of squared estimation errors). In addition, more accurate maps were obtained, especially with the optimized sample configurations of 188 and 549 points.
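The Simulated Annealing optimization of a sampling scheme can be sketched as below. The study maximized an overall map accuracy measure; here the criterion is a pluggable `score` function, and all names, the cooling schedule, and the swap move are illustrative assumptions:

```python
import math
import random

def anneal_sample_design(points, k, score, iters=2000, t0=1.0, cooling=0.995, seed=0):
    """Simulated Annealing sketch for choosing a k-point sampling scheme.

    points: candidate sample locations; score: maps a subset of points to a
    quality value to MAXIMIZE. Returns the best subset found.
    """
    rng = random.Random(seed)
    current = rng.sample(range(len(points)), k)
    best, f_best = list(current), score([points[i] for i in current])
    f_cur = f_best
    t = t0
    for _ in range(iters):
        cand = list(current)
        out = rng.randrange(k)                 # swap one chosen point ...
        pool = [i for i in range(len(points)) if i not in current]
        cand[out] = rng.choice(pool)           # ... for an unchosen one
        f_new = score([points[i] for i in cand])
        # accept improvements always, worsenings with temperature-dependent odds
        if f_new >= f_cur or rng.random() < math.exp((f_new - f_cur) / t):
            current, f_cur = cand, f_new
            if f_cur > f_best:
                best, f_best = list(current), f_cur
        t *= cooling                           # geometric cooling
    return [points[i] for i in best]
```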
Abstract:
Parameter estimation remains a challenge in many important applications, and there is a need for methods that exploit the growing capabilities of modern computational systems. For this reason, Evolutionary Algorithms of various kinds are becoming an especially promising field of research. The main aim of this thesis is to explore theoretical aspects of a specific class of Evolutionary Algorithms, the Differential Evolution (DE) method, and to implement the algorithm as code capable of solving a wide range of problems. Matlab, a numerical computing environment provided by MathWorks Inc., was used for this purpose. Our implementation empirically demonstrates the benefits of stochastic optimizers over deterministic optimizers for stochastic and chaotic problems. Furthermore, the advanced features of Differential Evolution are discussed and taken into account in the Matlab implementation. Test "toy case" examples are presented to show the advantages and disadvantages introduced by extensions of the basic algorithm. Another aim of this thesis is to apply the DE approach to the parameter estimation problem for systems exhibiting chaotic behavior, taking the well-known Lorenz system with a specific set of parameter values as an example. Finally, the DE approach for estimating chaotic dynamics is compared to the Ensemble Prediction and Parameter Estimation System (EPPES) approach, which was recently proposed as a possible solution for similar problems.
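The basic method the thesis builds on is the classic DE/rand/1/bin scheme. The Python sketch below is a generic textbook version with typical defaults (F = 0.8, CR = 0.9), not the thesis's Matlab code:

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9, gens=200, seed=0):
    """Basic DE/rand/1/bin sketch: f is the objective to minimize,
    bounds is a (lo, hi) pair of per-dimension arrays."""
    rng = np.random.default_rng(seed)
    lo, hi = map(np.asarray, bounds)
    d = lo.size
    pop = lo + rng.random((pop_size, d)) * (hi - lo)   # random initial population
    fit = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # pick three distinct members other than i
            a, b, c = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            mutant = np.clip(pop[a] + F * (pop[b] - pop[c]), lo, hi)
            cross = rng.random(d) < CR
            cross[rng.integers(d)] = True              # at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            ft = f(trial)
            if ft <= fit[i]:                           # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    k = fit.argmin()
    return pop[k], fit[k]
```

The greedy selection step is what makes DE robust on noisy and chaotic objectives: a trial vector replaces its parent only if it is at least as good.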
Abstract:
The aim of this study was to compare two methods of tear sampling for protein quantification. Tear samples were collected from 29 healthy dogs (58 eyes) using Schirmer tear test (STT) strips and microcapillary tubes. The samples were frozen at -80ºC and analyzed by the Bradford method. Results were analyzed by Student's t test. The average protein concentration and standard deviation of tears collected with microcapillary tubes were 4.45 mg/mL ±0.35 and 4.52 mg/mL ±0.29 for right and left eyes, respectively. The average protein concentration and standard deviation of tears collected with STT strips were 54.5 mg/mL ±0.63 and 54.15 mg/mL ±0.65 for right and left eyes, respectively. Statistically significant differences (p<0.001) were found between the methods. Under the conditions of this study, the average protein concentration obtained with the Bradford test from tear samples collected by STT strip was higher than that obtained with microcapillary tubes. It is therefore important that reference values for tear protein concentration be interpreted according to the method used to collect the samples.
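The comparison reported above rests on the two-sample Student's t statistic, which can be computed directly. This is a generic equal-variance sketch (the abstract does not state whether a paired or unpaired form was used), with illustrative names:

```python
from statistics import mean, stdev

def students_t(sample_a, sample_b):
    """Two-sample Student's t statistic (equal-variance form).

    Returns (t, degrees_of_freedom); compare |t| against the t
    distribution with the returned degrees of freedom for a p-value.
    """
    na, nb = len(sample_a), len(sample_b)
    ma, mb = mean(sample_a), mean(sample_b)
    # pooled variance across the two samples
    sp2 = ((na - 1) * stdev(sample_a) ** 2 +
           (nb - 1) * stdev(sample_b) ** 2) / (na + nb - 2)
    t = (ma - mb) / (sp2 * (1 / na + 1 / nb)) ** 0.5
    return t, na + nb - 2
```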
Abstract:
The determination of the intersection curve between Bézier surfaces may be seen as the composition of two separate problems: determining initial points and tracing the intersection curve from these points. A Bézier surface is represented by a parametric function (a polynomial in two variables) that maps points from the two-dimensional parametric space into three-dimensional space. This article proposes an algorithm to determine the initial points of the intersection curve of Bézier surfaces, based on solving polynomial systems with the Projected Polyhedral Method, followed by a method for tracing the intersection curves (a Marching Method with differential equations). To allow the use of the Projected Polyhedral Method, the equations of the system must be represented in the Bernstein basis; toward this goal, a robust and reliable algorithm is proposed to exactly convert a multivariate polynomial from the power basis to the Bernstein basis.
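The article's conversion handles multivariate polynomials; the univariate case, which the multivariate one applies per variable, follows the closed form b_k = Σ_{i≤k} (C(k,i)/C(n,i))·a_i for p(t) = Σ a_i t^i on [0,1]. A small sketch (function name is illustrative):

```python
from math import comb

def power_to_bernstein(a):
    """Convert power-basis coefficients a[i] of p(t) = sum a[i] * t**i
    into Bernstein coefficients b[k] on [0, 1], so that
    p(t) = sum b[k] * C(n, k) * t**k * (1 - t)**(n - k).

    Uses the exact formula b[k] = sum_{i<=k} C(k, i) / C(n, i) * a[i].
    """
    n = len(a) - 1
    return [sum(comb(k, i) * a[i] / comb(n, i) for i in range(k + 1))
            for k in range(n + 1)]
```

For example, p(t) = t of degree 2 (coefficients [0, 1, 0]) has Bernstein coefficients [0, 1/2, 1], since (1/2)·2t(1-t) + t² = t. Exactness over the rationals (as the article requires) would follow by replacing the float division with `fractions.Fraction`.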
Abstract:
In this paper we present an algorithm for the numerical simulation of cavitation in the hydrodynamic lubrication of journal bearings. Although this physical process is usually modelled as a free boundary problem, we adopt the equivalent variational inequality formulation. We propose a two-level iterative algorithm: the outer iteration is associated with the penalty method, used to transform the variational inequality into a variational equation, and the inner iteration is associated with the conjugate gradient method, used to solve the linear system generated by applying the finite element method to the variational equation. The inner part was implemented using the element-by-element strategy, which is easily parallelized. We analyse the behavior of two physical parameters and discuss some numerical results, as well as results related to the performance of a parallel implementation of the algorithm.
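The inner iteration of the two-level scheme is a standard conjugate gradient solve. The sketch below is the plain dense-matrix form of CG for symmetric positive-definite systems; the paper's element-by-element, parallel FEM version applies the same recurrence with matrix-vector products assembled per element:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Plain CG for a symmetric positive-definite system A x = b.

    A sketch of the inner iteration; A is a dense array here, whereas an
    element-by-element implementation never assembles A explicitly.
    """
    x = np.zeros_like(b)
    r = b - A @ x            # initial residual
    p = r.copy()             # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if rs_new ** 0.5 < tol:
            break
        p = r + (rs_new / rs) * p   # conjugate direction update
        rs = rs_new
    return x
```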
Abstract:
Global illumination algorithms are at the center of realistic image synthesis and account for non-trivial light transport and occlusion within scenes, such as indirect illumination, ambient occlusion, and environment lighting. Their computationally most difficult part is determining light source visibility at each visible scene point. Height fields, on the other hand, constitute an important special case of geometry and are mainly used to describe certain types of objects such as terrains and to map detailed geometry onto object surfaces. The geometry of an entire scene can also be approximated by treating the distance values of its camera projection as a screen-space height field. In order to shadow height fields from environment lights, a horizon map is usually used to occlude incident light. We reduce the per-receiver time complexity of generating the horizon map on N × N height fields from the O(N) of previous work to O(1) by using an algorithm that incrementally traverses the height field and reuses the information already gathered along the path of traversal. We also propose an accurate method to integrate the incident light within the limits given by the horizon map. Indirect illumination in height fields requires information about which other points are visible to each height field point. We present an algorithm to determine this intervisibility in a time complexity that matches the space complexity of the produced visibility information, in contrast to previous methods, which scale with the height field size. As a result, the amount of computation is reduced by two orders of magnitude in common use cases. Screen-space ambient obscurance methods approximate ambient obscurance from the depth buffer geometry and have been widely adopted by contemporary real-time applications.
They work by sampling the screen-space geometry around each receiver point, but have previously been limited to near-field effects because sampling a large radius quickly exceeds the render time budget. We present an algorithm that reduces the quadratic per-pixel complexity of previous methods to a linear complexity by line-sweeping over the depth buffer and maintaining an internal representation of the processed geometry from which occluders can be efficiently queried. Another algorithm is presented to determine ambient obscurance from the entire depth buffer at each screen pixel. The algorithm scans the depth buffer in a quick pre-pass and locates important features in it, which are then used to evaluate the ambient obscurance integral accurately. We also propose an evaluation of the integral such that results within a few percent of the ray-traced screen-space reference are obtained at real-time render times.
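The quantity at the heart of the horizon-map work above is the horizon angle of each receiver along a sweep direction. The sketch below computes it brute-force on a 1-D height field for clarity; this is the O(N)-per-receiver baseline that the thesis's incremental sweep reduces to amortized O(1) (names and the one-direction restriction are illustrative):

```python
from math import atan2

def horizon_angles(heights, spacing=1.0):
    """Per-point horizon angle along one direction of a 1-D height field.

    For each point i, find the maximum elevation angle toward any earlier
    point j < i; light from below this angle is occluded. Brute force for
    clarity: O(N) work per receiver.
    """
    n = len(heights)
    out = []
    for i in range(n):
        best = float("-inf")            # no occluder yet
        for j in range(i):
            best = max(best, atan2(heights[j] - heights[i], (i - j) * spacing))
        out.append(best)
    return out
```

A full horizon map repeats this over many azimuthal directions and both sweep orientations; the incremental algorithm instead carries forward the occluder information gathered along the traversal so each receiver does constant extra work.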
Abstract:
The purpose of the present study was to investigate the expression (mRNA) of CD40 ligand (CD40L), interferon-gamma (IFN-gamma) and Fas ligand (FasL) genes in human cardiac allografts in relation to the occurrence of acute cardiac allograft rejection as well as its possible value in predicting acute rejection. The mRNA levels were determined by a semiquantitative reverse transcriptase-polymerase chain reaction method in 39 samples of endomyocardial biopsies obtained from 10 adult cardiac transplant recipients within the first six months after transplantation. Biopsies with ongoing acute rejection showed significantly higher CD40L, IFN-gamma and FasL mRNA expression than biopsies without rejection. The median values of mRNA expression in biopsies with and without rejection were 0.116 and zero for CD40L (P<0.003), 0.080 and zero for IFN-gamma (P<0.0009), and 0.156 and zero for FasL (P<0.002), respectively. In addition, the levels of IFN-gamma mRNA were significantly increased 7 to 15 days before the appearance of histological evidence of rejection (median of 0.086 in pre-rejection biopsies), i.e., they presented a predictive value. This study provides further evidence of heightened expression of immune activation genes during rejection and shows that some of these markers may present predictive value for the occurrence of acute rejection.
Abstract:
Bioanalytical data from a bioequivalence study were used to develop limited-sampling strategy (LSS) models for estimating the area under the plasma concentration versus time curve (AUC) and the peak plasma concentration (Cmax) of 4-methylaminoantipyrine (MAA), an active metabolite of dipyrone. Twelve healthy adult male volunteers received single 600 mg oral doses of dipyrone in two formulations at a 7-day interval in a randomized, crossover protocol. Plasma concentrations of MAA (N = 336), measured by HPLC, were used to develop LSS models. Linear regression analysis and a "jack-knife" validation procedure revealed that the AUC0-∞ and the Cmax of MAA can be accurately predicted (R²>0.95, bias <1.5%, precision between 3.1 and 8.3%) by LSS models based on two sampling times. Validation tests indicate that the most informative 2-point LSS models developed for one formulation provide good estimates (R²>0.85) of the AUC0-∞ or Cmax for the other formulation. LSS models based on three sampling points (1.5, 4 and 24 h), but using different coefficients for AUC0-∞ and Cmax, predicted the individual values of both parameters for the enrolled volunteers (R²>0.88, bias = -0.65 and -0.37%, precision = 4.3 and 7.4%) as well as for plasma concentration data sets generated by simulation (R²>0.88, bias = -1.9 and 8.5%, precision = 5.2 and 8.7%). Bioequivalence assessment of the dipyrone formulations based on the 90% confidence interval of log-transformed AUC0-∞ and Cmax provided similar results when either the best-estimated or the LSS-derived metrics were used.
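A limited-sampling model of the kind described is an ordinary least-squares regression AUC ≈ b0 + Σ b_i·C(t_i) over a few sampling times. The generic sketch below fits and applies such a model; the function names and the synthetic coefficients in the example are illustrative, not the study's actual model:

```python
import numpy as np

def fit_lss(conc_at_times, target):
    """Fit a limited-sampling model target ≈ b0 + sum_i b_i * C(t_i)
    by ordinary least squares.

    conc_at_times: (n_subjects, n_times) concentrations at the chosen
    sampling times; target: per-subject AUC (or Cmax) values.
    Returns the coefficient vector [b0, b1, ..., b_k].
    """
    X = np.column_stack([np.ones(len(target)), conc_at_times])
    coef, *_ = np.linalg.lstsq(X, np.asarray(target), rcond=None)
    return coef

def predict_lss(coef, conc_at_times):
    """Apply a fitted limited-sampling model to new concentration data."""
    X = np.column_stack([np.ones(len(conc_at_times)), conc_at_times])
    return X @ coef
```

In practice the model is fitted on one subject group (or with jack-knife resampling, as in the study) and validated by comparing predictions against the best-estimated AUC and Cmax.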
Abstract:
Organ transplantation can be considered replacement therapy for patients with end-stage organ failure. One-year allograft survival has increased due, among other factors, to a better understanding of the rejection process and to new immunosuppressive drugs. Immunosuppressive therapy used in transplantation prevents activation and proliferation of alloreactive T lymphocytes, although it does not fully prevent chronic rejection. Recognition by recipient T cells of alloantigens expressed by donor tissues initiates immune destruction of allogeneic transplants. However, there is controversy concerning the relative contribution of CD4+ and CD8+ T cells to allograft rejection. Some animal models indicate that there is an absolute requirement for CD4+ T cells in allogeneic rejection, whereas in others CD4-depleted mice reject certain types of allografts. Moreover, there is evidence that CD8+ T cells are more resistant to immunotherapy and tolerance induction protocols. An intense focal infiltration of mainly CD8+CTLA4+ T lymphocytes during kidney rejection has been described in patients, suggesting that CD8+ T cells could escape immunosuppression and participate in the rejection process. Our group is primarily interested in the immune mechanisms involved in allograft rejection, and we believe that a better understanding of the role of CD8+ T cells in allograft rejection could indicate new targets for immunotherapy in transplantation. The objective of the present review was therefore to focus on the role of the CD8+ T cell population in the rejection of allogeneic tissue.
Abstract:
Since the late 1990s, a group of moral doctrines called prioritarianism has received a great deal of interest from moral philosophers. Many contemporary moral philosophers are attracted to prioritarianism to such an extent that they can be called prioritarians. In this book, however, I reject prioritarianism, including not only "pure" prioritarianism but also hybrid prioritarian views which mix one or more non-prioritarian elements with prioritarianism. This book largely revolves around certain problems and complications of prioritarianism and its particular forms. Those problems and complications are connected to risk, impartiality, the arbitrariness of prioritarian weightings and possible future individuals. On the one hand, I challenge prioritarianism through targeted objections to various specific forms of prioritarianism. All those targeted objections are connected to risk or possible future individuals; it seems to me that together they give good grounds for believing that prioritarianism is not the way to go. On the other hand, I challenge prioritarianism by pointing out and discussing certain general problems of prioritarianism. Those general problems are connected to impartiality and the arbitrariness of prioritarian weightings, and they may give additional grounds for believing that all prioritarian views should be rejected. Prioritarianism can be seen as a type of weighted utilitarianism and thus as an extension of utilitarianism. Utilitarianism is morally ultimately concerned, and morally ultimately concerned only, with some kind of maximization of utility or expected utility. Prioritarianism, on the other hand, is morally ultimately concerned, and morally ultimately concerned only, with some kind of maximization of priority-weighted utility, expected priority-weighted utility or priority-weighted expected utility. Thus prioritarianism, unlike utilitarianism, is a distribution-sensitive moral view.
Besides rejecting prioritarianism, I also reject various other distribution-sensitive moral views in this book. However, I do not reject distribution-sensitivity in morality, as I end up endorsing a type of distribution-sensitive hybrid utilitarianism which mixes non-utilitarian elements with utilitarianism.
Abstract:
We studied the effect of oral and portal vein administration of alloantigens on mouse skin allograft survival. Graft recipient BALB/c mice received spleen cells (30, 90, 150 or 375 x 10(6)) from donor C57BL/6 mice intragastrically on three successive days, starting seven days before the skin graft. Allograft survival was significantly increased by feeding 150 x 10(6) allogeneic spleen cells in a single gavage (median survival of 12 vs 14 days, P <= 0.005) or 300 x 10(6) cells given in six gavages (12 vs 14 days, P < 0.04). A similar effect was observed when 150 x 10(6) spleen cells were injected into the portal vein (12 vs 14 days, P <= 0.03). Furthermore, prolonged allograft survival was observed with subcutaneous (12 vs 16 days, P <= 0.002) or systemic (12 vs 15 days, P <= 0.016) application of murine interleukin-4 (IL-4), alone or in combination with spleen cell injection into the portal vein (12 vs 18 days, P <= 0.0018). Taken together, these results showed that tolerance induction by oral administration or intraportal injection of spleen cells expressing fully incompatible antigens partially down-modulates skin allograft rejection. Furthermore, these findings demonstrated for the first time the effect of subcutaneous or systemic IL-4 application on skin allograft survival, suggesting its use as a beneficial support therapy in combination with a tolerance induction protocol.