994 results for parallel modeling
Abstract:
Crop models are simplified mathematical representations of the interacting biological and environmental components of the dynamic soil–plant–environment system. Sorghum crop modeling has evolved in parallel with crop modeling capability in general, since its origins in the 1960s and 1970s. Here we briefly review the trajectory in sorghum crop modeling leading to the development of advanced models. We then (i) overview the structure and function of the sorghum model in the Agricultural Production System sIMulator (APSIM) to exemplify advanced modeling concepts that suit both agronomic and breeding applications, (ii) review an example of use of sorghum modeling in supporting agronomic management decisions, (iii) review an example of the use of sorghum modeling in plant breeding, and (iv) consider implications for future roles of sorghum crop modeling. Modeling and simulation provide an avenue to explore consequences of crop management decision options in situations confronted with risks associated with seasonal climate uncertainties. Here we consider the possibility of manipulating planting configuration and density in sorghum as a means to manipulate the productivity–risk trade-off. A simulation analysis of decision options is presented and avenues for its use with decision-makers discussed. Modeling and simulation also provide opportunities to improve breeding efficiency by either dissecting complex traits to more amenable targets for genetics and breeding, or by trait evaluation via phenotypic prediction in target production regions to help prioritize effort and assess breeding strategies. Here we consider studies on the stay-green trait in sorghum, which confers yield advantage in water-limited situations, to exemplify both aspects. The possible future roles of sorghum modeling in agronomy and breeding are discussed as are opportunities related to their synergistic interaction. The potential to add significant value to the revolution in plant breeding associated with genomic technologies is identified as the new modeling frontier.
Abstract:
The goal of this study is to provide a framework for future researchers to understand and use the FARSITE wildfire-forecasting model with data assimilation. Current wildfire models lack the ability to predict fire-front position accurately faster than real time. Coupling FARSITE with a recursive ensemble filter improves the data-assimilation forecast. The scope includes an explanation of the standalone FARSITE application, technical details on FARSITE integration with a parallel program coupler called OpenPALM, and a demonstration of the FARSITE-Ensemble Kalman Filter software using the FireFlux I experiment by Craig Clements. The results show that the fire-front forecast is improved with the proposed data-driven methodology compared with the standalone FARSITE model.
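For orientation, the recursive ensemble filter referred to above is an Ensemble Kalman Filter; the following minimal Python sketch shows a generic perturbed-observation EnKF analysis step. It is illustrative only: it does not reproduce the FARSITE/OpenPALM coupling, and the state, observation operator, and noise levels are placeholder assumptions.

```python
import numpy as np

def enkf_analysis(X_f, y, H, R, rng=np.random.default_rng(0)):
    """Generic Ensemble Kalman Filter analysis step (illustrative sketch).

    X_f : (n_state, n_ens) forecast ensemble (e.g., fire-front positions)
    y   : (n_obs,) observation vector
    H   : (n_obs, n_state) linear observation operator
    R   : (n_obs, n_obs) observation error covariance
    """
    n_state, n_ens = X_f.shape
    x_mean = X_f.mean(axis=1, keepdims=True)
    A = X_f - x_mean                                    # ensemble anomalies
    P_f = A @ A.T / (n_ens - 1)                         # sample forecast covariance
    K = P_f @ H.T @ np.linalg.inv(H @ P_f @ H.T + R)    # Kalman gain
    # Perturbed-observation update: each member assimilates a noisy copy of y
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X_f + K @ (Y - H @ X_f)

# Tiny usage example with a 4-variable state and 2 observations (placeholder data)
rng = np.random.default_rng(1)
X_f = rng.normal(size=(4, 20))
H = np.eye(2, 4)
R = 0.1 * np.eye(2)
y = np.array([0.5, -0.2])
X_a = enkf_analysis(X_f, y, H, R)
print(X_a.mean(axis=1))
```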
Abstract:
Irradiance is the main input for producing electricity from solar energy. When an obstacle comes between the sun and a PV cell, the cell does not receive sufficient irradiance to produce its full output, so shading has a significant impact on photovoltaic cells. Previous work has shown that the shadow effect reduces PV output; some shading conditions merely lower the output, while others reduce it to zero, and shadows can be classified by their effect on the cell. The effect also depends on whether the PV cells are connected in series or in parallel: in a series string, one failed cell stops the whole string, whereas in a parallel connection the remaining cells continue to operate because they work independently. The arrangement of PV cells in series or parallel is chosen according to the output requirement. Simulink models were built for two PV cells connected in series and in parallel, and the shadow effect was analyzed on one of the cells by gradually decreasing its irradiance. The simulations give an idea of the system output when one of the PV cells is shaded, and the shadow effect on the two-cell series and parallel combinations is analyzed to understand its influence on output.
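To make the series-versus-parallel contrast concrete, here is a deliberately simplified sketch (not the SIMULINK model of the study) that assumes each cell's current scales with its irradiance and that the cells operate at a fixed voltage, with no bypass diodes; all numeric values are hypothetical.

```python
# Simplified illustration of partial shading on two identical PV cells.
# Assumption: each cell's current is proportional to its irradiance and the
# cells operate at a fixed voltage; bypass diodes are ignored.

CELL_VOLTAGE = 0.5          # volts per cell (typical order of magnitude)
CURRENT_AT_STC = 8.0        # amps at 1000 W/m^2 (hypothetical cell)

def cell_current(irradiance_w_m2):
    return CURRENT_AT_STC * irradiance_w_m2 / 1000.0

def series_power(g1, g2):
    # In series the string current is limited by the weakest (shaded) cell.
    i = min(cell_current(g1), cell_current(g2))
    return i * 2 * CELL_VOLTAGE

def parallel_power(g1, g2):
    # In parallel each cell contributes its own current at the common voltage.
    i = cell_current(g1) + cell_current(g2)
    return i * CELL_VOLTAGE

for shaded in (1000, 600, 200, 0):      # irradiance on the shaded cell, W/m^2
    print(shaded, series_power(1000, shaded), parallel_power(1000, shaded))
```

The printout shows the series output collapsing with the shaded cell while the parallel output degrades only in proportion to the lost irradiance, which is the qualitative behavior the abstract describes.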
Abstract:
In this project an optimal pose selection method for the calibration of an overconstrained cable-driven parallel robot is presented. This manipulator belongs to a subcategory of parallel robots in which the classic rigid "legs" are replaced by cables. Cables are flexible elements that bring both advantages and disadvantages to robot modeling. For this reason there are many open research issues, and the calibration of geometric parameters is one of them. The identification of a robot's geometry, in particular, is usually called kinematic calibration. Many methods have been proposed in recent years to solve this problem. Although these methods are based on calibration using different kinematic models, their robustness and reliability decrease as the robot's geometry becomes more complex. This makes the selection of calibration poses more complicated: the position and orientation of the end-effector in the workspace become important selection criteria. Thus, in general, it is necessary to evaluate the robustness of the chosen calibration method, for example by means of a parameter such as the observability index. It is known from theory that maximizing this index identifies the best choice of calibration poses, and consequently using this pose set may improve the calibration process. The objective of this thesis is to analyze optimization algorithms that compute an optimal choice of poses in both quantitative and qualitative terms: quantitative, because it is of fundamental importance to understand how many poses are needed (a greater number of poses does not necessarily lead to a better result), and qualitative, because it is useful to understand whether the selected combination of poses actually adds information to the parameter identification process.
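As an illustration of how a candidate pose set can be scored, the sketch below computes the classical O1 observability index from the singular values of an identification Jacobian; the Jacobians here are random placeholders rather than a cable-driven parallel robot model, and O1 is only one of several indices proposed in the literature.

```python
import numpy as np

def observability_index_O1(J):
    """Classical O1 observability index of an identification Jacobian J.

    J stacks the sensitivity of the calibration measurements to the geometric
    parameters over all poses (rows: measurements, columns: parameters).
    O1 is the geometric mean of the singular values divided by sqrt(#rows);
    a larger value indicates a better-conditioned pose set.
    """
    s = np.linalg.svd(J, compute_uv=False)
    return float(np.exp(np.log(s).mean()) / np.sqrt(J.shape[0]))

# Placeholder comparison of two random candidate pose sets (purely illustrative)
rng = np.random.default_rng(42)
J_small = rng.normal(size=(6 * 10, 12))   # 10 poses, 12 geometric parameters
J_large = rng.normal(size=(6 * 30, 12))   # 30 poses, same parameters
print(observability_index_O1(J_small), observability_index_O1(J_large))
```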
Abstract:
Cable-driven parallel robots offer significant advantages in terms of workspace dimensions and payload capability. They are attractive for many industrial tasks to be performed on a large scale, such as handling and manufacturing, without a substantial increase in costs and mechanical complexity with respect to a small-scale application. However, since cables can only sustain tensile stresses, cable tensions must be kept within positive limits during the end-effector motion. This problem can be managed by overconstraining the end-effector and controlling cable tensions. Tension control is typically achieved by mounting a load sensor on each cable and using specific control algorithms to avoid cable slackness or breakage while the end-effector is controlled in a desired position. These algorithms require multiple cascade control loops and can be complex and computationally demanding. To simplify the control of overconstrained cable-driven parallel robots, this Thesis proposes suitable mechanical design and hybrid control strategies. It is shown how a convenient design of the cable guidance system allows kinematic modeling to be simplified, without introducing geometric approximations. This guidance system employs swiveling pulleys equipped with position and tension sensors and provides a parallelogram arrangement of cables. Furthermore, a hybrid force/position control in the robot joint space is adopted. According to this strategy, a particular set of cables is chosen to be tension-controlled, whereas the other cables are length-controlled. The force-controlled cables are selected based on the computation of a novel index called force-distribution sensitivity to cable-tension errors. This index aims to evaluate the maximum expected cable-tension error in the length-controlled cables if a unit tension error is committed in the force-controlled cables. In practice, the computation of the force-distribution sensitivity makes it possible to determine which cables are best force-controlled, to ensure the lowest error in the overall force distribution when a hybrid force/position joint-space strategy is used.
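As a rough illustration of the idea behind selecting the force-controlled cables, the sketch below computes, for each candidate subset, how strongly tension errors propagate to the remaining length-controlled cables through the static equilibrium equations. The formalization used here (infinity norm of the propagation matrix) is an assumption for illustration and is not necessarily the thesis's exact definition of the force-distribution sensitivity index; the structure matrix is a random placeholder.

```python
import numpy as np
from itertools import combinations

def sensitivity_of_subset(W, force_idx):
    """Illustrative force-distribution sensitivity for one choice of
    force-controlled cables.

    W : (n_dof, n_cables) structure (wrench) matrix, assumed so that
        W @ t = -w_ext at static equilibrium (sign convention arbitrary).
    force_idx : indices of the cables chosen to be tension-controlled.

    With t_F imposed, the remaining tensions satisfy
        W_L @ t_L = -w_ext - W_F @ t_F,
    so d t_L / d t_F = -inv(W_L) @ W_F. The value returned is the infinity
    norm of that matrix, i.e. the worst-case tension error induced in the
    length-controlled cables per unit error in the force-controlled ones.
    (A plausible formalization for illustration, not the thesis's definition.)
    """
    n_cables = W.shape[1]
    length_idx = [i for i in range(n_cables) if i not in force_idx]
    W_F = W[:, list(force_idx)]
    W_L = W[:, length_idx]
    S = -np.linalg.solve(W_L, W_F)          # d t_L / d t_F
    return np.linalg.norm(S, ord=np.inf)

# Pick the least sensitive subset on a random placeholder structure matrix
rng = np.random.default_rng(0)
n_dof, n_cables = 6, 8                      # 8 cables, 6-DOF end-effector
W = rng.normal(size=(n_dof, n_cables))
best = min(combinations(range(n_cables), n_cables - n_dof),
           key=lambda idx: sensitivity_of_subset(W, idx))
print("least sensitive force-controlled cable pair:", best)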
Abstract:
Clear cell sarcoma of the kidney (CCSK) is the second most common pediatric renal tumor, characterized in 90% of cases by internal tandem duplications (ITDs) localized in the last exon of the BCOR gene. The BCOR protein constitutes a core component of the non-canonical Polycomb Repressive Complex 1 (PRC1.1), which performs a fundamental silencing activity. ITDs in the last BCOR exon, at the level of the PUFD domain, have been identified in many tumor subtypes and could affect PCGF1 binding and the subsequent PRC1.1 activity, although the exact oncogenic mechanism of the ITD remains poorly understood. The objective of this project is to investigate the molecular mechanisms underlying CCSK oncogenesis, approaching the study with different methodologies. A first model in HEK-293 cells provided important information about BCOR functionality, suggesting that the presence of the ITD generates an altered activity that is very different from a loss of function. It was also observed that BCOR function within the PRC1.1 complex varies with different ITDs. Moreover, this model allowed the identification of molecular signatures evoked by the presence of BCOR-ITD, including its role in extracellular matrix interactions and the promotion of invasiveness. Parallel analysis of WTS data from 8 CCSK cases permitted the identification of a distinctive signature for metastatic CCSKs, highlighting a 20-fold overexpression of FGF3. This factor promoted a significant increase in invasive ability in the cellular model. To study the effects of BCOR-ITD on cell stemness and differentiation, an inducible model is being developed in H1 cells. This will make it possible to study the functionality of BCOR-ITD in a context closer to the origin of CCSKs, evaluating both the specific interactome and the phenotypic consequences caused by the mutation.
Abstract:
In this study, transmission-line modeling (TLM) applied to bio-thermal problems was improved by incorporating several novel computational techniques. Graded meshes were applied, making computations nine times faster while using only a fraction (16%) of the computational resources required by regular meshes when analyzing heat flow through heterogeneous media. Graded meshes, unlike regular meshes, allow heat sources to be modeled in all segments of the mesh. A new boundary condition that takes thermal properties into account, and thus results in more realistic modeling of complex problems, is introduced, along with a new way of calculating an error parameter. The calculated temperatures between nodes were compared against results obtained from the literature and agreed to within less than 1%. It is reasonable, therefore, to conclude that the improved TLM model described herein has great potential for heat transfer analysis of biological systems.
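To illustrate why graded meshes reduce node counts (and hence memory and run time), the snippet below compares a geometrically graded 1-D node spacing with a uniform one over the same domain and the same finest resolution. It illustrates the meshing idea only; it is not the TLM formulation used in the study, and the growth factor and dimensions are arbitrary assumptions.

```python
import numpy as np

def uniform_mesh(length, dx):
    """Uniform node positions at spacing dx."""
    return np.arange(0.0, length + dx, dx)

def graded_mesh(length, dx_min, growth=1.2):
    """Graded node positions: spacing starts at dx_min near x = 0 (where a
    heat source / steep gradient is assumed) and grows geometrically."""
    nodes, x, dx = [0.0], 0.0, dx_min
    while x < length:
        x = min(x + dx, length)
        nodes.append(x)
        dx *= growth
    return np.array(nodes)

L, dx_min = 0.1, 1e-4          # 10 cm domain, 0.1 mm resolution near the source
u = uniform_mesh(L, dx_min)
g = graded_mesh(L, dx_min)
print(f"uniform: {u.size} nodes, graded: {g.size} nodes "
      f"({g.size / u.size:.1%} of the uniform count)")
```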
Abstract:
American tegumentary leishmaniasis (ATL) is a disease transmitted to humans by female sandflies of the genus Lutzomyia. Several factors are involved in the disease transmission cycle; in this work only rainfall and deforestation were considered to assess the variability in the incidence of ATL. To this end, monthly records of ATL incidence in Orán, Salta, Argentina, for the period 1985-2007 were used. The square root of the relative incidence of ATL and the corresponding variance were formulated as time series, and these data were smoothed by moving averages of 12 and 24 months, respectively. The same procedure was applied to the rainfall data. Typical months, namely April, August, and December, were identified and allowed us to describe the dynamical behavior of ATL outbreaks. These results were tested at the 95% confidence level. We conclude that the variability of rainfall would not be enough to justify the epidemic outbreaks of ATL in the period 1997-2000, but it consistently explains the situation observed in the years 2002 and 2004. Deforestation activities that occurred in this region could explain the epidemic peaks observed in both years and also during the rest of the observation period, except in 2005-2007.
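For reference, the smoothing step described above amounts to taking the square root of the relative incidence and applying moving averages of 12 and 24 months; a minimal sketch with synthetic monthly data follows (the series is made up for illustration, not the Orán dataset).

```python
import numpy as np

def moving_average(x, window):
    """Centered moving average over `window` samples (months)."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="valid")

# Synthetic monthly relative incidence (cases per population), illustrative only
rng = np.random.default_rng(0)
months = 12 * 23                               # 1985-2007
relative_incidence = rng.gamma(shape=2.0, scale=1e-5, size=months)

sqrt_incidence = np.sqrt(relative_incidence)   # variance-stabilizing transform
smoothed_12 = moving_average(sqrt_incidence, 12)                      # incidence, 12-month MA
smoothed_var_24 = moving_average((sqrt_incidence - sqrt_incidence.mean()) ** 2, 24)  # variance, 24-month MA
print(smoothed_12[:3], smoothed_var_24[:3])
```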
Abstract:
In this work, all publicly accessible published findings on Alicyclobacillus acidoterrestris heat resistance in fruit beverages, as affected by temperature and pH, were compiled. Study characteristics (protocols, fruit and variety, °Brix, pH, temperature, heating medium, culture medium, inactivation method, strains, etc.) were then extracted from the primary studies, and some of them were incorporated into a meta-analysis mixed-effects linear model based on the basic Bigelow equation describing the heat resistance parameters of this bacterium. The model estimated mean D* values (time needed for one log reduction at a temperature of 95 °C and a pH of 3.5) of Alicyclobacillus in beverages of different fruits, of two different concentration types, with and without bacteriocins, and with and without clarification. The zT values (temperature change needed to cause a one log reduction in D-values) estimated by the meta-analysis model were compared to the 'observed' zT values reported in the primary studies, and in all cases they were within the confidence intervals of the model. The model was capable of predicting the heat resistance parameters of Alicyclobacillus in fruit beverages beyond the types available in the meta-analytical data. It is expected that the compilation of the thermal resistance of Alicyclobacillus in fruit beverages carried out in this study will be of utility to food quality managers in determining or validating the lethality of their current heat treatment processes.
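For context, the basic Bigelow secondary model underlying the meta-analysis expresses log10 D as a linear function of temperature and pH around the stated reference conditions (95 °C, pH 3.5); the sketch below evaluates that form with placeholder parameters rather than the fitted meta-analytic estimates.

```python
import numpy as np

def bigelow_log10_D(T, pH, log10_D_star, zT, zpH, T_ref=95.0, pH_ref=3.5):
    """Basic Bigelow secondary model:
        log10 D(T, pH) = log10 D* - (T - T_ref)/zT - (pH - pH_ref)/zpH
    where D* is the decimal reduction time (min) at the reference conditions,
    zT the temperature change and zpH the pH change giving a tenfold change in D.
    """
    return log10_D_star - (T - T_ref) / zT - (pH - pH_ref) / zpH

# Placeholder parameters (illustrative only, not the meta-analysis estimates)
log10_D_star, zT, zpH = np.log10(5.0), 10.0, 3.0   # D* = 5 min at 95 degC, pH 3.5

for T in (90.0, 95.0, 100.0):
    D = 10 ** bigelow_log10_D(T, 3.5, log10_D_star, zT, zpH)
    print(f"T = {T:5.1f} C  ->  D = {D:.2f} min")
```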
Abstract:
The solubility of caffeine in supercritical CO2 was studied by assessing the effects of pressure and temperature on the extraction of green coffee oil (GCO). The Peng-Robinson¹ equation of state was used to correlate the solubility of caffeine with a thermodynamic model, and two mixing rules were evaluated: the classical van der Waals mixing rule with two adjustable parameters (PR-VDW) and a density-dependent mixing rule proposed by Mohamed and Holder², with two (PR-MH, two parameters adjusted to the attractive term) and three (PR-MH3, two parameters adjusted to the attractive term and one to the repulsive term) adjustable parameters. The best results were obtained with the mixing rule of Mohamed and Holder² with three parameters.
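For reference, the classical van der Waals mixing rule combines the pure-component Peng-Robinson parameters through two adjustable binary interaction parameters (one on the attractive term, one on the co-volume); the snippet below implements that generic combination with placeholder pure-component values, not the fitted caffeine-GCO-CO2 parameters.

```python
import numpy as np

def vdw_mixing(a, b, x, kij, lij):
    """Classical two-parameter van der Waals mixing rules:
        a_mix = sum_i sum_j x_i x_j sqrt(a_i a_j) (1 - k_ij)
        b_mix = sum_i sum_j x_i x_j (b_i + b_j)/2 (1 - l_ij)
    a, b : pure-component Peng-Robinson attractive and co-volume parameters
    x    : mole fractions
    kij, lij : symmetric binary interaction parameter matrices
    """
    a_ij = np.sqrt(np.outer(a, a)) * (1.0 - kij)
    b_ij = 0.5 * np.add.outer(b, b) * (1.0 - lij)
    return x @ a_ij @ x, x @ b_ij @ x

# Placeholder binary system (order-of-magnitude values, illustrative only)
a = np.array([0.40, 2.50])      # Pa m^6 / mol^2
b = np.array([2.7e-5, 1.5e-4])  # m^3 / mol
x = np.array([0.98, 0.02])      # CO2-rich phase composition
kij = np.array([[0.0, 0.10], [0.10, 0.0]])
lij = np.zeros((2, 2))
print(vdw_mixing(a, b, x, kij, lij))
```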
Abstract:
Evolving interfaces were initially applied to the solution of scientific problems in fluid dynamics. With the advent of the more robust modeling provided by the Level Set method, their original boundaries of applicability were extended. In the geometric modeling area specifically, works published to date relating Level Set methods to three-dimensional surface reconstruction have centered on reconstruction from a data cloud dispersed in space; the approach based on parallel planar slices transversal to the object to be reconstructed is still incipient. Based on this fact, the present work analyses the feasibility of the Level Set method for three-dimensional reconstruction, offering a methodology that integrates ideas already shown to be efficient in the literature with proposals for handling limitations of the method not yet satisfactorily treated, in particular the excessive smoothing of fine contour features evolving under Level Set. In this respect, the Particle Level Set variant is suggested as a solution, for its proven intrinsic capability to preserve the mass of dynamic fronts. Finally, synthetic and real data sets are used to qualitatively evaluate the presented three-dimensional surface reconstruction methodology.
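As a pointer to the underlying formulation, an interface evolving under the Level Set method is the zero level set of a function phi satisfying phi_t + F |grad(phi)| = 0; the toy sketch below advances a circle shrinking at unit speed (F = -1 along the outward normal) with a few explicit steps of this equation. It does not implement the slice-based reconstruction or the Particle Level Set correction discussed above, and the grid and time step are arbitrary.

```python
import numpy as np

def level_set_step(phi, F, dx, dt):
    """One explicit step of phi_t + F |grad(phi)| = 0 using central differences
    (adequate for this smooth toy case; production codes use upwind/WENO
    schemes and periodic reinitialization of the signed distance)."""
    gy, gx = np.gradient(phi, dx)
    return phi - dt * F * np.sqrt(gx ** 2 + gy ** 2)

# Toy example: a circle of radius 0.3 shrinking at unit speed (F = -1)
n = 128
dx = 1.0 / (n - 1)
y, x = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
phi = np.sqrt((x - 0.5) ** 2 + (y - 0.5) ** 2) - 0.3   # signed distance to circle

for _ in range(50):
    phi = level_set_step(phi, F=-1.0, dx=dx, dt=0.5 * dx)

# Estimate the interface radius from the area of the interior region
area = (phi < 0).sum() * dx * dx
print("approximate interface radius:", np.sqrt(area / np.pi))
```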
Abstract:
Onion (Allium cepa) is one of the most cultivated and consumed vegetables in Brazil, and its importance is due in part to the large labor force involved. One of the main pests affecting this crop is the onion thrips (Thrips tabaci), but the spatial distribution of this insect, although important, has not been considered in crop management recommendations, experimental planning, or sampling procedures. Our purpose here is to use statistical tools to detect and model spatial patterns of the occurrence of the onion thrips. To characterize the spatial distribution pattern, a survey was carried out recording the number of insects in each development phase on onion plant leaves, on different dates and at different sample locations, in four rural properties with neighboring farms under different infestation levels and planting methods. The Mantel randomization test proved to be a useful tool to test for spatial correlation which, when detected, was described by a mixed spatial Poisson model with a geostatistical random component and parameters allowing a characterization of the spatial pattern, as well as the production of prediction maps of susceptibility to levels of infestation throughout the area.
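For illustration, a Mantel randomization test correlates a spatial distance matrix with a distance matrix of counts and assesses significance by permuting sample locations; the minimal sketch below uses synthetic coordinates and thrips-like counts, not the survey data.

```python
import numpy as np

def mantel_test(D1, D2, n_perm=999, rng=np.random.default_rng(0)):
    """Mantel randomization test between two square distance matrices.
    Returns the observed Pearson correlation of the upper-triangular entries
    and a one-sided permutation p-value."""
    n = D1.shape[0]
    iu = np.triu_indices(n, k=1)
    r_obs = np.corrcoef(D1[iu], D2[iu])[0, 1]
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(n)
        r_perm = np.corrcoef(D1[iu], D2[p][:, p][iu])[0, 1]
        if r_perm >= r_obs:
            count += 1
    return r_obs, (count + 1) / (n_perm + 1)

# Synthetic example: 40 sample locations with weakly spatially structured counts
rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(40, 2))
counts = 5 + 0.05 * coords[:, 0] + rng.poisson(2, size=40)
D_space = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
D_counts = np.abs(counts[:, None] - counts[None, :])
print(mantel_test(D_space, D_counts))
```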
Abstract:
Below-cloud scavenging processes have been investigated by means of numerical simulation, local atmospheric conditions, and particulate matter (PM) concentrations at different sites in Germany. The below-cloud scavenging model was coupled with a bulk particulate matter counter (TSI Trust Portacounter) dataset to predict the variability of particulate air concentrations during chosen rain events. The TSI samples and meteorological parameters were obtained during three winter campaigns: at Deuselbach in March 1994, consisting of three different events; at Sylt in April 1994; and at Freiburg in March 1995. The results show good agreement between modeled and observed air concentrations, emphasizing the quality of the conceptual model used in the below-cloud scavenging numerical modeling. Comparisons between modeled and observed data also yielded squared Pearson correlation coefficients above 0.7 that were statistically significant, except for the Freiburg campaign event. The differences between the numerical simulations and the observed dataset are explained by changes in wind direction and, possibly, the absence of mass advection terms in the model. These results validate previous works based on the same conceptual model.
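The abstract does not state the governing equation, but below-cloud (washout) scavenging of particles is commonly represented as first-order decay of the air concentration with a rainfall-dependent scavenging coefficient; the sketch below shows that standard textbook form with placeholder coefficients, which should not be taken as the parameterization of the model used in the study.

```python
import numpy as np

def washout_concentration(c0, rain_rate_mm_h, hours, a=1e-4, b=0.8):
    """First-order below-cloud scavenging (standard textbook form):
        dC/dt = -Lambda * C,   Lambda = a * R**b   [1/s]
    where R is the rainfall rate in mm/h. The coefficients a and b are
    placeholders; real values depend on particle size and the rain spectrum."""
    lam = a * rain_rate_mm_h ** b                  # scavenging coefficient, 1/s
    t = np.asarray(hours, dtype=float) * 3600.0    # elapsed time in seconds
    return c0 * np.exp(-lam * t)

# PM concentration (ug/m^3) remaining during a 2 mm/h rain event, hour by hour
print(washout_concentration(50.0, 2.0, hours=[0, 1, 2, 3]))
```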
Abstract:
The enzyme purine nucleoside phosphorylase from Schistosoma mansoni (SmPNP) is an attractive molecular target for the development of novel drugs against schistosomiasis, a neglected tropical disease that affects about 200 million people worldwide. In the present work, enzyme kinetic studies were carried out in order to determine the potency and mechanism of inhibition of a series of SmPNP inhibitors. In addition to the biochemical investigations, crystallographic and molecular modeling studies revealed important molecular features for binding affinity towards the target enzyme, leading to the development of structure-activity relationships (SAR).
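By way of illustration only, one common mechanism probed in such enzyme kinetic studies is competitive inhibition, in which the apparent Km increases with inhibitor concentration while Vmax is unchanged; the sketch below evaluates that rate law with placeholder parameters and does not represent the actual SmPNP inhibition results.

```python
import numpy as np

def velocity_competitive(S, I, Vmax, Km, Ki):
    """Michaelis-Menten rate in the presence of a competitive inhibitor:
        v = Vmax * S / (Km * (1 + I/Ki) + S)
    S and I must be in the same concentration units as Km and Ki."""
    return Vmax * S / (Km * (1.0 + I / Ki) + S)

# Illustrative dose-response: apparent Km rises with inhibitor, Vmax unchanged
S = np.linspace(1, 200, 5)          # substrate concentrations (uM), placeholder
for I in (0.0, 1.0, 10.0):          # inhibitor concentrations (uM), placeholder
    v = velocity_competitive(S, I, Vmax=100.0, Km=20.0, Ki=2.0)
    print(f"[I] = {I:4.1f} uM ->", np.round(v, 1))
```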