772 results for PBL parameterization
Abstract:
The results of a pedagogical strategy implemented at the University of Sao Paulo at Sao Carlos are presented and discussed. The initiative was conducted in a transportation course offered to Civil Engineering students. The approach combines problem-based and project-based learning (PBL) with blended learning (B-learning). Starting in 2006, a different problem was introduced every year. From 2009 on, however, the problem-based learning concept was expanded to project-based learning. The performance of the students was analyzed using the following elements: (1) grades in course activities; (2) answers from a questionnaire designed for course evaluation; and (3) cognitive maps made to assess the effects of PBL by comparing the responses of students involved in the experiment with those of students who were not. The results showed positive aspects of the method, such as the strong involvement of several students with the subject. A gradual increase in the average scores obtained by the students in the project activities (from 6.77 in 2006 to 8.24 in 2009) coincided with a better evaluation of these activities and of the course as a whole (90 and 97% of "Good" or "Very good" ratings in 2009, respectively). A growing interest in the field of transportation engineering as an option for further studies was also noticed. DOI: 10.1061/(ASCE)EI.1943-5541.0000115. (C) 2012 American Society of Civil Engineers.
Abstract:
We present a new strategy, based on the idea of the meccano method and a novel T-mesh optimization procedure, to construct a T-spline parameterization of 2D geometries for the application of isogeometric analysis. The proposed method demands only a boundary representation of the geometry as input data. The algorithm obtains, as a result, a high-quality parametric transformation between 2D objects and the parametric domain, the unit square. First, we define a parametric mapping between the input boundary of the object and the boundary of the parametric domain. Then, we build a T-mesh adapted to the geometric singularities of the domain in order to preserve the features of the object boundary with a desired tolerance...
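As a rough sketch of the first step described above, an input boundary polyline can be mapped onto the boundary of the unit square by normalized chord length. The function below is a hypothetical illustration under that assumption, not the authors' implementation.

    import numpy as np

    def map_boundary_to_unit_square(pts):
        # Map a closed boundary polyline (n, 2) onto the unit-square
        # boundary by normalized chord length; points must be in order.
        pts = np.asarray(pts, dtype=float)
        seg = np.linalg.norm(np.diff(np.vstack([pts, pts[:1]]), axis=0), axis=1)
        t = np.concatenate([[0.0], np.cumsum(seg)[:-1]]) / seg.sum()  # in [0, 1)
        out = np.empty_like(pts)
        for i, ti in enumerate(t):
            s = 4.0 * ti  # arc position along the square's perimeter
            if s < 1.0:
                out[i] = (s, 0.0)            # bottom edge
            elif s < 2.0:
                out[i] = (1.0, s - 1.0)      # right edge
            elif s < 3.0:
                out[i] = (3.0 - s, 1.0)      # top edge
            else:
                out[i] = (0.0, 4.0 - s)      # left edge
        return out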
Abstract:
We have recently introduced a new strategy, based on the meccano method [1, 2], to construct a T-spline parameterization of 2D and 3D geometries for the application of isogeometric analysis [3, 4]. The proposed method demands only a boundary representation of the geometry as input data. The algorithm obtains, as a result, a high-quality parametric transformation between the objects and the parametric domain, i.e. the meccano. The key to the method lies in defining an isomorphic transformation between the parametric and physical T-mesh, finding the optimal position of the interior nodes once the meccano boundary nodes are mapped to the boundary of the physical domain…
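The placement of interior nodes can be illustrated, in a much simplified form, by plain Laplacian smoothing; the method itself relies on a simultaneous untangling and smoothing procedure, so the sketch below is only a stand-in. It assumes the mesh is given as node coordinates, a per-node neighbor list, and a boundary flag.

    import numpy as np

    def smooth_interior_nodes(coords, neighbors, is_boundary, iters=50):
        # Move each interior node to the centroid of its neighbors,
        # keeping boundary nodes fixed (plain Laplacian smoothing).
        coords = np.array(coords, dtype=float)
        for _ in range(iters):
            for i in range(len(coords)):
                if not is_boundary[i] and neighbors[i]:
                    coords[i] = np.mean(coords[neighbors[i]], axis=0)
        return coords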
Abstract:
We present a new method, based on the idea of the meccano method and a novel T-mesh optimization procedure, to construct a T-spline parameterization of 2D geometries for the application of isogeometric analysis. The proposed method demands only a boundary representation of the geometry as input data. The algorithm obtains, as a result, a high-quality parametric transformation between 2D objects and the parametric domain, the unit square. First, we define a parametric mapping between the input boundary of the object and the boundary of the parametric domain. Then, we build a T-mesh adapted to the geometric singularities of the domain in order to preserve the features of the object boundary with a desired tolerance…
Abstract:
Eutrophication is a persistent problem in many freshwater lakes. Delay in lake recovery following reductions in external loading of phosphorus, the limiting nutrient in freshwater ecosystems, is often observed. Models have been created to assist with lake remediation efforts; however, the application of management tools to sediment diagenesis is often neglected because of conceptual and mathematical complexity. SED2K (Chapra et al. 2012) is proposed as a "middle way", offering engineering rigor while remaining accessible to users. An objective of this research is to further support the development and application of SED2K for sediment phosphorus diagenesis and release to the water column of Onondaga Lake. SED2K has previously been applied to eutrophic Lake Alice in Minnesota. The more homogeneous sediment characteristics of Lake Alice, compared with the industrially polluted sediment layers of Onondaga Lake, allowed an invariant rate coefficient to be applied to describe first-order decay kinetics of phosphorus. When a similar approach was attempted on Onondaga Lake, an invariant rate coefficient failed to simulate the sediment phosphorus profile. Therefore, labile P was accounted for by progressive preservation after burial, and a rate coefficient that gradually decreased with depth was applied. In this study, profile sediment samples were chemically extracted into five operationally defined fractions: CaCO3-P, Fe/Al-P, Biogenic-P, Ca Mineral-P, and Residual-P. Chemical fractionation data from this study showed that preservation is not the only mechanism by which phosphorus may be maintained in a non-reactive state in the profile; sorption has been shown to contribute substantially to P burial within the profile. A new kinetic approach involving partitioning of P into process-based fractions is applied here. Results from this approach indicate that labile P (Ca Mineral and Organic P) contributes to internal P loading to Onondaga Lake through diagenesis and diffusion to the water column, while the sorbed P fraction (Fe/Al-P and CaCO3-P) remains constant. Sediment profile concentrations of labile and total phosphorus at the time of deposition were also modeled and compared with current labile and total phosphorus to quantify the extent to which the remaining phosphorus will continue to contribute to internal P loading and influence the trophic status of Onondaga Lake. The results presented here also allowed estimation of the depth of the active sediment layer and the attendant response time, as well as the sediment burden of labile P and the associated efflux.
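The kinetic idea above, first-order decay of labile phosphorus with a rate coefficient that decreases with depth to represent progressive preservation after burial, can be written as dP/dt = -k(z)P with k(z) = k0*exp(-alpha*z). Under steady burial at velocity w this integrates to a closed-form profile; the sketch below uses hypothetical parameter values, not the SED2K calibration.

    import numpy as np

    # Hypothetical parameters for illustration only (not SED2K values).
    k0 = 0.05     # surface decay rate of labile P, 1/yr
    alpha = 0.5   # attenuation of the rate coefficient with depth, 1/cm
    w = 0.4       # burial (sedimentation) velocity, cm/yr

    def labile_p_remaining(p0, z):
        # With z = w*t, dP/dz = -(k(z)/w) * P integrates to:
        z = np.asarray(z, dtype=float)
        return p0 * np.exp(-(k0 / (alpha * w)) * (1.0 - np.exp(-alpha * z)))

    print(labile_p_remaining(1.0, np.linspace(0.0, 30.0, 7)))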
Abstract:
Methods for optical motion capture often require time-consuming manual processing before the data can be used for subsequent tasks such as retargeting or character animation. These processing steps restrict the applicability of motion capture, especially for dynamic VR environments with real-time requirements. To solve these problems, we present two additional, fast, and automatic processing stages based on our motion capture pipeline presented in [HSK05]. A normalization step aligns the recorded coordinate systems with the skeleton structure to yield a common and intuitive data basis across different recording sessions. A second step computes a parameterization based on automatically extracted main movement axes to generate a compact motion description. Our method restricts neither the placement of marker bodies nor the recording setup, and requires only a short calibration phase.
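Extracting main movement axes for a compact motion description is, at its core, a principal component analysis of the recorded frames. The sketch below shows the idea on a stacked-coordinate representation; it is a generic illustration, not the authors' pipeline.

    import numpy as np

    def main_movement_axes(frames, n_axes=3):
        # frames: (T, d) array, T time steps by d stacked marker coordinates.
        # Returns the mean pose, the first n_axes principal axes, and the
        # per-frame coefficients that form the compact parameterization.
        frames = np.asarray(frames, dtype=float)
        mean = frames.mean(axis=0)
        _, _, vt = np.linalg.svd(frames - mean, full_matrices=False)
        axes = vt[:n_axes]                 # (n_axes, d) movement axes
        coeffs = (frames - mean) @ axes.T  # (T, n_axes) compact description
        return mean, axes, coeffs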
Abstract:
Localized short-echo-time ¹H-MR spectra of the human brain contain contributions from many low-molecular-weight metabolites as well as baseline contributions from macromolecules. Two approaches to model such spectra are compared, and the data acquisition sequence, optimized for reproducibility, is presented. Modeling relies on prior-knowledge constraints and linear combination of metabolite spectra. We investigated what can be gained by basis parameterization, i.e., description of basis spectra as sums of parametric lineshapes. The effects of basis composition and the addition of experimentally measured macromolecular baselines were also investigated. Both fitting methods yielded quantitatively similar values, model deviations, error estimates, and reproducibility in the evaluation of 64 spectra of human gray and white matter from 40 subjects. Major advantages of parameterized basis functions are the possibilities to evaluate fitting parameters separately, to treat subgroup spectra as independent moieties, and to incorporate deviations from straightforward metabolite models. It was found that most of the 22 basis metabolites used may provide meaningful data when comparing patient cohorts. In individual spectra, sums of closely related metabolites are often more meaningful. Inclusion of a macromolecular basis component leads to relatively small but significantly different tissue content for most metabolites. It provides a means to quantitate baseline contributions that may contain crucial clinical information.
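At its simplest, linear-combination modeling of this kind solves a least-squares problem over the columns of a basis matrix. The sketch below uses non-negative least squares and omits the lineshape parameterization and prior-knowledge constraints discussed above; all names are illustrative.

    import numpy as np
    from scipy.optimize import nnls

    def fit_spectrum(spectrum, basis):
        # spectrum: (n,) measured spectrum; basis: (n, m) matrix whose
        # columns are metabolite (and macromolecule) basis spectra.
        # Returns the m non-negative amplitudes and the residual norm.
        amplitudes, residual = nnls(basis, spectrum)
        return amplitudes, residual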
Abstract:
Both TBL and PBL attempt to engage the learner maximally, and both are designed to encourage interactive teaching/learning. PBL is student centered; TBL, in contrast, is typically instructor centered. The PBL Executive Committee of the UTHSC-Houston Medical School, in an attempt to capture the pedagogical advantages of both PBL and TBL, implemented a unique PBL experience in the ICE/PBL course during the final block of PBL instruction in year 2. PBL cases provided the content knowledge for focused learning. The subsequent, related TBL exercises fostered integration and critical thinking about each of these cases. [See PDF for complete abstract]
Abstract:
Existing evidence of plant phenological responses to temperature increase demonstrates that phenological responsiveness is greater at warmer locations and in early-season plant species. Explanations of these findings are scarce and unsettled. Some studies suggest considering phenology as one functional trait within a plant's life-history strategy. In this study, we adapt an existing phenological model to derive a generalized sensitivity-in-space (SpaceSens) model for calculating the temperature sensitivity of spring plant phenophases across species and locations. The SpaceSens model has three parameters: the temperature at the onset date of the phenophase (Tp), the base temperature threshold (Tb), and the length of the period (L) used to calculate the mean temperature in the regression analysis between phenology and temperature. A case study on the first leaf date of 20 plant species from eastern China shows that variation in Tp and Tb among species accounts for interspecific differences in temperature sensitivity. Moreover, lower Tp at lower latitudes is the main reason why spring phenological responsiveness is greater there. These results suggest that the spring phenophases of more responsive, early-season plants (especially at low latitudes) will probably continue to diverge from those of late-season plants as temperatures warm in the future.
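Operationally, a temperature sensitivity of this kind is usually estimated by regressing onset dates on the mean temperature of the preceding L days, the slope giving days of change per degree. The sketch below is a generic illustration with hypothetical inputs, not the SpaceSens derivation itself.

    import numpy as np

    def temperature_sensitivity(onset_doy, daily_temp, L=60):
        # onset_doy: per-year onset day-of-year (integers);
        # daily_temp: (years, 365) daily mean temperatures, deg C.
        mean_t = np.array([daily_temp[y, max(0, d - L):d].mean()
                           for y, d in enumerate(onset_doy)])
        slope, _ = np.polyfit(mean_t, onset_doy, 1)
        return slope  # typically negative: warmer -> earlier onset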
Abstract:
Simulating surface wind over complex terrain is a challenge in regional climate modelling. Therefore, this study aims at identifying a set-up of the Weather Research and Forecasting (WRF) model that minimises systematic errors of surface winds in hindcast simulations. Major factors of the model configuration are tested to find a suitable set-up: the horizontal resolution, the planetary boundary layer (PBL) parameterisation scheme, and the way WRF is nested to the driving data set. Hence, a number of sensitivity simulations at a spatial resolution of 2 km are carried out and compared to observations. Given the importance of wind storms, the analysis is based on case studies of 24 historical wind storms that caused great economic damage in Switzerland. Each of these events is downscaled using eight different model set-ups sharing the same driving data set. The results show that the lack of representation of unresolved topography leads to a general overestimation of wind speed in WRF. However, this bias can be substantially reduced by using a PBL scheme that explicitly considers the effects of non-resolved topography, which also improves the spatial structure of wind speed over Switzerland. The wind direction, although generally well reproduced, is not very sensitive to the PBL scheme. Further sensitivity tests include four types of nesting methods: nesting only at the boundaries of the outermost domain, analysis nudging, spectral nudging, and the so-called re-forecast method, where the simulation is frequently restarted. These simulations show that restricting the freedom of the model to develop large-scale disturbances slightly increases the temporal agreement with the observations, while further reducing the overestimation of wind speed, especially for maximum wind peaks. The model performance is also evaluated in the outermost domains, where the resolution is coarser. The results demonstrate the important role of horizontal resolution: the step from 6 to 2 km significantly improves model performance. In summary, the combination of a grid size of 2 km, the non-local PBL scheme modified to explicitly account for non-resolved orography, and analysis or spectral nudging is a superior set-up when dynamical downscaling aims at reproducing real wind fields.
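The verification described here reduces to error statistics of simulated versus observed wind speed for each model set-up. The sketch below is self-contained, with synthetic data standing in for station observations and WRF output.

    import numpy as np

    def wind_errors(obs, sim):
        # Bias and RMSE of simulated vs. observed wind speed (m/s).
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        bias = float((sim - obs).mean())
        rmse = float(np.sqrt(((sim - obs) ** 2).mean()))
        return bias, rmse

    rng = np.random.default_rng(0)
    obs = rng.gamma(2.0, 4.0, size=1000)            # synthetic observed speeds
    sim_ref = obs * 1.3 + rng.normal(0, 1, 1000)    # overestimating reference run
    sim_topo = obs * 1.05 + rng.normal(0, 1, 1000)  # with topographic correction
    print("reference:", wind_errors(obs, sim_ref))
    print("topo-corrected:", wind_errors(obs, sim_topo))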
Abstract:
This paper explains how the Armington-Krugman-Melitz supermodel developed by Dixon and Rimmer can be parameterized, and demonstrates that only two kinds of additional information are required to extend a standard trade model to include Melitz-type monopolistic competition and heterogeneous firms. Further, it is shown how specifying too much additional information leads to violations of the model constraints, necessitating adjustment and reconciliation of the data. Once a Melitz-type model is parameterized, a Krugman-type model can also be parameterized using the calibrated values from the Melitz-type model without any additional data. Sample code for the General Algebraic Modeling System (GAMS) has also been prepared to promote the supermodel within the applied general equilibrium (AGE) community.
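For a flavour of what such a calibration pins down: under the common assumption that firm productivity follows a Pareto distribution, the CES-average productivity of active firms in a Melitz-type model has a standard closed form. The snippet below encodes that textbook relationship with illustrative parameter values; it is not the paper's GAMS code.

    def average_productivity(phi_star, a, sigma):
        # Pareto shape a, elasticity of substitution sigma, entry cutoff
        # phi_star; the CES average is finite only for a > sigma - 1.
        if a <= sigma - 1:
            raise ValueError("requires Pareto shape a > sigma - 1")
        return phi_star * (a / (a - (sigma - 1))) ** (1.0 / (sigma - 1))

    print(average_productivity(phi_star=1.0, a=4.6, sigma=3.8))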