925 results for Models and Methods


Relevance:

100.00%

Publisher:

Abstract:

The goal of a single building information model has existed for at least thirty years, and various standards have been published leading up to the ten-year development of the Industry Foundation Classes (IFCs). These have been initiatives from researchers, software developers and standards committees. Now large property owners are becoming aware of the benefits of moving IT tools from specific applications towards more comprehensive solutions. This study addresses the state of Building Information Models (BIMs) and the conditions necessary for them to become more widely used. It is a qualitative study based on information from a number of international experts, who were asked a series of questions about the feasibility of BIMs, the conditions necessary for their success, and the role of standards, with particular reference to the IFCs. Some key statements distilled from the diverse answers indicate that BIM solutions appear too complex for many and may need to be applied in limited areas initially. Standards are generally supported but not applied rigorously, and a range of them are relevant to BIM. Benefits will depend upon the building procurement methods used, and there should be special roles within the project team to manage information. Case studies are starting to appear, and these could be used for publicity. The IFCs are rather oversold, and their complexities should be hidden within simple-to-use software. Inevitably, major questions remain, and property owners may be key to answering some of them. A framework for presenting standards, backed up by case studies of successful projects, is the solution proposed to provide better information on where particular BIM standards and solutions should be applied in building projects.

Relevance:

100.00%

Publisher:

Abstract:

The paper proposes two methodologies for damage identification from the measured natural frequencies of a contiguously damaged reinforced concrete beam, idealised with a distributed damage model. The first method identifies damage from Iso-Eigen-Value-Change contours plotted between pairs of different frequencies. The performance of the method is checked for a wide variation of damage positions and extents. The method is also extended to a discrete structure in the form of a five-storey shear building, and its simplicity is demonstrated. The second method is based on a smeared damage model, where the damage is assumed constant over different segments of the beam, and the lengths and centres of these segments are known inputs. A first-order perturbation method is used to derive the relevant expressions. Both methods are based on distributed damage models and have been checked against an experimental programme on simply supported reinforced concrete beams subjected to different stages of symmetric and unsymmetric damage. The experimental results are encouraging and show that the two methods can be adopted together in a damage identification scenario.
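
The five-storey shear-building example lends itself to a compact numerical sketch. The snippet below is an illustration under simplified assumptions (equal floor masses, an arbitrary stiffness value), not the authors' code: it builds the stiffness matrix of a five-storey shear building, reduces one storey's stiffness to represent damage, and reports the resulting fractional drops in natural frequency. Pairs of such frequency changes are the quantities from which Iso-Eigen-Value-Change contours are plotted.

```python
import numpy as np

def shear_building_frequencies(k, m=1.0):
    """Natural frequencies (rad/s) of a shear building with storey
    stiffnesses k[0..n-1] (k[0] connects floor 1 to the ground) and
    equal floor masses m."""
    n = len(k)
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] += k[i]
        if i + 1 < n:
            K[i, i] += k[i + 1]
            K[i, i + 1] = K[i + 1, i] = -k[i + 1]
    return np.sqrt(np.linalg.eigvalsh(K) / m)   # ascending order

k_intact = np.full(5, 1000.0)          # identical storey stiffnesses (arbitrary units)
k_damaged = k_intact.copy()
k_damaged[1] *= 0.7                    # 30% stiffness loss in storey 2

w0 = shear_building_frequencies(k_intact)
w1 = shear_building_frequencies(k_damaged)
delta = (w0 - w1) / w0                 # fractional frequency drop per mode
```

Because each damage location produces a distinct pattern of per-mode frequency drops, intersecting the contours for two or more frequency pairs localizes the damage.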

Relevance:

100.00%

Publisher:

Abstract:

Objective: The main objective of this work was to study the antipyretic and antibacterial activity of C. erectus (Buch.-Ham.) Verdcourt leaf extract in an experimental albino rat model. Materials and Methods: The methanol extract of C. erectus leaf (MECEL) was evaluated for its antipyretic potential on normal body temperature and Brewer's yeast-induced pyrexia in an albino rat model. The antibacterial activity of MECEL against five Gram (-) and three Gram (+) bacterial strains, and its antimycotic activity against four fungi, were investigated using agar disk diffusion and microdilution methods. Results: Yeast suspension (10 mL/kg b.w.) elevated rectal temperature 19 h after subcutaneous injection. Oral administration of MECEL at 100 and 200 mg/kg b.w. significantly reduced both normal rectal temperature and the yeast-provoked elevated temperature (38.8 ± 0.2 and 37.6 ± 0.4, respectively, at 2-3 h) in a dose-dependent manner, and the effect was comparable to that of the standard antipyretic drug paracetamol (150 mg/kg b.w.). MECEL at 2 mg/disk showed a broad spectrum of growth-inhibition activity against both groups of bacteria. However, MECEL was not effective against the yeast strains tested in this study. Conclusion: This study revealed that the methanol extract of C. erectus exhibited significant antipyretic activity in the tested models, as well as antibacterial activity, and may provide a scientific rationale for its popular use as an antipyretic agent in Khamptiss folk medicines.

Relevance:

100.00%

Publisher:

Abstract:

In meteorology, observations and forecasts of a wide range of phenomena (for example, snow, clouds, hail, fog, and tornadoes) can be categorical; that is, they can take only discrete values (e.g., "snow" and "no snow"). Concentrating on satellite-based snow and cloud analyses, this thesis explores methods that have been developed for the evaluation of categorical products and analyses. Different algorithms for satellite products generate different results; sometimes the differences are subtle, sometimes all too visible. In addition to differences between algorithms, the satellite products are influenced by physical processes and conditions, such as diurnal and seasonal variation in solar radiation, topography, and land use. The analysis of satellite-based snow cover analyses from NOAA, NASA, and EUMETSAT, together with snow analyses for numerical weather prediction models from FMI and ECMWF, was complicated by the fact that the true snow extent was not known, so we were forced simply to measure the agreement between different products. Sammon mapping, a multidimensional scaling method, was then used to visualize the differences between products. The trustworthiness of the results for cloud analyses [the EUMETSAT Meteorological Products Extraction Facility cloud mask (MPEF), together with the Nowcasting Satellite Application Facility (SAFNWC) cloud masks provided by Météo-France (SAFNWC/MSG) and the Swedish Meteorological and Hydrological Institute (SAFNWC/PPS)], compared with ceilometers of the Helsinki Testbed, was estimated by constructing confidence intervals (CIs). Bootstrapping, a statistical resampling method, was used to construct the CIs, especially in the presence of spatial and temporal correlation. Reference data for validation are constantly in short supply. In general, the needs of a particular project drive the requirements for evaluation, for example, the accuracy and timeliness of the particular data and methods.
In this vein, we tentatively discuss how data provided by the general public, e.g., photos shared on the Internet photo-sharing service Flickr, can be used as a new source for validation. Results show that they are of reasonable quality, and their use for case studies can be warmly recommended. Last, the use of cluster analysis on meteorological in-situ measurements was explored; the AutoClass algorithm was used to construct compact representations of synoptic fog conditions at Finnish airports.
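
As a concrete illustration of the bootstrap CIs mentioned above, the sketch below (synthetic data and generic code, not the thesis implementation) computes a moving-block bootstrap confidence interval for the hit rate between a satellite-style cloud mask and a reference series; resampling blocks longer than one sample is the usual way to respect temporal correlation.

```python
import numpy as np

rng = np.random.default_rng(42)

def hit_rate(a, b):
    """Fraction of samples on which two categorical series agree."""
    return float(np.mean(a == b))

def block_bootstrap_ci(a, b, block=24, n_boot=1000, alpha=0.05):
    """Moving-block bootstrap CI for the hit rate of two time series."""
    n = len(a)
    stats = np.empty(n_boot)
    for i in range(n_boot):
        idx = []
        while len(idx) < n:            # tile random contiguous blocks
            s = rng.integers(0, n - block + 1)
            idx.extend(range(s, s + block))
        idx = np.asarray(idx[:n])
        stats[i] = hit_rate(a[idx], b[idx])
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# synthetic cloud/no-cloud reference with persistence (runs of 20 samples)
truth = np.repeat((rng.random(100) < 0.5).astype(int), 20)
# a satellite mask that matches the reference 85% of the time
satellite = np.where(rng.random(2000) < 0.85, truth, 1 - truth)

point = hit_rate(satellite, truth)
lo, hi = block_bootstrap_ci(satellite, truth)
```

The percentile interval (lo, hi) brackets the point estimate; shortening `block` to 1 recovers the ordinary i.i.d. bootstrap.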

Relevance:

100.00%

Publisher:

Abstract:

The objectives of this study were to make a detailed and systematic empirical analysis of microfinance borrowers and non-borrowers in Bangladesh and to examine how efficiency measures are influenced by access to agricultural microfinance. In the empirical analysis, this study used both parametric and non-parametric frontier approaches to investigate differences in efficiency estimates between microfinance borrowers and non-borrowers. This thesis, based on five articles, applied data obtained from a survey of 360 farm households from the north-central and north-western regions of Bangladesh. The methods used in this investigation involve stochastic frontier analysis (SFA) and data envelopment analysis (DEA), in addition to sample selectivity and limited dependent variable models. In article I, technical efficiency (TE) estimation and identification of its determinants were performed by applying an extended Cobb-Douglas stochastic frontier production function. The results show that farm households had a mean TE of 83%, with lower TE scores for the non-borrowers of agricultural microfinance. Institutional policies regarding the consolidation of individual plots into farm units, ensuring access to microfinance, and extension education for farmers with longer farming experience are suggested to improve the TE of the farmers. In article II, the objective was to assess the effects of access to microfinance on household production and cost efficiency (CE) and to determine the efficiency differences between microfinance-participating and non-participating farms. In addition, a non-discretionary DEA model was applied to capture directly the influence of microfinance on farm households' production and CE. The results suggested that under both pooled DEA models and non-discretionary DEA models, farmers with access to microfinance were significantly more efficient than their non-borrowing counterparts.
Results also revealed that land fragmentation, family size, household wealth, on-farm training and off-farm income share are the main determinants of inefficiency after effectively correcting for sample selection bias. In article III, the TE of traditional-variety (TV) and high-yielding-variety (HYV) rice producers was estimated, in addition to investigating the determinants of the adoption rate of HYV rice. Furthermore, the role of TE as a potential determinant explaining differences in the adoption rate of HYV rice among farmers was assessed. The results indicated that, in spite of its much higher yield potential, HYV rice production was associated with lower TE and greater variability in yield. It was also found that TE had a significant positive influence on the adoption rate of HYV rice. In article IV, we estimated profit efficiency (PE) and profit loss between microfinance borrowers and non-borrowers using a sample selection framework, which provided a general framework for testing and taking into account sample selection in stochastic (profit) frontier analysis. After effectively correcting for selectivity bias, the mean PE of the microfinance borrowers and non-borrowers was estimated at 68% and 52%, respectively. This suggested that a considerable share of profits was lost due to profit inefficiencies in rice production. The results also demonstrated that access to microfinance contributes significantly to increasing PE and reducing profit loss per hectare of land. In article V, the effects of credit constraints on TE, allocative efficiency (AE) and CE were assessed while adequately controlling for sample selection bias. Confidence intervals were determined by the bootstrap method for both samples. The results indicated that differences in the average efficiency scores of credit-constrained and unconstrained farms were not statistically significant, although the average efficiencies tended to be higher in the group of unconstrained farms.
After effectively correcting for selectivity bias, household experience, number of dependents, off-farm income, farm size, access to on-farm training and yearly savings were found to be the main determinants of inefficiencies. In general, the results of the study revealed the existence of substantial technical, allocative and economic inefficiencies, as well as considerable profit inefficiencies. The results of the study suggested the need to streamline agricultural microfinance by the microfinance institutions (MFIs), donor agencies and government at all tiers. Moreover, formulating policies that ensure greater access to agricultural microfinance for smallholder farmers on a sustainable basis in the study areas, to enhance productivity and efficiency, has been recommended. Key Words: technical, allocative and economic efficiency; DEA; non-discretionary DEA; selection bias; bootstrapping; microfinance; Bangladesh.
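
For readers unfamiliar with DEA, the input-oriented CCR model behind such efficiency scores solves one small linear program per decision-making unit. The sketch below is a generic textbook formulation with toy data, not the study's non-discretionary model or its survey data:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0.
    X: (m, n) inputs and Y: (s, n) outputs of n decision-making units.
    Solves: min theta  s.t.  sum_j lam_j * x_j <= theta * x_j0,
                             sum_j lam_j * y_j >= y_j0,  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                          # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                  # input constraints
        A_ub.append(np.concatenate(([-X[i, j0]], X[i])))
        b_ub.append(0.0)
    for r in range(s):                  # output constraints, flipped to <=
        A_ub.append(np.concatenate(([0.0], -Y[r])))
        b_ub.append(-Y[r, j0])
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=bounds, method="highs")
    return float(res.x[0])

# toy example: one input, one output, two hypothetical farms
X = np.array([[2.0, 4.0]])
Y = np.array([[2.0, 2.0]])
scores = [dea_ccr_efficiency(X, Y, j) for j in range(2)]   # [1.0, 0.5]
```

Farm 0 produces the same output from half the input, so it defines the frontier (score 1.0) and farm 1 could radially contract its input to 50%.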

Relevance:

100.00%

Publisher:

Abstract:

In this paper we study the representation of KL-divergence minimization, in cases where integer sufficient statistics exist, using tools from polynomial algebra. We show that the estimation of parametric statistical models in this case can be transformed into solving a system of polynomial equations. In particular, we also study the case of the Kullback-Csiszár iteration scheme. We present implicit descriptions of these models and show that implicitization preserves specialization of the prior distribution. This result leads us to a Gröbner-basis method to compute an implicit representation of minimum-KL-divergence models.
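
A standard instance of minimum-KL-divergence estimation with integer sufficient statistics is the I-projection of a prior table onto fixed marginals, for which the Kullback-Csiszár iteration reduces to iterative proportional fitting. A minimal sketch on a toy 2×3 table (my own example, not taken from the paper):

```python
import numpy as np

def ipf(prior, row_marg, col_marg, n_iter=500):
    """Iterative proportional fitting: alternately rescale rows and
    columns.  The limit is the minimum-KL-divergence distribution with
    the given marginals relative to the prior (Csiszar's I-projection)."""
    P = np.asarray(prior, dtype=float).copy()
    for _ in range(n_iter):
        P *= (row_marg / P.sum(axis=1))[:, None]   # match row marginals
        P *= (col_marg / P.sum(axis=0))[None, :]   # match column marginals
    return P

prior = np.array([[2.0, 1.0, 1.0],
                  [1.0, 1.0, 2.0]])
P = ipf(prior, row_marg=np.array([0.4, 0.6]),
        col_marg=np.array([0.3, 0.3, 0.4]))
```

Each sweep is a KL projection onto one family of linear constraints; alternating them converges to the joint projection.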

Relevance:

100.00%

Publisher:

Abstract:

This is a review of the measurement of 1/f noise in certain classes of materials that have a wide range of potential applications, including metal films, semiconductors, metallic oxides and inhomogeneous systems such as composites. The review contains a basic introduction to the field and its theories and models, followed by a discussion of measurement methods. Specific examples of the application of noise spectroscopy in materials science are also discussed. (C) 2002 Elsevier Science Ltd. All rights reserved.
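
The defining signature of the noise reviewed here is a power spectral density falling as 1/f. As a quick illustration with synthetic data (not a measurement procedure from the review), one can synthesize flicker noise by spectral shaping and recover the slope from a Welch periodogram:

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(1)
n = 1 << 16

# shape a white spectrum: amplitude ~ f^(-1/2), hence power ~ 1/f
white = rng.standard_normal(n)
F = np.fft.rfft(white)
f = np.fft.rfftfreq(n)
F[1:] /= np.sqrt(f[1:])
F[0] = 0.0                       # drop the DC component
x = np.fft.irfft(F, n)

# estimate the PSD and fit the log-log slope (should be close to -1)
freqs, psd = welch(x, nperseg=4096)
mask = freqs > 0
slope = np.polyfit(np.log(freqs[mask]), np.log(psd[mask]), 1)[0]
```

In a real measurement the same slope fit is applied to the amplified, anti-aliased voltage noise of the sample, which is where the experimental care discussed in the review comes in.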

Relevance:

100.00%

Publisher:

Abstract:

Background: Levamisole, an imidazo[2,1-b]thiazole derivative, has been reported to be a potential antitumor agent. In the present study, we have investigated the mechanism of action of one of the recently identified analogues, 4a (2-benzyl-6-(4'-fluorophenyl)-5-thiocyanato-imidazo[2,1-b][1,3,4]thiadiazole). Materials and Methods: ROS production and the expression of various apoptotic proteins were measured following 4a treatment in leukemia cell lines. Tumor animal models were used to evaluate the effect of 4a, in comparison with Levamisole, on the progression of breast adenocarcinoma and on survival. Immunohistochemistry and western blotting studies were performed to understand the mechanism of 4a action both ex vivo and in vivo. Results: We determined the IC50 value of 4a in many leukemic and breast cancer cell lines and found CEM cells the most sensitive (IC50 of 5 µM). Results showed that 4a treatment leads to the accumulation of ROS. Western blot analysis showed upregulation of the pro-apoptotic proteins t-BID and BAX upon treatment with 4a. In addition, dose-dependent activation of p53 along with FAS and FAS-L, and cleavage of CASPASE-8, suggest that 4a induces the death-receptor-mediated apoptotic pathway in CEM cells. More importantly, we observed a reduction in tumor growth and a significant increase in survival upon oral administration of 4a (20 mg/kg, six doses) in mice. In comparison, 4a was found to be more potent than its parental analogue Levamisole in both ex vivo and in vivo studies. Further, immunohistochemistry and western blotting studies indicate that 4a treatment led to abrogation of tumor cell proliferation and activation of apoptosis by the extrinsic pathway in animal models as well. Conclusion: Thus, our results suggest that 4a could be used as a potent chemotherapeutic agent.

Relevance:

100.00%

Publisher:

Abstract:

N-gram language models and lexicon-based word recognition are popular methods in the literature for improving the recognition accuracy of online and offline handwritten data. However, very few works deal with the application of these techniques to online Tamil handwritten data. In this paper, we explore methods of developing symbol-level language models and a lexicon from a large Tamil text corpus, and their application to improving symbol and word recognition accuracies. On a test database of around 2000 words, we find that bigram language models improve symbol (3%) and word (8%) recognition accuracies, while lexicon methods offer much greater improvements (30%) in word recognition but depend heavily on choosing the right lexicon. For comparison with the lexicon- and language-model-based methods, we have also explored re-evaluation techniques that use expert classifiers to improve symbol and word recognition accuracies.
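
The bigram rescoring described above is, in essence, a Viterbi search that trades off classifier evidence against symbol-transition statistics. A small self-contained sketch with a toy two-symbol alphabet and illustrative probabilities (not the paper's Tamil models):

```python
import numpy as np

def viterbi(log_init, log_bigram, log_emit):
    """Best symbol sequence given per-position classifier log-likelihoods
    (log_emit, shape T x V) and a bigram LM (log_bigram, shape V x V,
    rows = previous symbol)."""
    T, V = log_emit.shape
    dp = log_init + log_emit[0]
    back = np.zeros((T, V), dtype=int)
    for t in range(1, T):
        scores = dp[:, None] + log_bigram + log_emit[t][None, :]
        back[t] = scores.argmax(axis=0)       # best predecessor per symbol
        dp = scores.max(axis=0)
    seq = [int(dp.argmax())]
    for t in range(T - 1, 0, -1):             # trace the best path back
        seq.append(int(back[t, seq[-1]]))
    return seq[::-1]

init = np.log([0.9, 0.1])
bigram = np.log([[0.9, 0.1],     # symbol 0 is rarely followed by symbol 1
                 [0.5, 0.5]])
emit = np.log([[0.9, 0.1],       # position 1: classifier is sure of symbol 0
               [0.45, 0.55]])    # position 2: classifier slightly favors 1
best = viterbi(init, bigram, emit)
```

Here the greedy per-position decision would read [0, 1], but the bigram prior outweighs the weak second-position evidence and the rescored output is [0, 0]; this is the mechanism behind the 3% symbol and 8% word gains quoted above.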

Relevance:

100.00%

Publisher:

Abstract:

The aim of this work is to enable the seamless transformation of product concepts into CAD models. This necessitates the availability of 3D product sketches. The present work concerns the intuitive generation of 3D strokes and intrinsic support for space sharing and articulation for the components of the product being sketched. Direct creation of 3D strokes in air lacks precision, stability and control. The inadequacy of proprioceptive feedback for the task is complemented in this work with stereo vision and haptics. Three novel methods, based on a pencil-paper interaction analogy, for the haptic rendering of strokes have been investigated. The pen-tilt-based rendering is simpler and was found to be more effective. For spatial conformity, two modes of constraint on the stylus movements, corresponding to motions on a control surface and in a control volume, have been studied using novel reactive and field-based haptic rendering schemes. The field-based haptics, which in effect creates an attractive force field near a surface, though non-realistic, provided highly effective support for the control-surface constraints. The efficacy of the reactive haptic rendering scheme for constrained environments has been demonstrated using scribble strokes. This can enable distributed collaborative 3D concept development. The notion of motion constraints defined through sketch strokes enables the intuitive generation of articulated 3D sketches and direct exploration of the motion annotations found in most product concepts. The work thus establishes that modeling of constraints is a central issue in 3D sketching.
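
The "attractive force field near a surface" can be pictured with a few lines of code. The sketch below is my own toy force law (a spring-like pull toward the plane z = 0 that fades out at a capture distance), not the authors' renderer; it only illustrates the idea of a field that guides the stylus onto a control surface.

```python
import numpy as np

def attraction_force(p, k=50.0, d=0.01):
    """Force (N) on a stylus at position p (m) from a field that pulls
    it toward the control surface z = 0.  The pull acts only within the
    capture distance d and fades to zero at |z| = d, so there is no
    force discontinuity at the field boundary."""
    z = p[2]
    if abs(z) >= d:
        return np.zeros(3)
    return np.array([0.0, 0.0, -k * z * (1.0 - abs(z) / d)])
```

Inside the field the force always points toward the surface (negative for z > 0, positive for z < 0), which is what keeps a wandering stroke conforming to the control surface.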

Relevance:

100.00%

Publisher:

Abstract:

The formulation of higher-order structural models and their discretization using the finite element method are difficult owing to their complexity, especially in the presence of non-linearities. In this work a new algorithm for automating the formulation and assembly of hyperelastic higher-order structural finite elements is developed. A hierarchic series of kinematic models is proposed for modeling structures with special geometries, and the algorithm is formulated to automate the study of this class of higher-order structural models. The algorithm developed in this work sidesteps the need for an explicit derivation of the governing equations for the individual kinematic modes. Using a novel procedure involving a nodal degree-of-freedom-based automatic assembly algorithm, automatic differentiation and higher-dimensional quadrature, the relevant finite element matrices are computed directly from the variational statement of elasticity and the higher-order kinematic model. Another significant feature of the proposed algorithm is that natural boundary conditions are implicitly handled for arbitrary higher-order kinematic models. The validity of the algorithm is illustrated with examples involving linear elasticity and hyperelasticity. (C) 2013 Elsevier Inc. All rights reserved.
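
The core idea, obtaining finite element matrices by differentiating an energy functional rather than deriving them by hand, can be shown in a few lines. The toy sketch below substitutes a central-difference Hessian for true automatic differentiation (my own example, not the paper's algorithm) and recovers the classical stiffness matrix of a linear 1D bar element from its strain energy:

```python
import numpy as np

def numeric_hessian(f, u, h=1e-5):
    """Central-difference Hessian of a scalar energy f at dofs u;
    stands in here for automatic differentiation."""
    n = len(u)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(u + ei + ej) - f(u + ei - ej)
                       - f(u - ei + ej) + f(u - ei - ej)) / (4.0 * h * h)
    return H

EA, L = 2.0e5, 0.5     # axial rigidity (N) and element length (m)

def bar_energy(u):
    """Strain energy of a two-node linear bar: U = (EA / 2L) (u2 - u1)^2."""
    return 0.5 * (EA / L) * (u[1] - u[0]) ** 2

K = numeric_hessian(bar_energy, np.zeros(2))   # stiffness = Hessian of U
```

The result matches the textbook matrix (EA/L)·[[1, -1], [-1, 1]]; the same recipe applied to a hyperelastic energy yields the tangent stiffness at the current state, with no hand-derived governing equations.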

Relevance:

100.00%

Publisher:

Abstract:

Ice volume estimates are crucial for assessing the water reserves stored in glaciers. Due to its large glacier coverage, such estimates are of particular interest for the Himalayan-Karakoram (HK) region. In this study, different existing methodologies are used to estimate the ice reserves: three area-volume relations, one slope-dependent volume estimation method, and two ice-thickness distribution models are applied to a recent, detailed, and complete glacier inventory of the HK region, spanning the period 2000-2010 and revealing an ice coverage of 40 775 km². An uncertainty and sensitivity assessment is performed to investigate the influence of the observed glacier area and important model parameters on the resulting total ice volume. Results of the two ice-thickness distribution models are validated against local ice-thickness measurements at six glaciers. The resulting ice volumes for the entire HK region range from 2955 to 4737 km³, depending on the approach. This range is lower than most previous estimates. Results from the ice-thickness distribution models and the slope-dependent thickness estimations agree well with the measured local ice thicknesses; however, total volume estimates from area-related relations are larger than those from the other approaches. The study provides evidence of the significant effect of the selected method on the results and underlines the importance of a careful and critical evaluation.
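
The simplest of the approaches compared above is the area-volume scaling relation V = c·A^γ. A minimal sketch with illustrative constants commonly quoted in the scaling literature (c = 0.034, γ = 1.375 for V in km³ and A in km²; the study's own parameter choices may differ):

```python
def glacier_volume_km3(area_km2, c=0.034, gamma=1.375):
    """Volume-area scaling V = c * A**gamma.  The default c and gamma
    are illustrative literature values, not calibrated in this study."""
    return c * area_km2 ** gamma

# the relation is applied glacier by glacier and the volumes summed
toy_areas_km2 = [0.5, 3.2, 58.0, 410.0]      # hypothetical inventory entries
total_km3 = sum(glacier_volume_km3(a) for a in toy_areas_km2)
```

Because γ > 1, applying the relation to the aggregate area instead of per glacier would overestimate the total, one reason inventory completeness matters for such estimates.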

Relevance:

100.00%

Publisher:

Abstract:

Aerosol loading over the South Asian region has the potential to affect monsoon rainfall, Himalayan glaciers and regional air quality, with implications for the billions living in this region. While field campaigns and network observations provide primary data, they tend to be location- or season-specific. Numerical models are useful for regionalizing such location-specific data. Studies have shown that numerical models underestimate the aerosol loading over the Indian region, mainly due to shortcomings related to meteorology and the emission inventories used. In this context, we have evaluated the performance of two such chemistry-transport models, WRF-Chem and SPRINTARS, over an India-centric domain. The models differ in many aspects, including physical domain, horizontal resolution and meteorological forcing. Despite these differences, both models simulated similar spatial patterns of black carbon (BC) mass concentration (with a spatial correlation of 0.9 with each other) and gave reasonable estimates of its magnitude, though both under-estimated it vis-à-vis the observations. While the emissions are lower (higher) in SPRINTARS (WRF-Chem), overestimation of wind parameters in WRF-Chem caused the simulated concentrations to be similar in the two models. Additionally, we quantified the under-estimation of anthropogenic BC emissions in the inventories used in these two models and in three other widely used emission inventories. Our analysis indicates that all these inventories underestimate BC emissions over India by a factor ranging from 1.5 to 2.9. We have also studied the model simulations of aerosol optical depth (AOD) over the Indian region. The models differ significantly in their simulations of AOD, with WRF-Chem agreeing better with satellite observations of AOD as far as the spatial pattern is concerned. It is important to note that, in addition to BC, dust can also contribute significantly to AOD.
The models differ in their simulations of the spatial pattern of mineral dust over the Indian region. We find that both meteorological forcing and emission formulation contribute to these differences. Since AOD is a column-integrated parameter, the description of vertical profiles in the two models could also be a contributing factor, especially since elevated aerosol layers are often observed over the Indian region. Additionally, differences in the prescription of the optical properties of BC between the models appear to affect the AOD simulations. We also compared the simulation of sea-salt concentration in the two models and found that WRF-Chem underestimated it vis-à-vis SPRINTARS; differences in near-surface oceanic wind speeds appear to be the main source of this discrepancy. In spite of these differences, we note that there are similarities in the models' simulations of the spatial patterns of various aerosol species (with each other and with observations), and hence the models can be valuable tools for aerosol-related studies over the Indian region. Better estimation of emission inventories could improve aerosol-related simulations. (C) 2015 Elsevier Ltd. All rights reserved.
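
The 0.9 spatial correlation quoted above is a pattern correlation between gridded fields. For reference, the metric is simply a Pearson correlation over flattened grids (generic sketch with a toy field, not the study's analysis code):

```python
import numpy as np

def pattern_correlation(field_a, field_b):
    """Pearson correlation between two gridded fields, flattened to 1-D."""
    a = np.ravel(field_a).astype(float)
    b = np.ravel(field_b).astype(float)
    a -= a.mean()
    b -= b.mean()
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

x = np.arange(12.0).reshape(3, 4)            # toy 3 x 4 gridded field
r_affine = pattern_correlation(x, 2.0 * x + 1.0)   # affine rescaling: r = 1
```

Note that the metric is insensitive to a uniform bias or scale factor, which is why two models can share a 0.9 pattern correlation while both underestimate the absolute concentrations.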

Relevance:

100.00%

Publisher:

Abstract:

This paper intends to provide an overview of the rich legacy of models and theories that have emerged in the last fifty years of the relatively young discipline of design research, and identifies some of the major areas of further research. It addresses the following questions: What are the major theories and models of design? How are design theory and model defined, and what is their purpose? What are the criteria they must satisfy to be considered a design theory or model? How should a theory or model of design be evaluated or validated? What are the major directions for further research?

Relevance:

100.00%

Publisher:

Abstract:

The solubilities of two lipid derivatives, geranyl butyrate and 10-undecen-1-ol, in SCCO2 (supercritical carbon dioxide) were measured at different operating conditions of temperature (308.15 to 333.15 K) and pressure (10 to 18 MPa). The solubilities (in mole fraction) ranged from 2.1 × 10⁻³ to 23.2 × 10⁻³ for geranyl butyrate and from 2.2 × 10⁻³ to 25.0 × 10⁻³ for 10-undecen-1-ol. The solubility data showed retrograde behavior over the pressure and temperature range investigated. Various combinations of association and solution theory with different activity coefficient models were developed. The experimental solubilities of 21 liquid solutes, along with those of geranyl butyrate and 10-undecen-1-ol, were correlated using both the newly derived models and existing models. The average deviation of the correlation with the new models was below 15%.
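
The "average deviation below 15%" reported in such correlation studies is typically the average absolute relative deviation (AARD). A short sketch of the metric, with hypothetical numbers (my assumption about the exact definition used):

```python
import numpy as np

def aard_percent(y_exp, y_calc):
    """Average absolute relative deviation, in percent:
    AARD = (100 / N) * sum(|y_calc - y_exp| / y_exp)."""
    y_exp = np.asarray(y_exp, dtype=float)
    y_calc = np.asarray(y_calc, dtype=float)
    return float(100.0 * np.mean(np.abs(y_calc - y_exp) / y_exp))

# mole-fraction solubilities: hypothetical experimental vs. correlated values
y_exp = [2.1e-3, 8.4e-3, 23.2e-3]
y_calc = [2.3e-3, 8.0e-3, 21.5e-3]
dev = aard_percent(y_exp, y_calc)
```

A model whose correlated values stay within 15% of each measured solubility, point by point, meets the quality criterion quoted above.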