941 results for Cadastral updating


Relevance:

10.00%

Publisher:

Abstract:

Methods of dynamic modelling and analysis of structures, for example the finite element method, are well developed. However, it is generally agreed that accurate modelling of complex structures is difficult, and for critical applications it is necessary to validate or update the theoretical models using data measured from actual structures. Techniques for identifying the parameters of linear dynamic models using vibration test data have attracted considerable interest recently. However, no method has gained general acceptance, owing to a number of difficulties, mainly: (i) the incomplete number of vibration modes that can be excited and measured, (ii) the incomplete number of coordinates that can be measured, (iii) inaccuracy in the experimental data, and (iv) inaccuracy in the model structure. This thesis reports a new approach to updating the parameters of a finite element model, as well as of a lumped-parameter model with a diagonal mass matrix. The structure and its theoretical model are equally perturbed by adding mass or stiffness, and an incomplete set of eigen-data is measured. The parameters are then identified by iterative updating of the initial estimates, via sensitivity analysis, using either the eigenvalues alone or both the eigenvalues and eigenvectors of the structure before and after perturbation. It is shown that, with a suitable choice of perturbing coordinates, the exact parameters can be identified if the data and the model structure are exact. The theoretical basis of the technique is presented. To cope with measurement errors and possible inaccuracies in the model structure, a well-known Bayesian approach is used to minimize the least-squares difference between the updated and the initial parameters. The eigen-data of the structure with added mass or stiffness are also determined from the frequency response data of the unmodified structure by a structural modification technique; thus, mass or stiffness does not have to be added physically.
The mass-stiffness addition technique is demonstrated by simulation examples and laboratory experiments on beams and an H-frame.
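The iterative sensitivity-based updating described above can be sketched on a toy two-degree-of-freedom spring-mass system. This is a generic illustration with made-up parameters, not the thesis's mass/stiffness-perturbation algorithm: the model eigenvalues are driven toward the "measured" ones by repeatedly solving a linearized least-squares problem built from finite-difference sensitivities.

```python
import numpy as np

def eigenvalues(k):
    # Known diagonal mass matrix; stiffness depends on parameters k1, k2.
    K = np.array([[k[0] + k[1], -k[1]],
                  [-k[1],        k[1]]])
    return np.sort(np.linalg.eigvalsh(K))   # M is the identity here

k_true = np.array([4.0, 2.0])                # "measured" structure
lam_meas = eigenvalues(k_true)

k = np.array([3.0, 3.0])                     # initial analytical estimate
for _ in range(20):
    lam = eigenvalues(k)
    # Finite-difference sensitivity matrix d(lambda)/d(k)
    S = np.empty((2, 2))
    h = 1e-6
    for j in range(2):
        kp = k.copy(); kp[j] += h
        S[:, j] = (eigenvalues(kp) - lam) / h
    # Least-squares parameter correction from the eigenvalue residual
    dk, *_ = np.linalg.lstsq(S, lam_meas - lam, rcond=None)
    k = k + dk

print(np.round(k, 6))   # converges to approximately [4., 2.]
```

With exact data and an exact model structure the iteration recovers the exact parameters, mirroring the claim in the abstract; with noisy data one would instead stop at a Bayesian least-squares compromise.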

Relevance:

10.00%

Publisher:

Abstract:

The research concerns the development and application of an analytical computer program, SAFE-ROC, that models the material and structural behaviour of a slender reinforced concrete column that is part of an overall structure and is subjected to elevated temperatures as a result of exposure to fire. The analysis approach used in SAFE-ROC is non-linear. The computer calculations take account of restraint and continuity, and of the interaction of the column with the surrounding structure during the fire. Within a given time step, an iterative approach is used to find a deformed shape for the column that results in equilibrium between the forces associated with the external loads and the internal stresses and degradation. Non-linear geometric effects are taken into account by updating the geometry of the structure as it deforms. The structural response program SAFE-ROC includes a total strain model which takes account of the compatibility of strain due to temperature and loading. The total strain model represents a constitutive law governing the material behaviour of concrete and steel. The material behaviour models for concrete and steel take account of the dimensional changes caused by temperature differentials and of the changes in mechanical properties with temperature. Non-linear stress-strain laws are used that allow for loading beyond the strain corresponding to the peak stress of the concrete stress-strain relation, and that model the inelastic deformation associated with unloading of the steel stress-strain relation. The cross-section temperatures caused by the fire environment are obtained by a preceding non-linear thermal analysis using the computer program FIRES-T.

Relevance:

10.00%

Publisher:

Abstract:

Classification of metamorphic rocks is normally carried out using a poorly defined, subjective classification scheme, making this an area in which many undergraduate geologists experience difficulties. An expert system to assist in such classification is presented, capable of classifying rocks and also giving further details about a particular rock type. A mixed knowledge representation is used, with frame, semantic and production-rule systems available. Classification in this domain requires that different facets of a rock be classified. To implement this, rocks are represented by 'context' frames with slots representing each facet. Slots are satisfied by calling a pre-defined ruleset to carry out the necessary inference. The inference is handled by an interpreter which uses a dependency-graph representation for the propagation of evidence. Uncertainty is handled using a combination of the MYCIN certainty factor system and the Dempster-Shafer range mechanism. This allows for positive and negative reasoning, with rules capable of representing necessity and sufficiency of evidence, whilst also allowing the implementation of an alpha-beta pruning algorithm to guide question selection during inference. The system also utilizes a semantic-net-type structure that allows the expert to encode simple relationships between terms, enabling rules to be written at a sensible level of abstraction. Using frames to represent rock types where subclassification is possible allows the knowledge base to be built in a modular fashion, with subclassification frames defined only once the higher level of classification is functioning. Rulesets can similarly be added in modular fashion, the individual rules being essentially declarative, which allows simple updating and maintenance.
The knowledge base so far developed for metamorphic classification serves to demonstrate the performance of the interpreter design, whilst also moving some way towards providing a useful assistant to the non-expert metamorphic petrologist. The system demonstrates the possibilities for a fully developed knowledge base to handle the classification of igneous, sedimentary and metamorphic rocks. The current knowledge base and interpreter have been evaluated by potential users and experts. The results of the evaluation show that the system performs to an acceptable level and should be of use as a tool both for undergraduates and for researchers from outside the metamorphic petrography field.
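The MYCIN certainty-factor system mentioned above combines evidence with a simple algebra. A minimal sketch of the standard combination rule follows (the Dempster-Shafer range mechanism the system also uses is omitted here):

```python
def combine_cf(cf1, cf2):
    """MYCIN-style combination of two certainty factors, each in [-1, 1]."""
    if cf1 >= 0 and cf2 >= 0:                 # both pieces of evidence positive
        return cf1 + cf2 * (1 - cf1)
    if cf1 <= 0 and cf2 <= 0:                 # both negative
        return cf1 + cf2 * (1 + cf1)
    # Mixed signs: conflicting evidence partially cancels.
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Two positive findings reinforce each other:
print(round(combine_cf(0.6, 0.5), 3))    # 0.8
# Conflicting evidence: (0.6 - 0.4) / (1 - 0.4)
print(round(combine_cf(0.6, -0.4), 3))   # 0.333
```

Because the combination is commutative and associative over same-sign evidence, rules can fire in any order, which suits the dependency-graph evidence propagation described in the abstract.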

Relevance:

10.00%

Publisher:

Abstract:

The current study examined the role of executive function in the retrieval of specific autobiographical memories in older adults, with regard to the control of emotion during retrieval. Older and younger adults retrieved memories of specific events in response to emotionally positive, negative and neutral word cues. The contributions of the inhibitory and updating elements of executive function to variance in autobiographical specificity were assessed to determine the processes involved in the commonly found age-related reduction in specificity. A negative relationship between age and specificity was found only in retrieval to neutral cues. Alternative explanations of this age-related preservation of the specificity of emotional recall are explored, within the context of the control of emotion in the self-memory system and of preserved emotional processing and the positivity effect in older adults. The pattern of relationships suggests updating, rather than inhibition, as the source of the age-related reduction in specificity, but that emotional processing (particularly of positively valenced memories) is not influenced by age-related variance in executive control. The tendency of older adults to focus on positive material may thus act as a buffer against the detrimental effects of reduced executive function capacity on autobiographical retrieval, representing a possible target for interventions to improve the specificity of autobiographical memory retrieval in older adults.

Relevance:

10.00%

Publisher:

Abstract:

Requirements-aware systems address the need to reason about uncertainty at runtime to support adaptation decisions, by representing quality-of-service (QoS) requirements for service-based systems (SBS) with precise values in a run-time queryable model specification. However, current approaches do not support updating the specification to reflect changes in the service market, such as newly available services or improved QoS of existing ones. Thus, even if the specification models reflect requirements that were acceptable at design time, they may become obsolete and miss opportunities for system improvement by self-adaptation. This article proposes distinguishing between "abstract" and "concrete" specification models: the former consists of linguistic variables (e.g. "fast") agreed upon at design time, and the latter consists of precise numeric values (e.g. "2 ms") that are dynamically calculated at run-time, thus incorporating up-to-date QoS information. If and when the freshly calculated concrete specifications are no longer satisfied by the current service configuration, an adaptation is triggered. The approach was validated using four simulated SBS that use services from a previously published, real-world dataset; in all cases, the system was able to detect unsatisfied requirements at run-time and trigger suitable adaptations. Ongoing work focuses on policies that determine when specifications are recalculated. This approach will allow engineers to build SBS that are protected against market-caused obsolescence of their requirements specifications. © 2012 IEEE.
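The abstract/concrete split can be illustrated with a toy sketch in which a linguistic variable is concretized from current market QoS data. The percentile mapping, the names, and the latency figures are assumptions for illustration, not the paper's calibration:

```python
def concretize(linguistic, observed_latencies_ms):
    """Map an 'abstract' linguistic variable to a 'concrete' numeric bound
    using the QoS of services currently available on the market."""
    ranked = sorted(observed_latencies_ms)
    quantile = {"fast": 0.25, "average": 0.5, "slow": 0.75}[linguistic]
    return ranked[int(quantile * (len(ranked) - 1))]

# Latencies (ms) of currently available services in the market:
market = [2, 3, 5, 8, 13, 21, 34, 55, 89]

threshold = concretize("fast", market)   # "fast" currently means <= 5 ms
current_service_latency = 8
if current_service_latency > threshold:
    print("requirement no longer satisfied -> trigger adaptation")
```

As the market improves, re-running `concretize` tightens the threshold automatically, which is exactly how the design-time meaning of "fast" avoids becoming obsolete.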

Relevance:

10.00%

Publisher:

Abstract:

We investigate the problem of obtaining a dense reconstruction in real time from a live video stream. In recent years, multi-view stereo (MVS) has received considerable attention and a number of methods have been proposed. However, most methods operate under the assumption of a relatively sparse set of still images as input and unlimited computation time. Video-based MVS has received less attention, despite the fact that video sequences offer significant benefits in terms of the usability of MVS systems. In this paper we propose a novel video-based MVS algorithm that is suitable for real-time, interactive 3D modeling with a hand-held camera. The key idea is a per-pixel, probabilistic depth estimation scheme that updates posterior depth distributions with every new frame. The current implementation is capable of updating 15 million distributions per second. We evaluate the proposed method against the state-of-the-art real-time MVS method and show an improvement in terms of accuracy. © 2011 Elsevier B.V. All rights reserved.
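The per-pixel updating idea can be illustrated by a minimal recursive Gaussian depth filter: each pixel keeps a Gaussian posterior over depth that is fused with every new frame's measurement. The paper's actual parametric model is richer (it also accounts for outlier measurements), so this is only a sketch with invented noise levels:

```python
import numpy as np

np.random.seed(0)                 # deterministic demo

H, W = 2, 2
mu = np.full((H, W), 1.0)         # posterior mean depth (metres)
var = np.full((H, W), 1.0)        # posterior variance (broad prior)

def update(mu, var, z, var_z):
    """Fuse a new per-pixel depth measurement z (variance var_z)."""
    new_var = 1.0 / (1.0 / var + 1.0 / var_z)
    new_mu = new_var * (mu / var + z / var_z)
    return new_mu, new_var

for _ in range(30):               # 30 frames of noisy depth measurements
    z = 2.0 + 0.1 * np.random.randn(H, W)   # true depth is 2.0 m
    mu, var = update(mu, var, z, 0.01)

print(np.round(mu, 2))            # posterior concentrates near 2.0
```

Because the update is a couple of arithmetic operations per pixel, vectorizing it over the whole image is what makes throughputs of millions of distribution updates per second plausible.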

Relevance:

10.00%

Publisher:

Abstract:

A method is proposed for selecting a suitable subspace for discriminating signal components through an oblique projection. The selection criterion is based on the consistency principle introduced by Unser and Aldroubi and extended by Eldar. An effective implementation of this principle for the purpose of subspace selection is achieved by updating the dual vectors that yield the corresponding oblique projector. © 2007 IEEE.
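For readers unfamiliar with oblique projectors, a small numeric sketch may help. A projector of the form E = A (Vᵀ A)⁻¹ Vᵀ acts as the identity on span(A) (the signal subspace) and annihilates every vector orthogonal to span(V) (the measurement subspace). The subspaces below are illustrative, not the paper's construction:

```python
import numpy as np

def oblique_projector(A, V):
    """Oblique projector with range span(A) and null space orthogonal to span(V)."""
    return A @ np.linalg.inv(V.T @ A) @ V.T

A = np.array([[1.0], [0.0], [0.0]])   # signal subspace: the x-axis
V = np.array([[1.0], [1.0], [0.0]])   # measurement subspace
E = oblique_projector(A, V)

print(E @ A)                              # vectors in span(A) pass unchanged
print(E @ np.array([1.0, -1.0, 0.0]))     # vectors orthogonal to V are removed
assert np.allclose(E @ E, E)              # idempotent, as any projector must be
```

Updating the dual vectors (here, the columns of V) changes which components are discriminated while keeping the same projector structure, which is the lever the selection criterion operates on.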

Relevance:

10.00%

Publisher:

Abstract:

Projects exposed to an uncertain environment must be adapted to deal with the effective integration of various planning elements and the optimization of project parameters. Time, cost, and quality are the prime objectives of a project that need to be optimized to fulfil the owner's goal. In an uncertain environment, there exist many other conflicting objectives that may also need to be optimized, characterized by varying degrees of conflict. Moreover, an uncertain environment causes several changes in the project plan throughout its life, demanding that the plan be totally flexible. Goal programming (GP), a multiple-criteria decision-making technique, offers a good solution to this project planning problem. Here, the planning problem is considered from the owner's perspective, which leads to classifying the project down to the activity level. GP is applied separately at each level, and the formulated models are integrated through information flow. The flexibility and adaptability of the models lie in the ease of updating the model parameters at the required level, by changing priorities and/or constraints and transmitting the information to the other levels. The hierarchical model automatically provides integration among the various elements of planning. The proposed methodology is applied in this paper to planning a petroleum pipeline construction project, and its effectiveness is demonstrated.
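A minimal weighted goal-programming sketch, with invented numbers rather than the paper's pipeline-project data: a crashing level x trades a time goal against a cost goal via non-negative deviation variables, and changing the weights (priorities) re-steers the plan without reformulating the model.

```python
from scipy.optimize import linprog

# time(x) = 10 - x weeks, goal: <= 8; cost(x) = 100 + 20x, goal: <= 120.
# Variables: [x, d_time, d_cost]; minimize 30*d_time + 1*d_cost
# (time deviations are penalized far more heavily, i.e. higher priority).
c = [0, 30, 1]
A_ub = [[-1, -1, 0],    # 10 - x <= 8 + d_time    ->  -x - d_time <= -2
        [20, 0, -1]]    # 100 + 20x <= 120 + d_cost -> 20x - d_cost <= 20
b_ub = [-2, 20]
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, 5), (0, None), (0, None)])
x, d_time, d_cost = res.x
print(round(x, 3), round(d_time, 3), round(d_cost, 3))
# x = 2: the time goal is met exactly and cost overruns its goal by 20.
```

Updating a priority is just editing the objective weights, which mirrors the abstract's claim that flexibility lies in changing priorities and/or constraints at the required level.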

Relevance:

10.00%

Publisher:

Abstract:

In this paper we present a data structure which improves the average complexity of the operations of updating, and of a certain type of retrieving, information on an array. The data structure is derived from a particular family of digraphs satisfying conditions that make them solutions to this problem.
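The abstract does not name its structure, which is digraph-based; purely as a familiar point of comparison for the same update/retrieve trade-off, a Fenwick (binary indexed) tree supports both point updates and prefix-sum retrieval on an array in O(log n) each, instead of the O(1)/O(n) split of a plain array:

```python
class Fenwick:
    """Binary indexed tree over an array of n elements (1-based indices)."""
    def __init__(self, n):
        self.n = n
        self.t = [0] * (n + 1)

    def update(self, i, delta):      # add delta to a[i] in O(log n)
        while i <= self.n:
            self.t[i] += delta
            i += i & -i              # step to the next responsible node

    def prefix_sum(self, i):         # sum of a[1..i] in O(log n)
        s = 0
        while i > 0:
            s += self.t[i]
            i -= i & -i              # strip the lowest set bit
        return s

f = Fenwick(8)
for i, v in enumerate([5, 3, 7, 1, 2, 4, 6, 8], start=1):
    f.update(i, v)
print(f.prefix_sum(4))   # 5 + 3 + 7 + 1 = 16
```

Any structure claiming better average complexity for this pair of operations is competing against this kind of logarithmic baseline.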

Relevance:

10.00%

Publisher:

Abstract:

GraphChi is the first reported disk-based graph engine that can efficiently handle billion-scale graphs on a single PC. GraphChi is able to execute several advanced data mining, graph mining and machine learning algorithms on very large graphs. With its novel technique of parallel sliding windows (PSW), which loads subgraphs from disk into memory to update vertices and edges, it achieves data-processing performance close to, and sometimes better than, that of mainstream distributed graph engines. The GraphChi authors noted that its memory is not effectively utilized on large datasets, which leads to suboptimal computational performance. In this paper, motivated by the concepts of 'pin' from TurboGraph and 'ghost' from GraphLab, we propose a new memory utilization mode for GraphChi, called Part-in-memory mode, to improve the performance of GraphChi algorithms. The main idea is to pin a fixed part of the data in memory during the whole computing process. Part-in-memory mode was implemented with only about 40 additional lines of code in the original GraphChi engine. Extensive experiments were performed with large real datasets (including a Twitter graph with 1.4 billion edges). The preliminary results show that the Part-in-memory memory-management approach reduces GraphChi running time by up to 60% for the PageRank algorithm. Interestingly, it is found that pinning a larger portion of data in memory does not always lead to better performance when the whole dataset cannot fit in memory: there exists an optimal portion of data to keep in memory to achieve the best computational performance.
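The pinning idea can be caricatured in a few lines: shard the vertex data, keep a fixed subset of shards resident in memory for the whole computation, and fetch the rest from disk on every pass. The shard layout, sizes and names below are illustrative assumptions, not GraphChi's actual on-disk format:

```python
import os
import pickle
import tempfile

# Four shards of 100 vertex values each (toy stand-in for graph data).
shards = {i: {v: 0.0 for v in range(i * 100, (i + 1) * 100)} for i in range(4)}

tmp = tempfile.mkdtemp()
for i, data in shards.items():           # spill every shard to disk first
    with open(os.path.join(tmp, f"shard{i}.pkl"), "wb") as fh:
        pickle.dump(data, fh)

PINNED = {0, 1}                          # shards kept resident throughout
resident = {i: shards[i] for i in PINNED}

def load_shard(i):
    """Return shard i: from memory if pinned, otherwise from disk."""
    if i in resident:                    # pinned: no disk I/O at all
        return resident[i]
    with open(os.path.join(tmp, f"shard{i}.pkl"), "rb") as fh:
        return pickle.load(fh)

hits = sum(1 for i in range(4) if i in resident)
print(f"{hits}/4 shards served from memory this pass")
```

The abstract's finding that pinning more is not always better corresponds to the point where the pinned portion starts evicting the working set the sliding window itself needs.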

Relevance:

10.00%

Publisher:

Abstract:

In this paper RDPPlan, a model for planning with quantitative resources specified as numerical intervals, is presented. Nearly all existing models of planning with resources require exact values to be specified when updating resources modified by action execution; in other words, these models cannot deal with more realistic situations in which resource quantities are not completely known but are bounded by intervals. The RDPPlan model allows domains more closely tailored to the real world, where preconditions and effects over quantitative resources can be specified as intervals of values; in addition, mixed logical/quantitative and purely numerical goals can be posed. RDPPlan is based on non-directional search over a planning graph, like DPPlan, from which it derives; it uses propagation rules which have been appropriately extended to the management of resource intervals. The propagation rules extended with resources must satisfy invariant properties over the planning graph, which have been proven by the authors and guarantee the correctness of the approach. An implementation of the RDPPlan model is described, with search strategies specifically developed for interval resources.
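One plausible reading of interval-valued preconditions and effects (not necessarily RDPPlan's exact semantics) can be sketched with simple interval arithmetic: a precondition holds only if it is guaranteed by the current bounds, and an effect updates the bounds, widening the uncertainty accordingly.

```python
def satisfies(resource, required):
    """Precondition: the resource's interval must lie within the required one,
    so the condition holds for every value the resource might take."""
    (lo, hi), (rlo, rhi) = resource, required
    return rlo <= lo and hi <= rhi

def apply_effect(resource, delta):
    """Effect: interval addition of the (uncertain) change."""
    (lo, hi), (dlo, dhi) = resource, delta
    return (lo + dlo, hi + dhi)

fuel = (40, 50)                         # somewhere between 40 and 50 units
assert satisfies(fuel, (30, 60))        # guaranteed enough fuel for the action
fuel = apply_effect(fuel, (-25, -20))   # the action consumes 20 to 25 units
print(fuel)                             # (15, 30)
```

Requiring containment (rather than mere overlap) is the conservative choice: a plan is accepted only if it works for every value consistent with the intervals.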

Relevance:

10.00%

Publisher:

Abstract:

The first essay developed a respondent model of Bayesian updating for a double-bound dichotomous choice (DB-DC) contingent valuation methodology. I demonstrated by way of data simulations that current DB-DC identifications of true willingness-to-pay (WTP) may often fail given this respondent Bayesian updating context. Further simulations demonstrated that a simple extension of current DB-DC identifications, derived explicitly from the Bayesian updating behavioural model, can correct for much of the WTP bias. Additional results provided caution against viewing respondents as acting strategically toward the second bid. Finally, an empirical application confirmed the simulation outcomes. The second essay applied a hedonic property value model to a unique water quality (WQ) dataset for a year-round, urban, coastal housing market in South Florida, and found evidence that various WQ measures affect waterfront housing prices in this setting. However, the results indicated that this relationship is not consistent across any of the six particular WQ variables used, and is furthermore dependent upon the specific descriptive statistic employed to represent the WQ measure in the empirical analysis. These results continue to underscore the need to better understand both the WQ measure and the statistical form of it that homebuyers use in making their purchase decision. The third essay addressed a limitation of existing hurricane evacuation modeling by developing a dynamic model of hurricane evacuation behavior. A household's evacuation decision was framed as an optimal stopping problem in which, at every potential evacuation time period prior to actual hurricane landfall, the household's optimal choice is either to evacuate or to wait one more period for a revised hurricane forecast.
A hypothetical two-period model of evacuation and a realistic multi-period model of evacuation, incorporating actual forecast and evacuation cost data for my designated Gulf of Mexico region, were developed for the dynamic analysis. Results from the multi-period model were calibrated against existing evacuation timing data from a number of hurricanes. Given the calibrated dynamic framework, a number of policy questions that plausibly affect the timing of household evacuations were analyzed, and a deeper understanding of existing empirical outcomes regarding the timing of the evacuation decision was achieved.
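The respondent-updating idea from the first essay can be sketched with a normal-normal Bayesian update, treating the first bid as a noisy signal of the good's value, so the respondent answers the follow-up bid from a posterior rather than the original WTP. All numbers here are illustrative, not the essay's estimates:

```python
def posterior_wtp(prior_mean, prior_var, bid, signal_var):
    """Normal-normal Bayesian update of the respondent's WTP,
    treating the observed bid as a noisy signal of the good's value."""
    w = prior_var / (prior_var + signal_var)       # weight on the signal
    mean = prior_mean + w * (bid - prior_mean)
    var = prior_var * signal_var / (prior_var + signal_var)
    return mean, var

wtp_mean, wtp_var = 50.0, 100.0
first_bid = 80.0
accepts_first = wtp_mean >= first_bid              # "no" to the $80 bid

# The high first bid nonetheless shifts the respondent's beliefs upward:
wtp_mean, wtp_var = posterior_wtp(wtp_mean, wtp_var, first_bid, 300.0)

second_bid = 40.0                                  # lower follow-up after a "no"
print(round(wtp_mean, 2), wtp_mean >= second_bid)  # 57.5 True
```

It is exactly this drift of the WTP between the two bids that makes standard DB-DC identification, which assumes a fixed WTP, biased.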

Relevance:

10.00%

Publisher:

Abstract:

Oxygen and carbon isotopic data were produced on the benthic foraminiferal taxa Cibicidoides and Planulina from 25 new piston cores, gravity cores, and multicores from the Brazil margin. The cores span water depths from about 400 to 3000 m and intersect the major water masses in this region. These new data fill a critical gap in the South Atlantic Ocean and provide the motivation for updating the classic glacial western Atlantic δ13C transect of Duplessy et al. (1988). The distribution of δ13C of ΣCO2 requires the presence of three distinct water masses in the glacial Atlantic Ocean: a shallow (~1000 m) southern-source water mass with an end-member δ13C value of about 0.3-0.5 per mil VPDB; a middepth (~1500 m) northern-source water mass with an end-member value of about 1.5 per mil; and a deep (>2000 m) southern-source water mass with an end-member value of less than -0.2 per mil, and perhaps as low as the -0.9 per mil values observed in the South Atlantic sector of the Southern Ocean (Ninnemann and Charles, 2002, doi:10.1016/S0012-821X(02)00708-2). The origins of these water masses are supported by the meridional gradients in benthic foraminiferal δ18O. A revised glacial section of deep-water δ13C documents the positions of, and gradients among, these end-member intermediate and deep water masses. The large property gradients, in the presence of strong vertical mixing, can only be maintained by a vigorous overturning circulation.

Relevance:

10.00%

Publisher:

Abstract:

Energy is a vital resource for social and economic development. In the present scenario, the search for alternative energy sources has become fundamental, especially after the oil crises between 1973 and 1979, the Chernobyl nuclear accident in 1986, and the Kyoto Protocol in 1997. The development of new alternative energy sources aims to complement existing forms and to meet the demand for energy with greater security. Brazil, guided by the goal of not polluting its energy matrix through fossil fuel exploitation, and facing a recent energy crisis caused by a lack of rain, is directing its energy policies toward the development of other renewable energy sources to complement hydropower. Brazil stands out among countries for its capacity to generate power from wind in several areas, especially Rio Grande do Norte (RN), one of the states with the highest installed capacity and great potential still to be explored. In this context arises the purpose of this work: to identify the policies that encourage the development of wind energy in Rio Grande do Norte. The study was conducted using a qualitative data-analysis methodology called content analysis, oriented towards the characteristics of the message, its informational value, and the words, arguments and ideas expressed in it, constituting a thematic analysis. To collect the data, interviews were conducted with managers of major organizations related to wind energy in Brazil and in the state of Rio Grande do Norte. The identification of incentive policies was carried out in three stages: the first sought incentive policies at the national level, which apply to all states; the second involved administering the questionnaire; and the third involved research and data collection on the development of RN's installed capacity compared with that of other states.
In the end, the results demonstrated that in the state of Rio Grande do Norte there is no established and consolidated incentive policy for the development of wind power, only isolated actions intended to streamline the bureaucratic issues related to wind farms, especially environmental ones. The absence of such a policy hinders the development of wind energy in RN, resulting in reduced competitiveness and performance in recent energy auctions. Among the perceived obstacles are the lack of sufficient personnel to produce and analyse environmental licensing reports, the lack of updates to the state's wind atlas, and a shortfall of tax incentives. Added to these difficulties are barriers in infrastructure and logistics, including the lack of a port suitable for large loads and the need for repair, maintenance and duplication of roads and highways that remain deficient. Suggested future work includes studying the relationship between the energy technology park and the development of wind power in the state, the influence of the technology park in attracting businesses and industries of the wind sector to settle in RN, and a comparison of incentive policies for the development of wind energy across the Brazilian states, observing wind development in those same states.