849 results for BIM (Building Information Modeling)
Abstract:
Layering is a widely used method for structuring data in CAD models. During the last few years national standardisation organisations, professional associations, user groups for particular CAD systems, individual companies etc. have issued numerous standards and guidelines for the naming and structuring of layers in building design. In order to increase the integration of CAD data in the industry as a whole, ISO recently decided to define an international standard for layer usage. The resulting standard proposal, ISO 13567, is a rather complex framework standard which strives to be more of a union than the least common denominator of the capabilities of existing guidelines. A number of principles have been followed in the design of the proposal. The first is the separation of the conceptual organisation of information (semantics) from the way this information is coded (syntax). The second is orthogonality: many ways of classifying information are independent of each other and can be applied in combination. The third overriding principle is the reuse of existing national or international standards whenever appropriate. The fourth principle allows users to apply well-defined subsets of the overall superset of possible layer names. This article describes the semantic organisation of the standard proposal as well as its default syntax. Important information categories deal with the party responsible for the information, the type of building element shown, whether a layer contains the direct graphical description of a building part or additional information needed in an output drawing, etc. Non-mandatory information categories facilitate the structuring of information in rebuilding projects, the use of layers for spatial grouping in large multi-storey projects, and the storing of multiple representations intended for different drawing scales in the same model. Pilot testing of ISO 13567 is currently being carried out in a number of countries which have been involved in the definition of the standard. In the article two implementations, carried out independently in Sweden and Finland, are described. The article concludes with a discussion of the benefits and possible drawbacks of the standard. Incremental development within the industry (where "best practice" can become "common practice" via a standard such as ISO 13567) is contrasted with the more idealistic scenario of building product models. The relationship between CAD layering, document management, product modelling and building element classification is also discussed.
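As a rough illustration of the separation of semantics (the information categories) from syntax (their coding), the sketch below composes and splits layer names from fixed-position fields in the spirit of ISO 13567. The field names, widths and example codes are illustrative assumptions made for this sketch, not the normative definitions of the standard.

```python
# A minimal sketch of composing and splitting fixed-position layer names in the
# spirit of ISO 13567. The field names, widths and example codes below are
# illustrative assumptions, not the normative definitions from the standard.
from dataclasses import dataclass

@dataclass
class LayerName:
    agent: str          # party responsible for the information, e.g. "A-" for architect
    element: str        # building element classification code
    presentation: str   # graphics vs. annotation content, e.g. "E-" element graphics, "T-" text
    status: str = "-"   # optional: new / existing / to be demolished (rebuilding projects)
    sector: str = "--"  # optional: spatial grouping, e.g. storey in a multi-storey project
    scale: str = "-"    # optional: representation intended for a particular drawing scale

    def compose(self) -> str:
        """Concatenate the fields into a single fixed-position layer name."""
        return (self.agent.ljust(2, "-") + self.element.ljust(6, "-")
                + self.presentation.ljust(2, "-") + self.status
                + self.sector + self.scale)

def parse(name: str) -> LayerName:
    """Split a fixed-position layer name back into its fields."""
    return LayerName(agent=name[0:2], element=name[2:8], presentation=name[8:10],
                     status=name[10], sector=name[11:13], scale=name[13])

# Example: an architect's element-graphics layer for a new part on storey 02.
layer = LayerName(agent="A-", element="271---", presentation="E-", status="N", sector="02")
print(layer.compose())            # "A-271---E-N02-"
print(parse(layer.compose()))
```

The point of the sketch is that the field set carries the semantics, while the widths and codes are only one possible syntax, which is what lets users adopt well-defined subsets of the full naming scheme.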
Abstract:
One of the central issues in making efficient use of IT in the design, construction and maintenance of buildings is the sharing of digital building data across disciplines and lifecycle stages. One technology which enables data sharing is CAD layering, which, to be of real use, requires the definition of standards. This paper focuses on the background, objectives and effectiveness of the international standard ISO 13567, Organisation and naming of layers for CAD. The focus is on the efficiency and effectiveness of the standardisation and implementation process, rather than on the technical details. The study was qualitative: a number of experts responded to a semi-structured mail questionnaire, supplemented by personal interviews. The main results were that CAD layer standards based on the ISO standard have been implemented, particularly in northern European countries, but are not very widely used. A major problem identified was the lack of resources for marketing and implementing the standard as national variants once it had been formally accepted.
Abstract:
A model of the information and material activities that comprise the overall construction process is presented, using the SADT activity modelling methodology. The basic model is further refined into a number of generic information-handling activities such as creation of new information, information search and retrieval, information distribution and person-to-person communication. The viewpoint could be described as information logistics. This model is then combined with a more traditional building process model consisting of phases such as design and construction. The resulting two-dimensional matrix can be used for positioning different types of generic IT tools or construction-specific applications. The model can thus provide a starting point for a discussion of the application of information and communication technology in construction and for measurements of the impacts of IT on the overall process and its related costs.
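The two-dimensional matrix can be pictured as a simple lookup from (information-handling activity, process phase) cells to the IT tools positioned there. The sketch below uses assumed activity and phase labels and example tool placements purely for illustration.

```python
# A minimal sketch of the positioning matrix described above: generic
# information-handling activities crossed with building process phases.
# The activity/phase labels and tool placements are illustrative assumptions.
activities = ["create", "search/retrieve", "distribute", "communicate"]
phases = ["briefing", "design", "construction", "maintenance"]

# Position example IT tools in (activity, phase) cells of the matrix.
positions = {
    ("create", "design"): ["CAD modelling"],
    ("search/retrieve", "construction"): ["project document server"],
    ("communicate", "construction"): ["e-mail", "mobile telephony"],
}

for activity in activities:
    row = [", ".join(positions.get((activity, phase), [])) or "-" for phase in phases]
    print(f"{activity:16s} | " + " | ".join(f"{cell:24s}" for cell in row))
```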
Abstract:
A functioning stock market is an essential component of a competitive economy, since it provides a mechanism for allocating the economy's capital stock. In an ideal situation, the stock market will steer capital in a manner that maximizes the total utility of the economy. As the prices of traded stocks depend on and vary with the information available to investors, it is apparent that information plays a crucial role in a functioning stock market. However, even though information indisputably matters, several questions regarding how stock markets process and react to new information remain unanswered. The purpose of this thesis is to explore the link between new information and stock market reactions. The first essay utilizes new methodological tools to investigate the average reaction of investors to new financial statement information. The second essay explores the behavior of different types of investors when new financial statement information is disclosed to the market. The third essay looks into the interrelation between investor size, behavior and overconfidence. The fourth essay approaches the puzzle of negative skewness in stock returns from an altogether different angle than previous studies. The first essay presents evidence that the second derivatives of some financial statement signals contain more information than the first derivatives. Further, empirical evidence also indicates that some of the investigated signals proxy for risk while others contain information that is priced with a delay. The second essay documents that different categories of investors demonstrate systematic differences in their behavior when new financial statement information arrives on the market. In addition, a theoretical model building on differences in investor overconfidence is put forward to explain the observed behavior. The third essay shows that investor size describes investor behavior very well. This finding is predicted by the model proposed in the second essay, and hence strengthens the model. The behavioral differences between investors of different sizes furthermore have significant economic implications. Finally, the fourth essay finds strong evidence that management news disclosure practices cause negative skewness in stock returns.
Abstract:
Modeling and forecasting of implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging, and risk management activities, all of which require an accurate volatility estimate. However, this has become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options present two patterns: the volatility smirk (skew) and the volatility term structure, which, when examined together, form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of the options markets. The first essay models the dynamics of the IVS, extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put and call options on the FTSE100 index, nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variations in the rich IVS. Next, it is found that three factors can explain about 69-88% of the variance in the IVS; of this, on average, 56% is explained by the level factor, 15% by the term-structure factor, and an additional 7% by the jump-fear factor. The second essay proposes a quantile regression model for the contemporaneous asymmetric return-volatility relationship, generalizing the Hibbert et al. (2008) model. The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distributions; it increases monotonically when moving from the median quantile to the uppermost quantile (i.e., 95%), so OLS underestimates this relationship at upper quantiles. Additionally, the asymmetric relationship is more pronounced with the smirk (skew) adjusted volatility index measure than with the old volatility index measure. The volatility indices are ranked in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX, and VXN. The third essay examines the information content of the new VDAX volatility index for forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are then backtested over 1992-2009 using unconditional coverage, independence, conditional coverage, and quadratic-score tests. It is found that the VDAX subsumes almost all the information required for the volatility of daily VaR forecasts for a portfolio of the DAX30 index; implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD, and GBP swaption IVs. There are significant linkages across factors, and bi-directional causality is at work between the factors implied by EUR and USD swaption IVs. Furthermore, the factors implied by EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, the string market model calibration results show that it can efficiently reproduce (or forecast) the volatility surface for each of the swaption markets.
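As an illustration of the kind of factor decomposition reported for the IVS, the sketch below runs a principal component analysis on daily changes of a simulated volatility surface, where the leading components play the role of level and term-structure factors. The simulated surface and parameter values are assumptions for the sketch; this is not the FTSE100 estimation carried out in the thesis.

```python
# A sketch of factor decomposition of an implied volatility surface: principal
# components of daily surface changes, with the first components interpreted as
# level and term-structure factors. The surface is simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_days, n_points = 500, 12                      # 12 (moneyness, maturity) grid points
level = rng.normal(0, 0.02, n_days)             # common level shocks
slope = rng.normal(0, 0.01, n_days)             # term-structure shocks
maturities = np.tile(np.linspace(0.1, 1.0, 4), 3)   # maturity of each grid point
noise = rng.normal(0, 0.003, (n_days, n_points))

# Daily changes of the surface: level factor + term-structure factor + noise.
d_iv = level[:, None] + slope[:, None] * (maturities - maturities.mean()) + noise

# PCA via the eigendecomposition of the covariance matrix of surface changes.
cov = np.cov(d_iv, rowvar=False)
eigvals, _ = np.linalg.eigh(cov)
explained = eigvals[::-1] / eigvals.sum()
print("variance explained by the first three factors:", explained[:3].round(3))
```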
Abstract:
In this thesis we deal with the concept of risk. The objective is to bring together and draw conclusions from some normative information regarding quantitative portfolio management and risk assessment. The first essay concentrates on return dependency. We propose an algorithm for classifying markets into rising and falling. Given the algorithm, we derive a statistic, the Trend Switch Probability, for detecting long-term return dependency in the first moment. The empirical results suggest that the Trend Switch Probability is robust over various volatility specifications. The serial dependency behaves differently in bull and bear markets, however: it is strongly positive in rising markets, whereas in bear markets it is closer to a random walk. Realized volatility, a technique for estimating volatility from high-frequency data, is investigated in essays two and three. In the second essay we find, when measuring realized variance on a set of German stocks, that the second-moment dependency structure is highly unstable and changes randomly. Results also suggest that volatility is non-stationary from time to time. In the third essay we examine the impact of market microstructure on the error between estimated realized volatility and the volatility of the underlying process. With simulation-based techniques we show that autocorrelation in returns leads to biased variance estimates and that lower sampling frequency and non-constant volatility increase the error variation between the estimated variance and the variance of the underlying process. From these essays we can conclude that volatility is not easily estimated, even from high-frequency data: it is neither very well behaved in terms of stability nor in terms of dependency over time. Based on these observations, we would recommend the use of simple, transparent methods that are likely to be more robust over differing volatility regimes than models with a complex parameter universe. In analyzing long-term return dependency in the first moment we find that the Trend Switch Probability is a robust estimator. This is an interesting area for further research, with important implications for active asset allocation.
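A minimal sketch of the realized-volatility issue examined in the third essay: realized variance is the sum of squared intraday returns, and market microstructure noise, which induces autocorrelation in observed returns, biases it upward relative to the variance of the underlying process. The sampling frequency and noise level below are illustrative assumptions, not the settings used in the essays.

```python
# Realized variance from simulated high-frequency returns, with and without
# microstructure noise on the observed prices. Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_days, n_intraday = 250, 390       # e.g. one-minute returns over one trading day
true_daily_var = 0.02**2            # daily variance of the underlying (efficient) price
sigma = np.sqrt(true_daily_var / n_intraday)

def realized_variance(returns: np.ndarray) -> float:
    """Realized variance: sum of squared high-frequency returns within a day."""
    return float(np.sum(returns**2))

for noise_sd in (0.0, 0.0005):      # microstructure noise added to observed log prices
    rv = []
    for _ in range(n_days):
        efficient = np.cumsum(rng.normal(0, sigma, n_intraday))
        observed = efficient + rng.normal(0, noise_sd, n_intraday)
        rv.append(realized_variance(np.diff(observed, prepend=0.0)))
    print(f"noise sd={noise_sd}: mean RV = {np.mean(rv):.6f}"
          f"  (true daily variance = {true_daily_var:.6f})")
```

Differencing the noisy prices produces autocorrelated observed returns, which is why the second run overstates the true daily variance.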
Abstract:
The research question of this thesis was how knowledge can be managed with information systems. Information systems can support but not replace knowledge management. Systems can mainly store epistemic organisational knowledge included in content, and process data and information. Certain value can be achieved by adding communication technology to the systems. Not all communication, however, can be managed. A new layer between communication and manageable information was named knowformation. The knowledge management literature was surveyed, together with information species from philosophy, physics, communication theory, and information systems science. Positivism, post-positivism, and critical theory were studied, but knowformation in extended organisational memory seemed to be socially constructed. A memory management model of an extended enterprise (M3.exe) and the knowformation concept were the findings from iterative case studies covering data, information and knowledge management systems. The cases varied from groups towards the extended organisation. Systems were investigated, and administrators, users (knowledge workers) and managers were interviewed. The model building required alternative sets of data, information and knowledge, instead of the traditional pyramid. The explicit-tacit dichotomy was also reconsidered. As human knowledge is the final aim of all data and information in the systems, the distinction between management of information and management of people was harmonised. Information systems were classified as the core of organisational memory. The content of the systems is in practice between communication and presentation. Firstly, the epistemic criterion of knowledge is required neither in the knowledge management literature nor of the content of the systems. Secondly, systems deal mostly with containers, and the knowledge management literature with applied knowledge. The construction of reality based on system content and communication also supports the knowformation concept. Knowformation belongs to the memory management model of an extended enterprise (M3.exe), which is divided into horizontal and vertical key dimensions. Vertically, processes deal with content that can be managed, whereas communication can be supported, mainly by infrastructure. Horizontally, the right-hand side of the model contains systems, and the left-hand side content, which should be independent of each other. A strategy based on the model was defined.
Abstract:
The dissertation examines the foreign policies of the United States through the prism of science and technology. In the focal point of scrutiny is the policy establishing the International Institute for Applied Systems Analysis (IIASA) and the development of the multilateral part of bridge building in American foreign policy during the 1960s and early 1970s. After a long and arduous negotiation process, the institute was finally established by twelve national member organizations from the following countries: Bulgaria, Canada, Czechoslovakia, Federal Republic of Germany (FRG), France, German Democratic Republic (GDR), Great Britain, Italy, Japan, Poland, the Soviet Union and the United States; a few years later Sweden, Finland and the Netherlands also joined. The goal of the institute was said to be to bring together researchers from East and West to solve pertinent problems caused by the modernization process experienced in the industrialized world. It originated from President Lyndon B. Johnson's bridge-building policies, launched in 1964, and was set in a well-contested and crowded domain of other international organizations of environmental and social planning. Since the distinct need for yet another organization was not evident, the process of negotiations in this multinational environment sheds light on the foreign policy ambitions of the United States on the road to the Cold War détente. The study places this project within its political era and juxtaposes it with other international organizations, especially the OECD, the ECE and NATO. Conventionally, Lyndon Johnson's bridge-building policies have been seen as a means for the United States to normalize its international relations bilaterally with different East European countries, and the multilateral dimension of the policy has been ignored. This is why IIASA's establishment process in this multilateral environment brings forth new information on US foreign policy goals, the means to achieve these goals, and US relations to other advanced industrialized societies before the time of détente, during the 1960s and early 1970s. Furthermore, the substance of the institute, applied systems analysis, illuminates the differences between European and American methodological thinking in social planning. Systems analysis is closely associated with (American) science and technology policies of the 1960s, especially in its military-administrative applications, so analysis within the foreign policy environment of the United States proved particularly fruitful. In the 1960s the institutional structures of the European continent were faltering, and the growing tendencies of integration were in flux. One example of this was the long, drawn-out process of British membership in the EEC; another was de Gaulle's withdrawal from NATO's military-political cooperation. On the other hand, economic cooperation in Europe between East and West, and especially with the Soviet Union, was expanding rapidly. This American initiative to form a new institutional actor has to be seen in that structural context, showing that bridge building was needed not only to the East but also to the West. The narrative amounts to an analysis of how the United States managed both cooperation and conflict in its hegemonic aspirations in the emerging modern world, and how it used its special relationship with the United Kingdom to achieve its goals. The research is based on the archives of the United States, Great Britain, Sweden, Finland, and IIASA. The primary sources have been complemented with both contemporary and present-day research literature, periodicals, and interviews.
Abstract:
Representing and quantifying uncertainty in climate change impact studies is a difficult task. Several sources of uncertainty arise in studies of the hydrologic impacts of climate change, such as those due to the choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows the allocation of a probability mass to sets or intervals, and can hence handle both aleatory (stochastic) uncertainty and epistemic (subjective) uncertainty. This paper shows how the D-S theory can be used to represent beliefs in hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measure of belief and plausibility in the results. The D-S approach has been used in this work for information synthesis using various evidence combination rules with different approaches to modeling conflict. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of the n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, and are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents the uncertainty associated with each of the SSFI-4 classifications. These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in the projected frequencies of the SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and the relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and a decreasing probability of normal and wet conditions in Orissa as a result of climate change.
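The sketch below shows Dempster's rule of combination and the resulting belief and plausibility measures for a small frame of hydrologic hypotheses (drought, normal, wet). The focal sets and mass values are illustrative assumptions, not the basic probability assignments derived from the Mahanadi case study.

```python
# Dempster's rule of combination for two basic probability assignments (bpa's)
# over hydrologic hypotheses. Focal sets and masses are illustrative only.
from itertools import product

FRAME = frozenset({"drought", "normal", "wet"})

def combine(m1: dict, m2: dict) -> dict:
    """Dempster's rule: normalized conjunctive combination of two bpa's."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                 # mass falling on the empty set
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

def belief(m: dict, hypothesis: frozenset) -> float:
    """Belief: total mass of focal sets contained in the hypothesis."""
    return sum(w for s, w in m.items() if s <= hypothesis)

def plausibility(m: dict, hypothesis: frozenset) -> float:
    """Plausibility: total mass of focal sets intersecting the hypothesis."""
    return sum(w for s, w in m.items() if s & hypothesis)

# bpa's from two hypothetical evidence sources (e.g. two GCM/scenario ensembles).
m_gcm1 = {frozenset({"drought"}): 0.6, frozenset({"drought", "normal"}): 0.3, FRAME: 0.1}
m_gcm2 = {frozenset({"drought"}): 0.5, frozenset({"normal", "wet"}): 0.3, FRAME: 0.2}

m = combine(m_gcm1, m_gcm2)
h = frozenset({"drought"})
print(f"Bel(drought) = {belief(m, h):.3f}, Pl(drought) = {plausibility(m, h):.3f}")
```

The gap between belief and plausibility is the quantitative expression of ignorance that the D-S framework adds over a single probability value.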
Abstract:
There are numerous formats for writing spellcheckers for open-source systems, and there are descriptions for many languages written in these formats. Similarly, for word hyphenation by computer there are TeX rules for many languages. In this paper we demonstrate a method for converting these spell-checking lexicons and hyphenation rule sets into finite-state automata, and present a new finite-state based system for writer's tools used in current open-source software such as Firefox, OpenOffice.org and enchant, via the spell-checking library voikko.
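A minimal sketch of the lexicon-as-automaton idea, assuming nothing about the HFST or voikko internals: a word list is compiled into a deterministic acceptor (here a plain trie, the simplest finite-state automaton for a finite lexicon) and unknown words are flagged.

```python
# Compile a word list into a deterministic acceptor and use it to flag unknown
# words. This is only an illustration of lexicon-as-automaton, not the actual
# finite-state machinery used by the voikko library.

def build_acceptor(words):
    """Build a trie; each node is a dict of transitions plus a final-state flag."""
    root = {}
    for word in words:
        node = root
        for ch in word:
            node = node.setdefault(ch, {})
        node["<final>"] = True
    return root

def accepts(automaton, word):
    """Run the automaton over the word; accept only if we end in a final state."""
    node = automaton
    for ch in word:
        if ch not in node:
            return False
        node = node[ch]
    return node.get("<final>", False)

lexicon = build_acceptor(["kissa", "kissan", "koira", "koiran"])
for token in ["kissa", "koirat"]:
    print(token, "OK" if accepts(lexicon, token) else "misspelled?")
```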
Abstract:
A number of neural network models, in which fixed-point and limit-cycle attractors of the underlying dynamics are used to store and associatively recall information, are described. In the first class of models, a hierarchical structure is used to store an exponentially large number of strongly correlated memories. The second class of models uses limit cycles to store and retrieve individual memories. A neurobiologically plausible network that generates low-amplitude periodic variations of activity, similar to the oscillations observed in electroencephalographic recordings, is also described. Results obtained from analytic and numerical studies of the properties of these networks are discussed.
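For the fixed-point case, a generic Hopfield-style network with Hebbian storage and asynchronous updates illustrates how memories stored as attractors are recalled from corrupted cues. This is a textbook sketch, not the hierarchical or limit-cycle models described above.

```python
# Fixed-point attractor recall in a Hopfield-style network with Hebbian weights
# and asynchronous updates. A generic illustration of attractor memory only.
import numpy as np

rng = np.random.default_rng(2)
N = 100                                    # number of +/-1 neurons
patterns = rng.choice([-1, 1], size=(3, N))

# Hebbian weights; zero the diagonal so no neuron drives itself.
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(state: np.ndarray, sweeps: int = 10) -> np.ndarray:
    """Asynchronously update neurons until the state settles into an attractor."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern and let the dynamics pull it back to the fixed point.
noisy = patterns[0] * np.where(rng.random(N) < 0.2, -1, 1)   # flip ~20% of bits
recovered = recall(noisy)
print("overlap with stored memory:", float(recovered @ patterns[0]) / N)
```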
Abstract:
Bacteriorhodopsin has been the subject of intense study aimed at understanding its photochemical function. The recent atomic model proposed by Henderson and coworkers, based on electron cryo-microscopic studies, has helped in understanding many of the structural and functional aspects of bacteriorhodopsin. However, the accuracy of the positions of the side chains is not very high, since the model is based on low-resolution data. In this study, we have minimized the energy of this structure of bacteriorhodopsin and analyzed various types of interactions, such as intrahelical and interhelical hydrogen bonds and the retinal environment. In order to understand the photochemical action, it is necessary to obtain information on the structures adopted at the intermediate states. In this direction, we have generated some intermediate structures by computer modeling, taking certain experimental data into account. Various isomers of retinal with 13-cis and/or 15-cis conformations and all possible staggered orientations of the Lys-216 side chain were generated. The resultant structures were examined for the distance between the Lys-216 Schiff base nitrogen and the carboxylate oxygen atoms of Asp-96, a residue which is known to reprotonate the Schiff base at later stages of the photocycle. Some of the structures were selected on the basis of suitable retinal orientation, and the stability of these structures was tested by energy minimization studies. Further, the minimized structures were analyzed for hydrogen bond interactions and the retinal environment, and the results are compared with those of the minimized rest-state structure. The importance of functional groups in stabilizing the structure of bacteriorhodopsin and in participating dynamically during the photocycle is discussed.
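The geometric selection criterion described above, the distance between the Lys-216 Schiff base nitrogen and the Asp-96 carboxylate oxygens, can be measured from a coordinate file as in the sketch below. The file name and chain identifier are hypothetical; Biopython's Bio.PDB is used for parsing and distance calculation.

```python
# Measure the distance between the Lys-216 Schiff-base nitrogen (NZ) and the
# Asp-96 carboxylate oxygens with Biopython. The file name and chain identifier
# are assumptions; any bacteriorhodopsin coordinate file with these residues
# and standard atom names would do.
from Bio.PDB import PDBParser

parser = PDBParser(QUIET=True)
structure = parser.get_structure("bR", "bacteriorhodopsin_model.pdb")  # hypothetical file
chain = structure[0]["A"]                    # first model, chain A (assumed)

nz = chain[216]["NZ"]                        # Lys-216 side-chain nitrogen (Schiff base)
for oxygen_name in ("OD1", "OD2"):           # Asp-96 carboxylate oxygens
    distance = nz - chain[96][oxygen_name]   # Bio.PDB atoms subtract to a distance in Å
    print(f"Lys216 NZ .. Asp96 {oxygen_name}: {distance:.2f} Å")
```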
Abstract:
Background: Sensitive remote homology detection and accurate alignments, especially in the midnight zone of sequence similarity, are needed for better function annotation and structural modeling of proteins. An algorithm, AlignHUSH, for HMM-HMM alignment has been developed which is capable of recognizing distantly related domain families. The method uses structural information, in the form of predicted secondary structure probabilities, and the hydrophobicity of amino acids to align HMMs of two sets of aligned sequences. The effect of using adjoining column(s) information has also been investigated and is found to increase the sensitivity of HMM-HMM alignments and remote homology detection. Results: We have assessed the performance of AlignHUSH using known evolutionary relationships available in SCOP. AlignHUSH performs better than the best HMM-HMM alignment methods and is observed to be even more sensitive at higher error rates. The accuracy of the alignments obtained using AlignHUSH has been assessed using the structure-based alignments available in BaliBASE. The alignment length and the alignment quality are found to be appropriate for homology modeling and function annotation. The alignment accuracy is found to be comparable to that of existing methods for profile-profile alignments. Conclusions: A new method to align HMMs has been developed and is shown to have better sensitivity at error rates of 10% and above when compared to other available programs. The proposed method could effectively aid in obtaining clues to the functions of proteins of yet unknown function. A web server incorporating the AlignHUSH method is available at http://crick.mbu.iisc.ernet.in/~alignhush/
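As a toy illustration of the scoring ingredients named above (amino-acid emission probabilities, predicted secondary structure probabilities and hydrophobicity), the sketch below scores one pair of profile match columns. The functional form and weights are assumptions made for the illustration and are not the actual AlignHUSH scoring function.

```python
# Score two profile-HMM match columns by combining (i) emission-probability
# similarity, (ii) predicted secondary-structure agreement and (iii) a
# hydrophobicity mismatch penalty. Weights and score form are illustrative only.
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
# Kyte-Doolittle hydropathy values, ordered as in AMINO_ACIDS.
KD = np.array([1.8, 2.5, -3.5, -3.5, 2.8, -0.4, -3.2, 4.5, -3.9, 3.8,
               1.9, -3.5, -1.6, -3.5, -4.5, -0.8, -0.7, 4.2, -0.9, -1.3])

def column_score(p1, p2, ss1, ss2, w_ss=0.5, w_phob=0.2):
    """Emission dot-product log-odds + SS agreement + hydropathy mismatch penalty."""
    p1, p2, ss1, ss2 = map(np.asarray, (p1, p2, ss1, ss2))
    emission = np.log((p1 * p2).sum() / 0.05)           # log-odds vs. uniform background
    ss_agree = float(ss1 @ ss2)                         # helix/strand/coil probabilities
    phob = -abs(float(p1 @ KD) - float(p2 @ KD)) / 9.0  # penalize hydropathy mismatch
    return emission + w_ss * ss_agree + w_phob * phob

# Two hypothetical columns, both leucine-rich and predicted helical: a high score.
col = np.full(20, 0.01)
col[AMINO_ACIDS.index("L")] = 0.81
print(round(column_score(col, col, [0.9, 0.05, 0.05], [0.8, 0.1, 0.1]), 3))
```

In a full aligner, such column scores would feed a dynamic-programming alignment of the two HMMs.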
Abstract:
Rapid urbanisation in India has posed serious challenges to decision makers in regional planning, involving a plethora of issues including the provision of basic amenities (such as electricity, water, sanitation and transport). Urban planning entails an understanding of landscape and urban dynamics together with their causal factors. Identifying, delineating and mapping landscapes on a temporal scale provide an opportunity to monitor changes, which is important for natural resource management and sustainable planning activities. Multi-source, multi-sensor, multi-temporal, multi-frequency or multi-polarization remote sensing data, with efficient classification algorithms and pattern recognition techniques, aid in capturing these dynamics. This paper analyses the landscape dynamics of Greater Bangalore by: (i) direct characterisation of the impervious surface, (ii) computation of forest fragmentation indices and (iii) modeling to quantify and categorise urban changes. Linear unmixing is used to solve the mixed-pixel problem of coarse-resolution super-spectral MODIS data for impervious surface characterisation. Fragmentation indices were used to classify forests as interior, perforated, edge, transitional, patch or undetermined. Based on this, an urban growth model was developed to determine the type of urban growth: infill, expansion or outlying growth. This helped in visualising urban growth poles and the consequences of earlier policy decisions, and can help in evolving strategies for effective land use policies.
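The linear unmixing step can be sketched as a non-negative least squares fit of a mixed pixel's spectrum to endmember spectra, followed by normalization of the abundance fractions. The endmember spectra and band values below are synthetic stand-ins, not the super-spectral MODIS bands used in the paper.

```python
# Linear spectral unmixing of one mixed pixel into endmember abundance fractions
# (e.g. impervious surface, vegetation, soil) via non-negative least squares.
# The spectra below are synthetic stand-ins for illustration only.
import numpy as np
from scipy.optimize import nnls

# Columns: endmember reflectance spectra over a few bands (impervious, vegetation, soil).
endmembers = np.array([
    [0.30, 0.05, 0.25],
    [0.28, 0.08, 0.30],
    [0.25, 0.45, 0.35],
    [0.22, 0.50, 0.40],
])

true_fractions = np.array([0.6, 0.3, 0.1])          # ground truth for the demo pixel
pixel = endmembers @ true_fractions + np.random.default_rng(3).normal(0, 0.005, 4)

fractions, residual = nnls(endmembers, pixel)       # non-negative least squares fit
fractions /= fractions.sum()                        # impose the sum-to-one constraint
print("estimated fractions (impervious, vegetation, soil):", fractions.round(3))
```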