17 results for "Model information"
in Helda - Digital Repository of the University of Helsinki
Abstract:
A model of the information and material activities that comprise the overall construction process is presented, using the SADT activity modelling methodology. The basic model is further refined into a number of generic information-handling activities, such as creation of new information, information search and retrieval, information distribution, and person-to-person communication. The viewpoint could be described as information logistics. This model is then combined with a more traditional building process model consisting of phases such as design and construction. The resulting two-dimensional matrix can be used for positioning different types of generic IT tools or construction-specific applications. The model can thus provide a starting point for a discussion of the application of information and communication technology in construction, and for measurement of the impact of IT on the overall process and its related costs.
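The two-dimensional positioning matrix described in this abstract can be sketched in a few lines. The activity and phase labels below come from the abstract itself; the tool placements are purely hypothetical examples, not taken from the paper.

```python
# Sketch of the activities-by-phases positioning matrix.
# Rows: generic information-handling activities; columns: building process phases.
activities = ["creation", "search/retrieval", "distribution", "communication"]
phases = ["design", "construction"]

# One cell per (activity, phase) pair; each cell holds the tools positioned there.
matrix = {(a, p): [] for a in activities for p in phases}

# Illustrative (assumed) placements of generic IT tools.
matrix[("creation", "design")].append("CAD system")
matrix[("search/retrieval", "design")].append("product data repository")
matrix[("communication", "construction")].append("mobile phone / email")

for (activity, phase), tools in matrix.items():
    if tools:
        print(f"{activity:18s} x {phase:12s}: {', '.join(tools)}")
```

A matrix like this makes the paper's point concrete: each IT tool occupies a specific cell, so gaps and overlaps in IT coverage of the process become visible at a glance.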
Abstract:
The Internet has made possible the cost-effective dissemination of scientific journals in the form of electronic versions, usually in parallel with the printed versions. At the same time the electronic medium also makes possible totally new open access (OA) distribution models, funded by author charges, sponsorship, advertising, voluntary work, etc., where the end product is free in full text to the readers. Although more than 2,000 new OA journals have been founded in the last 15 years, the uptake of open access has been rather slow, with currently around 5% of all peer-reviewed articles published in OA journals. The slow growth can to a large extent be explained by the fact that open access has predominantly emerged via newly founded journals and startup publishers. Established journals and publishers have not had strong enough incentives to change their business models, and the commercial risks in doing so have been high. In this paper we outline and discuss two different scenarios for how scholarly publishers could change their operating model to open access. The first is based on an instantaneous change and the second on a gradual change. We propose a way to manage the gradual change by bundling traditional “big deal” licenses and author charges for opening access to individual articles.
Abstract:
Information structure and Kabyle constructions: three sentence types in the Construction Grammar framework. The study examines three Kabyle sentence types and their variants. These sentence types were chosen because they code the same state of affairs but have different syntactic structures. The sentence types are the Dislocated sentence, the Cleft sentence, and the Canonical sentence. I argue, first, that a proper description of these sentence types should include information structure and, second, that a description which takes information structure into account is possible in the Construction Grammar framework. The study thus constitutes a testing ground for Construction Grammar and its applicability to a lesser-known language. It is a testing ground notably because the differentiation between the three types of sentences cannot be made without information structure categories and, consequently, these categories must also be integrated into the grammatical description. The information structure analysis is based on the model outlined by Knud Lambrecht. In that model, information structure is considered a component of sentence grammar that ensures pragmatically correct sentence forms. The work starts with an examination of the three sentence types and the analyses that have been carried out in André Martinet's functional grammar framework. This introduces the sentence types chosen as the object of study and discusses the difficulties related to their analysis. After a presentation of the state of the art, including earlier and more recent models, the principles and notions of Construction Grammar and of Lambrecht's model are introduced and explicated. The information structure analysis is presented in three chapters, each treating one of the three sentence types. The analyses are based on spoken language data and elicitation. Prosody is included in the study when a syntactic structure seems to code two different focus structures.
In such cases, it is pertinent to investigate whether these are coded by prosody. The final chapter presents the constructions that have been established and the problems encountered in analysing them. It also discusses the impact of the study on the theories used and on the theory of syntax in general.
Abstract:
Investors significantly overweight domestic assets in their portfolios. This behavior, commonly called "home bias", contradicts the prescriptions of portfolio theory. This thesis explores potential reasons for the home bias by examining the characteristics of the investing and target countries and the features of the interaction between them. A common theme of the four essays is a focus on the importance of information about foreign markets in explaining the share of these markets in investors' portfolios. The results indicate that the size of the equity ownership in another country is strongly related to the distance to the financial capital of that country, and to trade in goods with and direct investments (FDI) in that country. The first essay empirically investigates the relationship between trade in real goods and portfolio investments. Overall, the evidence indicates a substantial role for trade in reducing the information cost related to portfolio investments. The second essay examines the implications of the launch of the European Monetary Union (EMU) for international portfolio investments. The evidence on the allocation of Finnish international portfolio investments is more consistent with an information-based explanation than with a diversification motive. The third essay employs new data for a large number of countries and further explores the role of trade in international portfolio investments. The results indicate that trade provides important information, especially on firms in countries in which the corporate governance structure and the information environment of firms generate less reliable information. The fourth essay examines the relationship between direct investments (FDI) and portfolio investments. In contrast to the predictions of portfolio theory, it provides evidence that FDI is a complement rather than a substitute for portfolio investments.
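The distance-and-trade relationship this abstract reports can be illustrated with a toy cross-country regression. Everything below is synthetic: the data are random, and the gravity-style specification (portfolio weight regressed on log distance and trade share) is an assumed illustrative form, not the one estimated in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # synthetic investing-country / target-country pairs

log_distance = rng.uniform(5, 10, n)   # log km to the target's financial capital
trade_share = rng.uniform(0.0, 0.3, n) # bilateral trade as a share of GDP

# Assumed data-generating process: portfolio weight falls with distance and
# rises with trade (the information channel), plus noise.
weight = 2.0 - 0.15 * log_distance + 3.0 * trade_share + rng.normal(0, 0.1, n)

# Ordinary least squares via the normal equations.
X = np.column_stack([np.ones(n), log_distance, trade_share])
beta, *_ = np.linalg.lstsq(X, weight, rcond=None)
print(f"intercept={beta[0]:.2f}, distance={beta[1]:.2f}, trade={beta[2]:.2f}")
```

The fitted signs (negative on distance, positive on trade) mirror the qualitative pattern the essays document; the magnitudes here are meaningless by construction.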
Abstract:
The research question of this thesis was how knowledge can be managed with information systems. Information systems can support, but not replace, knowledge management. Systems can mainly store epistemic organisational knowledge included in content, and process data and information. Certain value can be achieved by adding communication technology to the systems. Not all communication, however, can be managed. A new layer between communication and manageable information was named knowformation. The knowledge management literature was surveyed, together with species of information from philosophy, physics, communication theory, and information systems science. Positivism, post-positivism, and critical theory were studied, but knowformation in extended organisational memory seemed to be socially constructed. A memory management model of an extended enterprise (M3.exe) and the knowformation concept were the findings from iterative case studies covering data, information and knowledge management systems. The cases ranged from groups to the extended organisation. The systems were investigated, and administrators, users (knowledge workers) and managers were interviewed. The model building required alternative sets of data, information and knowledge, instead of the traditional pyramid. The explicit-tacit dichotomy was also reconsidered. As human knowledge is the final aim of all data and information in the systems, the distinction between the management of information and the management of people was harmonised. Information systems were classified as the core of organisational memory. The content of the systems is in practice between communication and presentation. Firstly, the epistemic criterion of knowledge is required neither in the knowledge management literature nor of the content of the systems. Secondly, systems deal mostly with containers, and the knowledge management literature with applied knowledge.
The construction of reality based on system content and communication also supports the knowformation concept. Knowformation belongs to the memory management model of an extended enterprise (M3.exe), which is divided into horizontal and vertical key dimensions. Vertically, processes deal with content that can be managed, whereas communication can be supported, mainly by infrastructure. Horizontally, the right-hand side of the model contains systems and the left-hand side content, which should be independent of each other. A strategy based on the model was defined.
Abstract:
Parkinson's disease (PD) is the second most common neurodegenerative disease among the elderly. Its etiology is unknown and no disease-modifying drugs are available; thus, more information concerning its pathogenesis is needed. Among other genes, mutated PTEN-induced kinase 1 (PINK1) has been linked to early-onset and sporadic PD, but its mode of action is poorly understood. Most animal models of PD are based on the use of the neurotoxin 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP). MPTP is metabolized to MPP+ by monoamine oxidase B (MAO B) and causes cell death of dopaminergic neurons in the substantia nigra in mammals. The zebrafish has been a widely used model organism in developmental biology, but is now emerging as a model for human diseases due to its ideal combination of properties. Zebrafish are inexpensive and easy to maintain, develop rapidly, breed in large quantities producing transparent embryos, and are readily manipulated by various methods, particularly genetic ones. In addition, zebrafish are vertebrates, and results derived from zebrafish may be more applicable to mammals than results from invertebrate genetic models such as Drosophila melanogaster and Caenorhabditis elegans. However, the similarity cannot be taken for granted. The aim of this study was to establish and test a PD model using larval zebrafish. The developing monoaminergic neuronal systems of larval zebrafish were investigated. We identified and classified 17 catecholaminergic and 9 serotonergic neuron populations in the zebrafish brain. A 3-dimensional atlas was created to facilitate future research. Only one gene encoding MAO was found in the zebrafish genome. Zebrafish MAO showed MAO A-type substrate specificity, but non-A-non-B inhibitor specificity. The distribution of MAO in larval and adult zebrafish brains was both diffuse and distinctly cellular.
Inhibition of MAO during larval development led to markedly elevated 5-hydroxytryptamine (serotonin, 5-HT) levels, which decreased the locomotion of the fish. MPTP exposure caused a transient loss of cells in specific aminergic cell populations and decreased locomotion. MPTP-induced changes could be rescued by the MAO B inhibitor deprenyl, suggesting a role for MAO in MPTP toxicity. MPP+ affected only one catecholaminergic cell population; thus, the action of MPP+ was more selective than that of MPTP. The zebrafish PINK1 gene was cloned, and morpholino oligonucleotides were used to suppress its expression in larval zebrafish. The functional domains and expression pattern of zebrafish PINK1 resembled those of other vertebrates, suggesting that zebrafish is a feasible model for studying PINK1. Translation inhibition resulted in loss of the same catecholaminergic cell populations affected by MPTP and MPP+. Inactivation of PINK1 sensitized larval zebrafish to subefficacious doses of MPTP, causing a decrease in locomotion and cell loss in one dopaminergic cell population. Zebrafish appears to be a feasible model for studying PD, since its aminergic systems, the mode of action of MPTP, and the functions of PINK1 resemble those of mammals. However, the functions of zebrafish MAO differ from those of the two forms of MAO found in mammals. Future studies using zebrafish PD models should utilize the advantages specific to zebrafish, such as the ability to execute large-scale genetic or drug screens.
Abstract:
Neuronal ceroid lipofuscinoses (NCLs) are a family of inherited pediatric neurodegenerative disorders leading to retinal degeneration, death of selective neuronal populations, and accumulation of autofluorescent ceroid lipopigments. The clinical manifestations are generally similar in all forms. The Finnish variant late infantile neuronal ceroid lipofuscinosis (vLINCLFin) is a form of NCL especially enriched in the Finnish population. The aim of this thesis was to analyse the brain pathology of vLINCLFin utilising the novel Cln5-/- mouse model. Gene expression profiling of the brains of already symptomatic Cln5-/- mice revealed that inflammation, neurodegeneration and defects in myelination are the major characteristics of the later stages of the disease. Histological characterization of the brain pathology confirmed that the thalamocortical system is affected in Cln5-/- mice, similarly to the other NCL mouse models. However, whereas the brain pathology in all other analyzed NCL mice initiates in the thalamus and spreads only months later to the cortex, we observed that the sequence of events is uniquely reversed in Cln5-/- mice, beginning in the cortex and spreading to the thalamus only months later. We could also show that even though neurodegeneration is initiated in the cortex, reactive gliosis and loss of myelin are evident in specific nuclei of the thalamus already in the 1-month-old brain. To obtain deeper insight into the disturbed metabolic pathways, we performed gene expression profiling of presymptomatic mouse brains. We validated these findings with immunohistological analyses and could show that the cytoskeleton and myelin were affected in Cln5-/- mice. Comparison of the gene expression profiling results of Cln5-/- and Cln1-/- mice further highlighted that these two NCL models share a common defective pathway, leading to disturbances in the neuronal growth cone and cytoskeleton.
Encouraged by the evidence of this defective pathway, we analyzed the molecular interactions of NCL proteins and observed that the Cln5 and Cln1/Ppt1 proteins interact with each other. Furthermore, we demonstrated that Cln5 and Cln1/Ppt1 share an interaction partner, the F1-ATP synthase, potentially linking both vLINCLFin and INCL diseases to disturbed lipid metabolism. In addition, Cln5 was shown to interact with other NCL proteins: Cln2, Cln3, Cln6 and Cln8, implicating a central role for Cln5 in NCL pathophysiology. This study is the first to describe the brain pathology and gene expression changes in the Cln5-/- mouse. Together, the findings presented in this thesis represent novel information on the disease processes and the molecular mechanisms behind vLINCLFin, and highlight that vLINCLFin is a very important model for analyzing the pathophysiology of NCL diseases.
Abstract:
There exist various proposals for building a functional and fault-tolerant large-scale quantum computer. Topological quantum computation is a more exotic proposal, which makes use of the properties of quasiparticles manifest only in certain two-dimensional systems. These so-called anyons exhibit topological degrees of freedom which, in principle, can be used to execute quantum computation with intrinsic fault tolerance. This feature is the main incentive to study topological quantum computation. The objective of this thesis is to provide an accessible introduction to the theory. The thesis considers the theory of anyons arising in two-dimensional quantum mechanical systems described by gauge theories based on so-called quantum double symmetries. The quasiparticles are shown to exhibit interactions and carry quantum numbers which are both of a topological nature. In particular, it is found that the addition of the quantum numbers is not unique: the fusion of the quasiparticles is described by a non-trivial fusion algebra. It is discussed how this property can be used to encode quantum information in a manner which is intrinsically protected from decoherence, and how one could, in principle, perform quantum computation by braiding the quasiparticles. As an example of the general discussion, the particle spectrum and the fusion algebra of an anyon model based on the gauge group S_3 are explicitly derived. The fusion algebra is found to branch into multiple proper subalgebras, and the simplest of them is chosen as a model for an illustrative demonstration. The different steps of a topological quantum computation are outlined and the computational power of the model is assessed. It turns out that the chosen model is not universal for quantum computation. However, because the objective was a demonstration of the theory with explicit calculations, the other, more complicated fusion subalgebras were not considered.
Studying their applicability for quantum computation could be a topic of further research.
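The non-unique addition of quantum numbers mentioned in this abstract is the statement that anyon fusion is in general multi-valued. In the standard notation of anyon models (this is the generic rule, not the specific S_3 algebra derived in the thesis):

```latex
% Fusing anyons a and b yields a superposition of charge sectors c:
a \times b = \sum_{c} N_{ab}^{\,c}\, c
% The non-negative integers N_{ab}^{c} are the fusion multiplicities.
% For Abelian anyons the outcome is unique (a single c with N_{ab}^{c} = 1);
% non-Abelian anyons, the ones useful for topological computation,
% admit several fusion outcomes, which is what makes the algebra non-trivial.
```

The degeneracy of fusion outcomes is precisely where quantum information is stored in a topological quantum computer, and braiding acts on this protected subspace.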
Abstract:
The information that economic agents have and regard as relevant to their decision making is often assumed to be exogenous in economics. It is assumed that the agents either possess or can observe the payoff-relevant information without having to exert any effort to acquire it. In this thesis we relax the assumption of an ex-ante fixed information structure and study what happens to equilibrium behavior when the agents must also decide what information to acquire and when to acquire it. The thesis addresses this question in two essays on herding and two essays on auction theory. In the first two essays, which are joint work with Klaus Kultti, we study herding models where it is costly to acquire information on the actions that the preceding agents have taken. In our model the agents have to decide both the action that they take and, additionally, the information that they want to acquire by observing their predecessors. We characterize the equilibrium behavior when the decision to observe preceding agents' actions is endogenous, and show how the equilibrium outcome may differ from the standard model, where all preceding agents' actions are assumed to be observable. In the latter part of the thesis we study two dynamic auctions: the English and the Dutch auction. We consider a situation where bidders are uninformed about their valuations for the object that is put up for sale and may acquire this information for a small cost at any point during the auction. We study the case of independent private valuations. In the third essay of the thesis we characterize the equilibrium behavior in an English auction when there are informed and uninformed bidders. We show that the informed bidder may jump bid and signal to the uninformed that he has a high valuation, thus deterring the uninformed from acquiring information and staying in the auction.
The uninformed bidder optimally acquires information once the price has passed a particular threshold and the informed bidder has not signalled that his valuation is high. In addition, we provide an example of an information structure where the informed bidder initially waits and then makes multiple jumps. In the fourth essay of the thesis we study the Dutch auction. We consider two cases where all bidders are initially uninformed. In the first case the information acquisition cost is the same across all bidders; in the second, the cost of information acquisition is also independently distributed and private information to the bidders. We characterize a mixed-strategy equilibrium in the first case and a pure-strategy equilibrium in the second. In addition, we provide a conjecture of an equilibrium in an asymmetric situation where there is one informed and one uninformed bidder. We compare the revenues generated by the first-price sealed-bid auction and the Dutch auction, and find that under some circumstances the Dutch auction outperforms the first-price sealed-bid auction. The usual first-price sealed-bid auction and the Dutch auction are strategically equivalent; however, this equivalence breaks down when information is acquired during the auction.
Abstract:
We present a search for standard model Higgs boson production in association with a W boson in proton-antiproton collisions at a center-of-mass energy of 1.96 TeV. The search employs data collected with the CDF II detector that correspond to an integrated luminosity of approximately 1.9 inverse fb. We select events consistent with a signature of a single charged lepton, missing transverse energy, and two jets. Jets corresponding to bottom quarks are identified with a secondary vertex tagging method, a jet probability tagging method, and a neural network filter. We use kinematic information in an artificial neural network to improve discrimination between signal and background compared to previous analyses. The observed number of events and the neural network output distributions are consistent with the standard model background expectations, and we set 95% confidence level upper limits on the production cross section times branching fraction ranging from 1.2 to 1.1 pb, or 7.5 to 102 times the standard model expectation, for Higgs boson masses from 110 to 150 GeV/c^2, respectively.
Abstract:
The ProFacil model is a generic process model defined as a framework model showing the links between the facilities management process and the building end user's business process. The purpose of using the model is to support more detailed process modelling. The model has been developed using the IDEF0 modelling method. The ProFacil model describes business activities from a generalized point of view, as management, support, and core processes and their relations. The model defines basic activities in the provision of a facility. Examples of these activities are "operate facilities", "provide new facilities", "provide re-build facilities", "provide maintained facilities" and "perform dispose of facilities". These are all generic activities providing a basis for further specialisation of company-specific FM activities and their tasks. A facilitator can establish a specialized process model by using the ProFacil model and interacting with company experts to describe their company's specific processes. These modelling seminars or interviews are conducted informally, supported by the high-level process model as a common reference.
Abstract:
The goal of the single building information model has existed for at least thirty years, and various standards have been published leading up to the ten-year development of the Industry Foundation Classes (IFCs). These have been initiatives from researchers, software developers and standards committees. Now large property owners are becoming aware of the benefits of moving IT tools from specific applications towards more comprehensive solutions. This study addresses the state of Building Information Models (BIMs) and the conditions necessary for them to become more widely used. It is a qualitative study based on information from a number of international experts, who were asked a series of questions about the feasibility of BIMs, the conditions necessary for their success, and the role of standards, with particular reference to the IFCs. Some key statements were distilled from the diverse answers received. They indicate that BIM solutions appear too complex for many and may need to be applied in limited areas initially. Standards are generally supported but not applied rigorously, and a range of them are relevant to BIM. Benefits will depend upon the building procurement methods used, and there should be special roles within the project team to manage information. Case studies are starting to appear, and these could be used for publicity. The IFCs are rather oversold, and their complexities should be hidden within simple-to-use software. Inevitably, major questions remain, and property owners may be the key to answering some of these. A framework for presenting standards, backed up by case studies of successful projects, is the solution proposed to provide better information on where particular BIM standards and solutions should be applied in building projects.
Abstract:
The Internet has made possible the cost-effective dissemination of scientific journals in the form of electronic versions, usually in parallel with the printed versions. At the same time the electronic medium also makes possible totally new open access (OA) distribution models, funded by author charges, sponsorship, advertising, voluntary work, etc., where the end product is free in full text to the readers. Although more than 2,000 new OA journals have been founded in the last 15 years, the uptake of open access has been rather slow, with currently around 5% of all peer-reviewed articles published in OA journals. The slow growth can to a large extent be explained by the fact that open access has predominantly emerged via newly founded journals and startup publishers. Established journals and publishers have not had strong enough incentives to change their business models, and the commercial risks in doing so have been high. In this paper we outline and discuss two different scenarios for how scholarly publishers could change their operating model to open access. The first is based on an instantaneous change and the second on a gradual change. We propose a way to manage the gradual change by bundling traditional “big deal” licenses and author charges for opening access to individual articles.
Abstract:
This article discusses the scope of research on the application of information technology in construction (ITC). A model of the information and material activities which together constitute the construction process is presented, using the IDEF0 activity modelling methodology. Information technology is defined to include all kinds of technology used for the storage, transfer and manipulation of information, thus also including devices such as copying machines, faxes and mobile phones. Using the model, the domain of ITC research is defined as the use of information technology to facilitate and re-engineer the information process component of construction. Developments in IT use in construction during the last decades are discussed against the background of a simplified model of generic information processing tasks. The scope of ITC is compared with the scopes of research in related areas such as design methodology, construction management and facilities management. Health care is proposed as an interesting alternative to the often-used car manufacturing industry as an IT application domain for comparison. Some key areas of ITC research in recent years, namely expert systems, company IT strategies, and product modelling, are briefly discussed. The article finishes with a short discussion of the problems of applying standard scientific methodology in ITC research, in particular in product model research.
Abstract:
A functioning stock market is an essential component of a competitive economy, since it provides a mechanism for allocating the economy's capital stock. In an ideal situation, the stock market will steer capital in a manner that maximizes the total utility of the economy. As prices of traded stocks depend on and vary with the information available to investors, it is apparent that information plays a crucial role in a functioning stock market. However, even though information indisputably matters, several questions about how stock markets process and react to new information remain unanswered. The purpose of this thesis is to explore the link between new information and stock market reactions. The first essay utilizes new methodological tools to investigate the average reaction of investors to new financial statement information. The second essay explores the behavior of different types of investors when new financial statement information is disclosed to the market. The third essay looks into the interrelation between investor size, behavior and overconfidence. The fourth essay approaches the puzzle of negative skewness in stock returns from an altogether different angle than previous studies. The first essay presents evidence that the second derivatives of some financial statement signals contain more information than the first derivatives. Further, the empirical evidence also indicates that some of the investigated signals proxy for risk while others contain information that is priced with a delay. The second essay documents that different categories of investors demonstrate systematic differences in their behavior when new financial statement information arrives on the market. In addition, a theoretical model building on differences in investor overconfidence is put forward to explain the observed behavior. The third essay shows that investor size describes investor behavior very well.
This finding is predicted by the model proposed in the second essay, and hence strengthens the model. The behavioral differences between investors of different size furthermore have significant economic implications. Finally, the fourth essay finds strong evidence of management news disclosure practices causing negative skewness in stock returns.
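The distinction the first essay draws between first and second derivatives of a financial statement signal can be made concrete with a short sketch. The earnings series below is made up, and "derivative" is taken here to mean a simple period-over-period difference, which is an assumption about the construction rather than the essay's actual definition.

```python
# Illustrative only: first and second differences of an annual earnings signal.
earnings = [100, 110, 124, 130, 131]  # hypothetical annual figures

# First difference: year-over-year earnings growth.
first_diff = [b - a for a, b in zip(earnings, earnings[1:])]
# Second difference: the change in growth (acceleration or deceleration).
second_diff = [b - a for a, b in zip(first_diff, first_diff[1:])]

print("first: ", first_diff)
print("second:", second_diff)
```

Here earnings rise every year (all first differences positive), yet growth is decelerating sharply from 2 years onward (negative second differences), which is the kind of extra signal a second derivative can carry beyond the first.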