933 results for Professional of information


Relevance:

100.00%

Publisher:

Abstract:

As an emerging research method that has shown promising potential in several research disciplines, simulation has received relatively little attention in information systems research. This paper presents a framework for employing simulation to study IT value cocreation. Although previous studies have identified factors driving IT value cocreation, the underlying process remains unclear. Simulation can address this limitation by exploring that underlying process through computational experiments. The simulation framework in this paper is based on an extended NK model, with agent-based modeling employed as the theoretical basis for the NK model extensions.
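As a rough illustration of the kind of computational experiment this framework builds on, here is a minimal NK-landscape sketch in Python. It is not taken from the paper; the parameter choices and the simple adaptive-walk agent are illustrative assumptions only.

```python
import random

def nk_landscape(N, K, seed=0):
    """Build a random NK fitness landscape: each of the N loci's fitness
    contribution depends on its own state and the states of K neighbours."""
    rng = random.Random(seed)
    table = {}  # (locus, relevant bit pattern) -> random contribution

    def fitness(genome):
        total = 0.0
        for i in range(N):
            # locus i depends on itself and the next K loci (circular)
            key = (i, tuple(genome[(i + j) % N] for j in range(K + 1)))
            if key not in table:
                table[key] = rng.random()
            total += table[key]
        return total / N  # mean contribution, always in [0, 1]

    return fitness

def adaptive_walk(N=10, K=2, steps=200, seed=0):
    """Simple agent: flip one random bit, keep the change if fitness
    does not decrease, otherwise revert."""
    rng = random.Random(seed)
    f = nk_landscape(N, K, seed)
    genome = [rng.randint(0, 1) for _ in range(N)]
    best = f(genome)
    for _ in range(steps):
        i = rng.randrange(N)
        genome[i] ^= 1
        cand = f(genome)
        if cand >= best:
            best = cand
        else:
            genome[i] ^= 1  # revert the flip
    return best

print(adaptive_walk())
```

Raising K increases epistatic interaction and makes the landscape more rugged, which is the usual knob such simulation studies vary.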

Relevance:

100.00%

Publisher:

Abstract:

A model of the information and material activities that make up the overall construction process is presented, using the SADT activity modelling methodology. The basic model is further refined into a number of generic information-handling activities, such as creation of new information, information search and retrieval, information distribution, and person-to-person communication. The viewpoint could be described as information logistics. This model is then combined with a more traditional building process model, consisting of phases such as design and construction. The resulting two-dimensional matrix can be used for positioning different types of generic IT tools or construction-specific applications. The model can thus provide a starting point for a discussion of the application of information and communication technology in construction, and for measurements of the impacts of IT on the overall process and its related costs.

Relevance:

100.00%

Publisher:

Abstract:

The World Wide Web provides the opportunity for a radically changed and much more efficient communication process for scientific results. A survey in the closely related domains of construction information technology and construction management was conducted in February 2000, aimed at measuring to what extent these opportunities are already changing scientific information exchange and how researchers feel about the changes. The paper presents the results based on 236 replies to an extensive Web-based questionnaire. 65% of the respondents stated their primary research interest as IT in A/E/C and 20% as construction management and economics. The questions dealt with how researchers find, access and read different sources; how much and what publications they read; how often and to which conferences they travel; how much they publish; and what the criteria are for where they eventually decide to publish. Some of the questions compared traditional and electronic publishing, with one final section dedicated to opinions about electronic publishing. According to the survey, researchers already obtain half of the material they read by downloading it from the Web. The most popular method for retrieving an interesting publication is downloading it for free from the author's or publisher's website. Researchers are not particularly willing to pay for electronic scientific publications. There is much support for a scenario of electronic journals available totally freely on the Web, where the costs could be covered by, for instance, professional societies or the publishing university. The shift that the Web is causing seems to be towards "just in time" reading of literature. Also, frequent users of the Web rely less on scientific publications and tend to read fewer articles. If available with little effort, papers published in traditional journals are preferred; if not, the papers should be on the Web. In these circumstances, the role of paper-based journals published by established publishers is shifting from the core "information exchange" to the building of authors' prestige. The respondents feel they should build up their reputations by publishing in journals and relevant conferences, but then make their work freely available on the Web.

Relevance:

100.00%

Publisher:

Abstract:

The research question of this thesis was how knowledge can be managed with information systems. Information systems can support but not replace knowledge management. Systems can mainly store epistemic organisational knowledge included in content, and process data and information. Certain value can be achieved by adding communication technology to systems. Not all communication, however, can be managed. A new layer between communication and manageable information was termed knowformation. The knowledge management literature was surveyed, together with conceptions of information from philosophy, physics, communication theory, and information systems science. Positivism, post-positivism, and critical theory were studied, but knowformation in extended organisational memory seemed to be socially constructed. A memory management model of an extended enterprise (M3.exe) and the knowformation concept were the findings from iterative case studies covering data, information and knowledge management systems. The cases ranged from groups to the extended organisation. Systems were investigated, and administrators, users (knowledge workers) and managers were interviewed. The model building required alternative sets of data, information and knowledge, instead of the traditional pyramid. The explicit-tacit dichotomy was also reconsidered. As human knowledge is the final aim of all data and information in the systems, the distinction between management of information and management of people was harmonised. Information systems were classified as the core of organisational memory. The content of the systems is in practice between communication and presentation. Firstly, the epistemic criterion of knowledge is required neither in the knowledge management literature nor of the content of the systems. Secondly, systems deal mostly with containers, and the knowledge management literature with applied knowledge. The construction of reality based on system content and communication also supports the knowformation concept. Knowformation belongs to the memory management model of an extended enterprise (M3.exe), which is divided into horizontal and vertical key dimensions. Vertically, processes deal with content that can be managed, whereas communication can be supported, mainly by infrastructure. Horizontally, the right-hand side of the model contains systems, and the left-hand side content, which should be independent of each other. A strategy based on the model was defined.

Relevance:

100.00%

Publisher:

Abstract:

Seismic hazard analysis and microzonation of cities make it possible to characterize the potential seismic areas that must be taken into account when designing new structures or retrofitting existing ones. A study of seismic hazard and the preparation of geotechnical microzonation maps has been attempted using a Geographical Information System (GIS). GIS provides an effective solution for integrating different layers of information, thus providing a useful input for city planning and, in particular, for earthquake-resistant design of structures in an area. Seismic hazard is the study of expected earthquake ground motions at any point on the earth. Microzonation is the process of subdividing a region into a number of zones based on earthquake effects at the local scale. Seismic microzonation is the process of estimating the response of soil layers under earthquake excitation, and thus the variation of ground motion characteristics on the ground surface. For seismic microzonation, geotechnical site characterization needs to be assessed at the local scale (micro level), which is then used to assess site response and liquefaction susceptibility. A seismotectonic atlas of the area within a radius of 350 km around Bangalore has been prepared, covering all seismogenic sources and historic earthquake events (a catalogue of about 1400 events since 1906). We have carried out site characterization of Bangalore by collating conventional geotechnical borehole data (about 900 boreholes with depth) and integrating them in GIS. A 3-D subsurface model of Bangalore prepared using GIS is shown in Figure 1. Further, a shear wave velocity survey based on geophysical methods has been carried out at about 60 locations in the city, covering an area of 220 sq. km. Site response and local site effects have been evaluated using 1-dimensional ground response analysis. Spatial variability of soil overburden depth, ground surface Peak Ground Acceleration (PGA), spectral acceleration at different frequencies, and liquefaction susceptibility have been mapped over the 220 sq. km area using GIS; ArcInfo software has been used for this purpose. These maps can be used for city planning and for risk and vulnerability studies. Figure 2 shows a map of peak ground acceleration at rock level for Bangalore city. Microtremor experiments were carried out jointly with NGRI scientists at about 55 locations in the city, and the predominant frequencies of the overburden soil columns were evaluated.
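As an illustration of the GIS interpolation step involved in mapping point measurements (such as PGA at borehole or survey sites) onto a continuous surface, here is a minimal inverse-distance-weighting sketch in Python. IDW is one common GIS interpolation and not necessarily the method used in this study; the site coordinates and PGA values below are hypothetical.

```python
def idw(known, query, power=2.0):
    """Inverse-distance-weighted interpolation of a point value (e.g. PGA)
    from scattered measurement sites. known: list of (x, y, value)."""
    num = den = 0.0
    for x, y, v in known:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0:
            return v  # query coincides with a sample site
        w = 1.0 / d2 ** (power / 2)  # weight falls off with distance^power
        num += w * v
        den += w
    return num / den

# Hypothetical PGA (in g) observed at four sites on a unit grid
sites = [(0, 0, 0.10), (1, 0, 0.14), (0, 1, 0.12), (1, 1, 0.16)]
print(round(idw(sites, (0.5, 0.5)), 3))  # centre point: equal weights -> mean
```

Evaluating `idw` over a regular grid of query points yields the kind of raster surface that GIS packages such as ArcInfo produce for hazard maps.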

Relevance:

100.00%

Publisher:

Abstract:

Sensory receptors determine the type and the quantity of information available for perception. Here, we quantified and characterized the information transferred by primary afferents in the rat whisker system using neural system identification. Quantification of "how much" information is conveyed by primary afferents, using the direct method (DM), a classical information-theoretic tool, revealed that primary afferents transfer huge amounts of information (up to 529 bits/s). Information-theoretic analysis of instantaneous spike-triggered kinematic stimulus features was used to gain functional insight into "what" is coded by primary afferents. Amongst the kinematic variables tested (position, velocity, and acceleration), primary afferent spikes encoded velocity best. The other two variables contributed to information transfer, but only when combined with velocity. We further revealed three additional characteristics that play a role in information transfer by primary afferents. Firstly, primary afferent spikes show a preference for well-separated multiple stimuli (i.e., well-separated sets of combinations of the three instantaneous kinematic variables). Secondly, neurons are sensitive to short stretches of the stimulus trajectory (up to 10 ms pre-spike time). Thirdly, they show spike patterns (precise doublet and triplet spiking). To deal with these complexities, we used a flexible probabilistic neuron model fitting mixtures of Gaussians to the spike-triggered stimulus distributions, which quantitatively captured the contribution of the mentioned features and allowed us to achieve a full functional analysis of the total information rate indicated by the DM. We found that instantaneous position, velocity, and acceleration explained about 50% of the total information rate. Adding a 10 ms pre-spike interval of stimulus trajectory achieved 80-90%. The final 10-20% were found to be due to non-linear coding by spike bursts.
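The direct method (DM) mentioned above estimates information rate as the total entropy of spike "words" minus the noise entropy measured across repeated stimulus trials. A minimal sketch of that calculation follows; the bin size and word length are hypothetical defaults, and this is an illustration of the general technique, not the authors' code.

```python
import math
from collections import Counter

def direct_method_info_rate(spike_trains, bin_ms=2.0, word_len=8):
    """Direct-method information rate estimate in bits/s.

    spike_trains: list of repeated-trial binary sequences (1 = spike in bin).
    Total entropy comes from word frequencies pooled over times and trials;
    noise entropy from word frequencies across trials at each time.
    """
    def entropy(counts):
        n = sum(counts.values())
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    n_bins = len(spike_trains[0])
    n_words = n_bins - word_len + 1

    # Total entropy: pool words over all times and trials
    total = Counter(tuple(tr[t:t + word_len])
                    for tr in spike_trains for t in range(n_words))
    H_total = entropy(total)

    # Noise entropy: average across-trial word entropy at each time
    H_noise = 0.0
    for t in range(n_words):
        at_t = Counter(tuple(tr[t:t + word_len]) for tr in spike_trains)
        H_noise += entropy(at_t)
    H_noise /= n_words

    word_dur_s = word_len * bin_ms / 1000.0
    return (H_total - H_noise) / word_dur_s
```

With perfectly reproducible trials the noise entropy vanishes and the estimate equals the total entropy rate; real estimates of this kind also require bias corrections for finite data, which are omitted here.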

Relevance:

100.00%

Publisher:

Abstract:

Information is encoded in neural circuits using both graded and action potentials, converting between them within single neurons and successive processing layers. This conversion is accompanied by information loss and a drop in energy efficiency. We investigate the biophysical causes of this loss of information and efficiency by comparing spiking neuron models, containing stochastic voltage-gated Na+ and K+ channels, with generator potential and graded potential models lacking voltage-gated Na+ channels. We identify three causes of information loss in the generator potential that are by-products of action potential generation: (1) the voltage-gated Na+ channels necessary for action potential generation increase intrinsic noise and (2) introduce non-linearities, and (3) the finite duration of the action potential creates a 'footprint' in the generator potential that obscures incoming signals. These three processes reduce information rates by ~50% in generator potentials, to ~3 times that of spike trains. Both generator potentials and graded potentials consume almost an order of magnitude less energy per second than spike trains. Because of their lower information rates, generator potentials are substantially less energy efficient than graded potentials. However, both are an order of magnitude more efficient than spike trains, due to the higher energy costs and low information content of spikes, emphasizing that there is a twofold cost of converting analogue to digital: information loss and cost inflation.

Relevance:

100.00%

Publisher:

Abstract:

The paper attributes the decline in information provision in Nigeria to poor library development, which in turn can be attributed to poor funding. The consequence is that current journals and books are not available in Nigerian fisheries libraries. Information, which can be regarded as the first factor of production on which other factors like land, labour and capital depend, can only be provided at the right time when libraries are better funded. For now, if there is to be an increase in fish production, poverty alleviation and food security in Nigeria, our fisheries scientists and policy makers will have to rely on international sources of information, taking advantage of internet connectivity. Some of the sources discussed in this paper are ASFA, AGORA, FAO, DOAJ, FISHBASE, IAMSLIC, INASP, INASP-PERI, INASP-AJOL, ODINAFRICA, SIFAR, WAS, and ABASFR. However, reliance on international sources must not come at the total neglect of harnessing Nigerian fisheries information. For the Nigerian Fisheries and Aquatic Sciences Database being developed by NIFFR to attain an international status like those enumerated above, scientists and publishers are requested to take the trouble of depositing copies of their publications with NIFFR for inclusion in the database.

Relevance:

100.00%

Publisher:

Abstract:

Several patients of P. J. Vogel who had undergone cerebral commissurotomy for the control of intractable epilepsy were tested on a variety of tasks to measure aspects of cerebral organization concerned with lateralization in hemispheric function. From tests involving identification of shapes it was inferred that in the absence of the neocortical commissures, the left hemisphere still has access to certain types of information from the ipsilateral field. The major hemisphere can still make crude differentiations between various left-field stimuli, but is unable to specify exact stimulus properties. Most of the time the major hemisphere, having access to some ipsilateral stimuli, dominated the minor hemisphere in control of the body.

Competition for control of the body between the hemispheres is seen most clearly in tests of minor hemisphere language competency, in which it was determined that though the minor hemisphere does possess some minimal ability to express language, the major hemisphere prevented its expression much of the time. The right hemisphere was superior to the left in tests of perceptual visualization, and the two hemispheres appeared to use different strategies in attempting to solve the problems, namely, analysis for the left hemisphere and synthesis for the right hemisphere.

Analysis of the patients' verbal and performance I.Q.'s, as well as observations made throughout testing, suggest that the corpus callosum plays a critical role in activities that involve functions in which the minor hemisphere normally excels, that the motor expression of these functions may normally come through the major hemisphere by way of the corpus callosum.

Lateral specialization is thought to be an evolutionary adaptation which overcame problems of a functional antagonism between the abilities normally associated with the two hemispheres. The tests of perception suggested that this function lateralized into the mute hemisphere because of an active counteraction by language. This latter idea was confirmed by the finding that left-handers, in whom there is likely to be bilateral language centers, are greatly deficient on tests of perception.

Relevance:

100.00%

Publisher:

Abstract:

Stichaeidae, commonly referred to as pricklebacks, are intertidal and subtidal fishes primarily of the North Pacific Ocean. Broad distribution in relatively inaccessible and undersampled habitats has contributed to a general lack of information about this family. In this study, descriptions of early life history stages are presented for 25 species representing 18 genera of stichaeid fishes from the northeastern Pacific Ocean, Bering Sea, and Arctic Ocean Basin. Six of these species also occur in the North Atlantic Ocean. Larval stages of 16 species are described for the first time. Additional information or illustrations intended to augment previous descriptions are provided for nine species. For most taxa, we present adult and larval distributions, descriptions of morphometric, meristic, and pigmentation characters, and species comparisons, and we provide illustrations for preflexion through postflexion or transformation stages. New counts of meristic features are reported for several species.

Relevance:

100.00%

Publisher:

Abstract:

The first dedicated collections of deep-water (>80 m) sponges from the central Aleutian Islands revealed a rich fauna including 28 novel species and geographical range extensions for 53 others. Based on these collections and the published literature, we now confirm the presence of 125 species (or subspecies) of deep-water sponges in the Aleutian Islands. Clearly the deep-water sponge fauna of the Aleutian Islands is extraordinarily rich and largely understudied. Submersible observations revealed that sponges, rather than deep-water corals, are the dominant feature shaping benthic habitats in the region and that they provide important refuge habitat for many species of fish and invertebrates, including juvenile rockfish (Sebastes spp.) and king crabs (Lithodes sp.). Examination of video footage collected along 127 km of the seafloor further indicates that there are likely hundreds of species still uncollected from the region, many unknown to science. Furthermore, sponges are extremely fragile and easily damaged by contact with fishing gear. High rates of fishery bycatch clearly indicate a strong interaction between existing fisheries and sponge habitat. Bycatch in fisheries and fisheries-independent surveys can be a major source of information on the location of the sponge fauna, but current monitoring programs are greatly hampered by the inability of deck personnel to identify bycatch. This guide contains detailed species descriptions for 112 sponges collected in Alaska, principally in the central Aleutian Islands. It addresses bycatch identification challenges by providing fisheries observers and scientists with the information necessary to adequately identify sponge fauna. Using that identification data, areas of high abundance can be mapped and the locations of indicator species of vulnerable marine ecosystems can be determined.
The guide is also designed for use by scientists making observations of the fauna in situ with submersibles, including remotely operated vehicles and autonomous underwater vehicles.

Relevance:

100.00%

Publisher:

Abstract:

The Internet of Things (IOT) concept and enabling technologies such as RFID offer the prospect of linking the real world of physical objects with the virtual world of information technology to improve visibility and traceability information within supply chains and across the entire lifecycles of products, as well as enabling more intuitive interactions and greater automation possibilities. There is a huge potential for savings through process optimization and profit generation within the IOT, but the sharing of financial benefits across companies remains an unsolved issue. Existing approaches towards sharing of costs and benefits have failed to scale so far. The integration of payment solutions into the IOT architecture could solve this problem. We have reviewed different possible levels of integration. Multiple payment solutions have been researched. Finally we have developed a model that meets the requirements of the IOT in relation to openness and scalability. It supports both hardware-centric and software-centric approaches to integration of payment solutions with the IOT. Different requirements concerning payment solutions within the IOT have been defined and considered in the proposed model. Possible solution providers include telcos, e-payment service providers and new players such as banks and standardization bodies. The proposed model of integrating the Internet of Things with payment solutions will lower the barrier to invoicing for the more granular visibility information generated using the IOT. Thus, it has the potential to enable recovery of the necessary investments in IOT infrastructure and accelerate adoption of the IOT, especially for projects that are only viable when multiple benefits throughout the supply chain need to be accumulated in order to achieve a Return on Investment (ROI). In a long-term perspective, it may enable IT-departments to become profit centres instead of cost centres. © 2010 - IOS Press and the authors. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

Engineering change is a significant part of any product development programme. Changes can arise at many points throughout the product life-cycle, resulting in rework which can ripple through different stages of the design process. Managing change processes is thus a critical aspect of any design project, especially in complex design. Through a literature review, this paper shows the diversity of information models used by different change management methods proposed in the literature. A classification framework for organising these change management approaches is presented. The review shows an increase in the number of cross-domain models proposed to help manage changes.

Relevance:

100.00%

Publisher:

Abstract:

This paper is concerned with the role of information in the servitization of manufacturing, which has led to "the innovation of an organisation's capabilities and processes as equipment manufacturers seek to offer services around their products" (Neely 2009, Baines et al 2009). This evolution has resulted in an information requirement (IR) shift as companies move from discrete provision of equipment and spare parts to long-term service contracts guaranteeing prescribed performance levels. Organisations providing such services depend on a very high level of availability and quality of information throughout the service life-cycle (Menor et al 2002). This work focuses on whether, for a proposed contract based around complex equipment, the information system is capable of providing information of an acceptable quality; this requires the IRs to be examined in a formal manner. We apply a service information framework (Cuthbert et al 2008, McFarlane & Cuthbert 2012) to methodically assess IRs for different contract types and to understand the information gap between them. Results from case examples indicate that this gap includes information required for the different contract types and a set of contract-specific IRs. Furthermore, the control, ownership and use of information differ across contract types as the boundary of operation and responsibility changes.