987 results for Shared Information


Relevance: 100.00%

Abstract:

We consider bipartitions of one-dimensional extended systems whose probability distribution functions describe stationary states of stochastic models. We define estimators of the information shared between the two subsystems. If the correlation length is finite, the estimators stay finite for large system sizes. If the correlation length diverges, so do the estimators. The definition of the estimators is inspired by information theory. We look at several models and compare the behaviors of the estimators in the finite-size scaling limit. Analytical and numerical methods as well as Monte Carlo simulations are used. We show how the finite-size scaling functions change for various phase transitions, including the case where one has conformal invariance.
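As a concrete illustration of the information-theoretic quantity such estimators are built around (the sketch below is a generic mutual-information calculation, not the paper's own estimator; the binary-site encoding and system size are assumptions), one can compute the information shared between the two halves of a small system directly from an explicit stationary distribution:

```python
import numpy as np

def mutual_information(p, L):
    """Mutual information I(A:B) between the left and right halves of a
    1D system of L binary sites, given a stationary distribution p over
    all 2**L configurations (left half = high bits of the index)."""
    half = L // 2
    p = np.asarray(p, dtype=float)
    p = p / p.sum()  # normalize
    pA = np.zeros(2 ** half)
    pB = np.zeros(2 ** (L - half))
    for idx, prob in enumerate(p):
        pA[idx >> (L - half)] += prob                # marginal of left half
        pB[idx & ((1 << (L - half)) - 1)] += prob    # marginal of right half

    def S(q):  # Shannon entropy in nats, skipping zero-probability states
        q = q[q > 0]
        return -np.sum(q * np.log(q))

    # I(A:B) = S(A) + S(B) - S(AB)
    return S(pA) + S(pB) - S(p)

# Sanity check: an uncorrelated (uniform) distribution shares no information.
L = 6
print(mutual_information(np.ones(2 ** L), L))  # ~0.0
```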

Relevance: 80.00%

Abstract:

Using the wisdom of crowds (combining many individual forecasts to obtain an aggregate estimate) can be an effective technique for improving forecast accuracy. When individual forecasts are drawn from independent and identical information sources, a simple average provides the optimal crowd forecast. However, correlated forecast errors greatly limit the ability of the wisdom of crowds to recover the truth. In practice, this dependence often emerges because information is shared: forecasters may draw, to a large extent, on the same data when formulating their responses.

To address this problem, I propose an elicitation procedure in which each respondent is asked to provide both their own best forecast and a guess of the average forecast that will be given by all other respondents. I study optimal responses in a stylized information setting and develop an aggregation method, called pivoting, which separates individual forecasts into shared and private information and then recombines these results in the optimal manner. I develop a tailored pivoting procedure for each of three information models, and introduce a simple and robust variant that outperforms the simple average across a variety of settings.

In three experiments, I investigate the method and the accuracy of the crowd forecasts. In the first study, I vary the shared and private information in a controlled environment, while the latter two studies examine forecasts in real-world contexts. Overall, the data suggest that a simple minimal pivoting procedure provides an effective aggregation technique that can significantly outperform the crowd average.
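The minimal pivoting variant lends itself to a compact sketch. Under the reading suggested by the abstract (the formula below is an assumption, not quoted from the dissertation), the mean prediction of others proxies the shared information, so the crowd mean is pivoted away from it to restore weight to private signals:

```python
import numpy as np

def minimal_pivot(own, predicted_others):
    """Minimal pivoting: shift the crowd mean away from the mean
    prediction of others' forecasts. The mean prediction proxies the
    shared information; the gap between it and the crowd mean proxies
    the aggregate private signal, which simple averaging under-weights.

    own              -- array of each respondent's own forecast
    predicted_others -- array of each respondent's guess of the average
                        forecast of all other respondents
    """
    x_bar = np.mean(own)
    z_bar = np.mean(predicted_others)
    return x_bar + (x_bar - z_bar)  # equivalently 2 * x_bar - z_bar

# Toy numbers (assumed, not from the studies): respondents share an
# optimistic public signal but hold private signals pointing lower.
own = np.array([52.0, 48.0, 50.0, 47.0])
pred = np.array([55.0, 54.0, 56.0, 55.0])
print(minimal_pivot(own, pred))  # pivots below the simple average of 49.25
```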

Relevance: 70.00%

Abstract:

Information in the construction industry is delivered and interpreted in a language specific to the industry, in which large, complex objects are only partially described and much information is implicit in the language used. Successful communication therefore relies on participants in the industry learning how to interpret the language through many years of education, training and experience. With the introduction of computer technology, and in particular the detailed digital building information model (DBIM), the accepted language currently in use is no longer a valid method of describing the building. At all stages in the paper-based design and documentation process it is generally readily apparent which parts of the design require further completion and which are fully resolved; this is achieved through the complex graphical language currently in use. In the DBIM, all information appears at the same level of resolution, making it difficult to interpret implicit information embedded in the model. This compromises the collaborative design environment that is described as a fundamental characteristic of the future construction industry. This paper focuses on two areas. The first analyses design resolution and the role uncertain information plays in the design process; it then discusses how designers, and the industry in general, deal with incomplete or unresolved information. The second describes a theoretical model in which a design resolution (DR) environment incorporates the level of design resolution as an operable element in a collaborative DBIM. The development and implementation of this model will allow designers to better share, understand and interpret design knowledge from the shared information during the various stages of digital design, before full resolution is achieved.
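One way to picture the proposed DR environment is as model elements that carry their level of design resolution as a queryable attribute. This is a hypothetical sketch: the element names, the 0-5 scale, and the schema are illustrative assumptions, not the paper's model.

```python
from dataclasses import dataclass, field

@dataclass
class ModelElement:
    """Hypothetical DBIM element carrying its design resolution (DR) as
    an operable attribute, so collaborators can query how far an element
    is resolved instead of reading every element as equally final."""
    name: str
    dr_level: int  # 0 = placeholder ... 5 = fully resolved (assumed scale)
    properties: dict = field(default_factory=dict)

def unresolved(elements, threshold=5):
    """Return the elements still needing design work."""
    return [e for e in elements if e.dr_level < threshold]

model = [
    ModelElement("external wall W-01", dr_level=5, properties={"u_value": 0.25}),
    ModelElement("stair core S-02", dr_level=2),  # geometry blocked out only
]
for e in unresolved(model):
    print(f"{e.name}: DR {e.dr_level}/5")
```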

Relevance: 60.00%

Abstract:

Alvin Toffler’s image of the prosumer (1970, 1980, 1990) continues to significantly influence our understanding of the user-led, collaborative processes of content creation which are today labelled “social media” or “Web 2.0”. A closer look at Toffler’s own description of his prosumer model reveals, however, that it remains firmly grounded in the mass media age: the prosumer is clearly not the self-motivated creative originator and developer of new content who can today be observed in projects ranging from open source software through Wikipedia to Second Life, but simply a particularly well-informed, and therefore both particularly critical and particularly active, consumer. The highly specialised, high-end consumers who exist in areas such as hi-fi or car culture are far more representative of the ideal prosumer than the participants in non-commercial (or as yet non-commercial) collaborative projects. It was always unrealistic, of course, to expect Toffler’s 1970s model of the prosumer to describe these 21st-century phenomena. To describe the creative and collaborative participation which today characterises user-led projects such as Wikipedia, terms such as ‘production’ and ‘consumption’ are no longer particularly useful, even in laboured constructions such as ‘commons-based peer-production’ (Benkler 2006) or ‘p2p production’ (Bauwens 2005). In the user communities participating in such forms of content creation, roles as consumers and users have long begun to be inextricably interwoven with those of producer and creator: users are always already also producers of the shared information collection, regardless of whether they are aware of that fact; they have taken on a new, hybrid role which may be best described as that of a produser (Bruns 2008). Projects which build on such produsage can be found in areas from open source software development through citizen journalism to Wikipedia, and beyond this also in multi-user online computer games, filesharing, and even in communities collaborating on the design of material goods. While addressing a range of different challenges, they nonetheless build on a small number of universal key principles. This paper documents these principles and indicates the possible implications of this transition from production and prosumption to produsage.

Relevance: 60.00%

Abstract:

The second of the Hermelin Brain Tumor Center Symposia was held once again at Henry Ford Hospital in Detroit, Michigan on October 24th and 25th, 2003. A public conference was held on the 24th, while a closed-door session took place on the 25th. The purpose of these symposia is to bring together experts in a particular field of study with the aim of sharing information with each other and the public, and then to meet privately to present novel data, hold discussions, and share concepts. While the interaction is intended to benefit all involved, the incentive is the expectation that the shared information will aid researchers at the Hermelin Brain Tumor Center in their quest to identify potential therapeutic targets and explore translational therapeutic strategies for the treatment of patients suffering from nervous system tumors...

Relevance: 60.00%

Abstract:

A three-day workshop on turbidity measurements was held at the Hawaii Institute of Marine Biology from August 31 to September 2, 2005. The workshop was attended by 30 participants from industry, coastal management agencies, and academic institutions. All groups recognized common issues regarding the definition of turbidity, limitations of consistent calibration, and the large variety of instrumentation that nominally measures "turbidity." The major recommendations, in order of importance for the coastal monitoring community, are listed below:

1. The community of users in coastal ecosystems should tighten instrument design configurations to minimize inter-instrument variability, choosing a set of specifications that are best suited for coastal waters. The ISO 7027 design standard is not tight enough. Advice on these design criteria should be solicited through the ASTM as well as Federal and State regulatory agencies representing the majority of turbidity sensor end users. Parties interested in making turbidity measurements in coastal waters should develop design specifications for these water types rather than relying on design standards made for the analysis of drinking water.

2. The coastal observing groups should assemble a community database relating the output of specific sensors to different environmental parameters, so that the entire community of users can benefit from shared information. This would include an unbiased, parallel study of different turbidity sensors, employing a variety of designs and configurations in the broadest range of coastal environments.

3. Turbidity should be used as a measure of relative change in water quality rather than an absolute measure of water quality. Thus, this is a recommendation for managers to develop their own local calibrations; see the next recommendation.

4. If the end user specifically wants to use a turbidity sensor to measure a specific water quality parameter such as suspended particle concentration, then direct measurement of that water quality parameter is necessary to correlate with 'turbidity' for a particular environment. These correlations, however, will be specific to the environment in which they are measured. This works because there are many environments in which water composition is relatively stable but varies in magnitude or concentration.

(pdf contains 22 pages)
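Recommendation 4 amounts to fitting a site-specific calibration curve between sensor output and a directly measured water-quality parameter. A minimal sketch, with invented numbers standing in for paired field samples, might look like this:

```python
import numpy as np

# A site-specific calibration relating sensor turbidity (NTU) to
# lab-measured suspended-sediment concentration (mg/L). Values below
# are illustrative, not from the workshop; a real calibration needs
# paired field samples from the environment of interest.
ntu = np.array([2.1, 5.3, 9.8, 14.6, 22.0, 31.5])   # sensor readings
ssc = np.array([3.0, 7.9, 14.1, 21.3, 31.8, 44.9])  # lab measurements, mg/L

# Ordinary least-squares fit SSC = a * NTU + b for this one site.
a, b = np.polyfit(ntu, ssc, 1)
print(f"SSC ~= {a:.2f} * NTU + {b:.2f}  (valid only for this site)")

# Apply the local calibration to new readings from the same environment.
new_ntu = np.array([4.0, 18.0])
print(a * new_ntu + b)
```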

Relevance: 60.00%

Abstract:

In this text, the author attempts to answer four broad questions: does the current legal framework governing the management of personal information in the health and social services network adequately protect this sensitive data? Can this protection be improved? What factors contribute to the emergence of new norms? What role do technical requirements, economic interests, and political considerations play in the emergence of these new norms? In addressing these questions, the author shows that the current regime, notable above all for its complexity and opacity, does not adequately protect the personal information that will circulate in the emerging digital environment. He therefore examines the factors contributing to the emergence of new norms at a time when current legal rules are out of step with technological developments. He then attempts to situate and measure the influence of economic interests, political considerations, and technical imperatives on the emergence of these new norms. Finally, he puts forward a new concept, that of "aires de partage" (sharing zones), which he presents as a means of better protecting personal and confidential information in the information age.

Relevance: 60.00%

Abstract:

Optimization and harmonization are key factors for good performance in the chemical industry. BASF has developed a project called "Acelerador" (Accelerator), whose objective has been the harmonization and integration of supply chain processes worldwide. The basic inventory management process was left out of the project and needed to be analyzed. The inventory management department at BASF SE has been developing its own strategy for defining global manufacturing processes. This work reports on the phases of the strategy formulation and sets out guidelines for the implementation phase taking place in 2012 and 2013.

Relevance: 60.00%

Abstract:

Information technologies have become an important factor in each of the processes carried out along the supply chain. Their implementation and correct use give companies advantages that improve operational performance along the chain. The development and application of software have contributed to the integration of the different members of the chain, so that everyone from suppliers to the end customer perceives benefits, in operational performance and in satisfaction respectively. It is also important to consider that implementation does not always produce positive results; on the contrary, the implementation process can be seriously affected by barriers that prevent maximizing the benefits that ICT provide.

Relevance: 60.00%

Abstract:

Wholesale and Retail Place LLC is a company founded in 2012 in Carteret, New Jersey, dedicated to the marketing and distribution of Colombian women's clothing in the United States. The problem addressed in this project is the influence of the inventory policy on the company's performance. On this basis, the project proposes to improve the inventory management policy, an effort intended to change the business operating model by focusing on the transition from a small to a medium-sized company, with emphasis on the company's durability and sustainability.

Relevance: 60.00%

Abstract:

Inventory management is one of the great challenges companies face today, especially those that handle products with a high probability of damage and breakage, which is why defining inventory policies or an inventory management system would help guarantee longer-lasting durability and sustainability in the market. The project presented here, carried out at Diageo, a multinational mass-consumption company in the spirits sector, falls within the Management research line of the Universidad del Rosario. Under the Functional Areas for Management program, which focuses on business durability, the Management research line seeks to generate knowledge about finance, marketing, operations, and human resource management. Accordingly, starting from the premise that a durable company is one that "adapts its management to the intensity of the conditions of its sectoral environment and market forces" (Leal, Guerrero, Rojas, & Rivera, 2011), the company's resources and efforts must be directed toward a new inventory policy for the wine portfolio, so that raising the service level positively affects profitability and liquidity indicators.
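The link the abstract draws between service level and inventory can be made concrete with the textbook continuous-review reorder-point formula. This is a generic illustration with assumed demand figures, not Diageo's actual policy:

```python
from statistics import NormalDist

def reorder_point(mean_daily_demand, sd_daily_demand, lead_time_days,
                  service_level):
    """Classic continuous-review reorder point: expected lead-time
    demand plus safety stock sized for a target cycle service level,
    assuming normally distributed daily demand."""
    z = NormalDist().inv_cdf(service_level)
    lead_time_demand = mean_daily_demand * lead_time_days
    safety_stock = z * sd_daily_demand * lead_time_days ** 0.5
    return lead_time_demand + safety_stock

# Illustrative numbers (assumed): 40 cases/day, sd 12, 7-day lead time.
# Raising the service level raises the inventory the policy must hold.
for sl in (0.90, 0.95, 0.99):
    print(sl, round(reorder_point(40, 12, 7, sl)))
```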

Relevance: 60.00%

Abstract:

Effective knowledge sharing underpins the day-to-day work activities in knowledge-intensive organizational environments. This paper integrates key concepts from the literature into a model to explain effective knowledge sharing in such environments. It is proposed that the effectiveness of knowledge sharing is determined by the maturity of informal and formal social networks and of a shared information and knowledge-based artefact network (AN) in a particular work context. It is further proposed that facilitating mechanisms within the social networks and the AN, and mechanisms that link these networks, affect the overall efficiency of knowledge sharing in complex environments. Three case studies are used to illustrate the model, highlighting the typical knowledge-sharing problems that result when certain model elements are absent or insufficient in a particular environment. The model is discussed in terms of diagnosing knowledge-sharing problems, organizational knowledge strategy, and the role of information and communication technology in knowledge sharing.
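A toy rendering of the model's ingredients (the names and the set-of-edges representation are assumptions for illustration, not the paper's formalism) shows how the social networks, the artefact network, and the linking mechanisms jointly bound what a given person can reach:

```python
# Three networks from the model: informal and formal social networks,
# a shared artefact network (AN), and the links connecting people to
# artefacts. All edges are invented examples.
informal = {("ana", "ben"), ("ben", "carol")}          # who talks to whom
formal = {("ana", "carol")}                            # reporting/team links
artefacts = {("wiki:design-notes", "repo:svc-auth")}   # related artefacts
links = {("ben", "wiki:design-notes"), ("carol", "repo:svc-auth")}

def reachable_artefacts(person):
    """Artefacts a person can reach directly, through a colleague in
    either social network, or one step along the AN -- a crude stand-in
    for how network maturity bounds knowledge-sharing effectiveness."""
    social = informal | formal
    colleagues = {b for a, b in social if a == person}
    colleagues |= {a for a, b in social if b == person}
    reached = {art for p, art in links if p == person}
    reached |= {art for p, art in links if p in colleagues}
    # Follow links inside the artefact network one step.
    reached |= {b for a, b in artefacts if a in reached}
    reached |= {a for a, b in artefacts if b in reached}
    return reached

print(reachable_artefacts("ana"))  # ana reaches artefacts only via colleagues
```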

Relevance: 60.00%

Abstract:

Face recognition with multiple views is a challenging research problem. Most existing work has focused on extracting shared information among multiple views to improve recognition. However, when the pose variation is too large or a pose is missing, 'shared information' may not be properly extracted, leading to poor recognition results. In this paper, we propose a novel method for face recognition with multiple view images that overcomes the large pose variation and missing pose issues. By introducing a novel mixed norm, the proposed method automatically selects candidates from the gallery to best represent a group of highly correlated face images in a query set, improving classification accuracy. This mixed norm combines the advantages of both sparse representation based classification (SRC) and joint sparse representation based classification (JSRC): a trade-off between the ℓ1-norm from SRC and the ℓ2,1-norm from JSRC is introduced to achieve this goal. Owing to this property, the proposed method reduces the influence of face images that are unseen or exhibit large pose variation during recognition. When query images with some degree of unseen pose variation appear, the mixed norm finds an optimal representation for them based on the shared information induced from multiple views. Moreover, we also address an open problem in robust sparse representation and classification: using the ℓ1-norm on the loss function to achieve a robust solution. To solve this formulation, we derive a simple, yet provably convergent algorithm based on the powerful alternating direction method of multipliers (ADMM) framework. We provide extensive comparisons demonstrating that our method outperforms other state-of-the-art algorithms on the CMU-PIE, Yale B and Multi-PIE databases for multi-view face recognition.
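The trade-off the abstract describes between the ℓ1 norm (SRC) and the ℓ2,1 norm (JSRC) can be written down compactly. The sketch below evaluates a robust ℓ1 loss plus a mixed-norm regularizer; the convex-combination parameter tau and this exact combination are assumptions for illustration, and the paper's actual objective and its ADMM solver are not reproduced here:

```python
import numpy as np

def mixed_norm(X, tau):
    """Trade-off between the elementwise l1 norm (SRC-style sparsity)
    and the l2,1 norm (JSRC-style shared row support) on the
    coefficient matrix X, with tau in [0, 1]."""
    l1 = np.abs(X).sum()
    l21 = np.linalg.norm(X, axis=1).sum()  # sum of row l2 norms
    return tau * l1 + (1.0 - tau) * l21

def objective(D, Y, X, lam, tau):
    """Robust l1 loss on the residual plus the mixed-norm regularizer:
    ||Y - D X||_1 + lam * mixed_norm(X). This mirrors the structure
    described in the abstract, not the paper's exact formulation."""
    return np.abs(Y - D @ X).sum() + lam * mixed_norm(X, tau)

# Toy shapes: gallery dictionary D (features x atoms), query set Y
# (features x query images), coefficients X (atoms x query images).
rng = np.random.default_rng(0)
D = rng.standard_normal((20, 15))
Y = rng.standard_normal((20, 4))
X = rng.standard_normal((15, 4))
print(objective(D, Y, X, lam=0.1, tau=0.5))
```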

Relevance: 60.00%

Abstract:

The structure of the worlds we know is built on habits and is conditioned by fixed beliefs through which we filter and shape a circumscribed universe. This building process makes it essential to understand the paths of information capture and recontextualization, as well as to elucidate the principles or laws that regulate and structure the ways we think and act creatively in contemporary times. This article proposes that Information Science, while studying the set of changes related to the establishment of new habits of the Information Society, should also provide relevant sociocultural indicators for understanding our historical moment. In this realm, it presents a few moments extracted from the context of these changes that concern the continuum of shared information and knowledge desires. The procedure requires a significant retreat of the viewpoint, simultaneously placing such a movement within the scope of relations between habit and rupture, as movements that weave the relations of the world, of humankind, and of distinct cultural changes.