998 results for Mapping communication
Abstract:
The quantitative research conducted on communication in organizations from Bauru has served as inspiration for the development of applied research on the Brazilian organizational reality, for communication courses, and for improvements in the communication practices of organizations in the region. The goal is to provide information on the role of organizational communication, and the role of public relations within it, measuring the empowerment of this community. The mapping methodology is based on structured questionnaires designed to capture common factors, both active and retroactive, of the communication processes, such as: relationship types; channels and media; the degree of interaction and interdependence between the parties; the types of public involved in the process; and the organization's expectations regarding goals, objectives, actions, events and activities programmed in communication. To delineate a reflective proposal from the data, we chose to analyze three dimensions: strategic communication, linked to the business model and to information technologies; internal communication, linked to culture, to the organizational structure and to the management practices implied by the new relationship paradigms; and, lastly, the translation of the collected data into indicators and scales of attributes fostered or inhibited by excellent communication, including an evaluation of the practice of Public Relations. This article presents the main results for micro companies in Bauru, São Paulo.
Abstract:
This article presents a methodological proposal to map the diversity of the audiovisual industry in the digital scenario by portraying the most important interactions between those who create, produce, distribute and disseminate audiovisual productions online, paying special attention to powerful intermediaries and to small and medium independent agents. Taking a flexible understanding of social network analysis as its point of departure, the aim is to understand the structure of the audiovisual industry on the internet so that, for a given sector, the agents, their relations and the networks they give rise to, as well as the structural conditions under which they operate, can be studied. The aim is to answer questions such as: what is mapping, what is interesting to map, how can it be done, and what advantages and disadvantages will the results present.
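As a concrete illustration of the kind of mapping the article proposes, the following minimal sketch uses Python's networkx to encode agents, their relations and a centrality reading of the resulting network; all agent names, roles and ties are invented for illustration.

```python
# Minimal sketch of the proposed network mapping, using networkx.
# Agents, roles and ties below are invented for illustration only.
import networkx as nx

G = nx.DiGraph()
# Nodes are agents of the online audiovisual sector, tagged by role.
G.add_node("IndieProducer", role="producer", size="small")
G.add_node("Aggregator", role="intermediary", size="large")
G.add_node("StreamingPlatform", role="distributor", size="large")
G.add_node("FilmBlog", role="disseminator", size="small")

# Directed edges represent flows of content or visibility between agents.
G.add_edge("IndieProducer", "Aggregator", relation="licenses content to")
G.add_edge("Aggregator", "StreamingPlatform", relation="supplies catalogue")
G.add_edge("FilmBlog", "StreamingPlatform", relation="drives audience to")

# Centrality measures expose the powerful intermediaries in the structure.
print(nx.degree_centrality(G))
print(nx.betweenness_centrality(G))
```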
Abstract:
Many-core platforms based on Networks-on-Chip (NoC [Benini and De Micheli 2002]) represent an emerging technology in the real-time embedded domain. Although the idea of grouping applications previously executed on separate single-core devices and accommodating them on a single many-core chip offers various options for power savings and cost reductions, and contributes to overall system flexibility, its implementation is a non-trivial task. In this paper we address the issue of application mapping onto a NoC-based many-core platform in light of the fundamentals and trends of current many-core operating systems; specifically, we elaborate on a limited migrative application model that uses message passing as its communication primitive. As the main contribution, we formulate the problem of real-time application mapping and propose a three-stage process to solve it efficiently. The analysis assures that the derived solutions guarantee the fulfilment of the posed timing constraints on worst-case communication latencies, while also providing an environment for load balancing for, e.g., thermal, energy, fault-tolerance or performance reasons. We also propose several constraints on the topological structure of the application mapping, as well as on the inter- and intra-application communication patterns, which efficiently resolve the issues of pessimism and/or intractability in the analysis.
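The paper's three-stage process is not reproduced here, but as a hypothetical illustration of the underlying optimization problem, the sketch below greedily places communicating tasks on a small 2D-mesh NoC, using traffic-weighted hop count as a crude proxy for worst-case communication latency; task names and bandwidth figures are invented.

```python
# Hypothetical greedy mapping of communicating tasks onto a 2D-mesh NoC.
# Hop distance stands in for worst-case latency; the actual paper uses a
# three-stage process backed by a formal timing analysis.
from itertools import product

MESH = 3  # 3x3 mesh of cores
tiles = list(product(range(MESH), range(MESH)))

# (task_a, task_b) -> communication volume; values are invented.
traffic = {("sense", "filter"): 80, ("filter", "fuse"): 60, ("fuse", "act"): 90}
tasks = ["sense", "filter", "fuse", "act"]

def hops(a, b):
    """XY-routing hop count between two mesh tiles."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def partner_cost(task, tile, placement):
    """Traffic-weighted hop distance to already-placed partners of `task`."""
    cost = 0
    for (a, b), vol in traffic.items():
        if task == a and b in placement:
            cost += vol * hops(tile, placement[b])
        elif task == b and a in placement:
            cost += vol * hops(tile, placement[a])
    return cost

placement = {}
# Greedy heuristic: place the most communication-heavy tasks first, each on
# the free tile minimizing traffic-weighted distance to placed partners.
for task in sorted(tasks, key=lambda t: -sum(v for k, v in traffic.items() if t in k)):
    free = [t for t in tiles if t not in placement.values()]
    placement[task] = min(free, key=lambda tile: partner_cost(task, tile, placement))

print(placement)
```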
Abstract:
Many-core platforms are an emerging technology in the real-time embedded domain. These devices offer various options for power savings and cost reductions, and contribute to overall system flexibility; however, issues such as unpredictability, scalability and analysis pessimism pose serious challenges to their integration into this domain. The focus of this work is on many-core platforms using a limited migrative model (LMM). The LMM is an approach based on the fundamental concepts of the multi-kernel paradigm, which is a promising step towards scalable and predictable many-cores. In this work, we formulate the problem of real-time application mapping on a many-core platform using the LMM, and propose a three-stage method to solve it. An extended version of the existing analysis is used to assure that the derived mappings (i) guarantee the fulfilment of timing constraints posed on the worst-case communication delays of individual applications, and (ii) provide an environment to perform load balancing for, e.g., energy/thermal management, fault tolerance and/or performance reasons.
Abstract:
This was a descriptive, retrospective study with a quantitative approach, whose aim was to analyze the nursing diagnoses contained in the records of children aged 0 to 36 months who attended infant health nursing consultations. Documentary analysis and the cross-mapping technique were used. Fifty-six different nursing diagnoses were encountered, of which 33 (58.9%) corresponded to diagnoses contained in the Nomenclature of Nursing Diagnoses and Interventions and 23 (41.1%) were derived from ICNP® Version 1.0. Of the 56 nursing diagnoses, 43 (76.8%) were considered deviations from normalcy. It was concluded that the infant health nursing consultations enabled the identification of situations of normalcy and abnormality, with an emphasis on diagnoses of deviations from normalcy. Standardized language favors nursing documentation, contributing to patient care and facilitating communication between nurses and other health professionals.
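As a schematic illustration of the cross-mapping idea, the sketch below matches free-text chart terms against a standardized vocabulary after simple normalization; the terms and codes are invented and do not come from the study.

```python
# Schematic illustration of cross-mapping: recorded diagnosis statements are
# matched against a standardized terminology. Terms and codes are invented.
standardized = {
    "ineffective breastfeeding": "ICNP-0001",  # hypothetical codes
    "impaired skin integrity": "ICNP-0002",
    "adequate growth": "ICNP-0003",
}

chart_entries = ["Impaired skin integrity", "adequate  growth", "colic episodes"]

def normalize(term):
    # Case and whitespace normalization before lookup.
    return " ".join(term.lower().split())

for entry in chart_entries:
    code = standardized.get(normalize(entry))
    status = code if code else "no match -> review by expert consensus"
    print(f"{entry!r}: {status}")
```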
Abstract:
It is estimated that around 230 people die each year due to radon (222Rn) exposure in Switzerland. 222Rn accumulates mainly in closed environments such as buildings and originates primarily from the subjacent ground. Indoor concentrations therefore depend strongly on geology and show substantial regional variations. Correct identification of these regional variations would allow a substantial reduction of the population's 222Rn exposure, through appropriate construction of new buildings and mitigation of existing ones. Predicting indoor 222Rn concentrations (IRC) and identifying 222Rn-prone areas is, however, difficult, since IRC depend on a variety of variables such as building characteristics, meteorology, geology and anthropogenic factors. The present work aims at developing predictive models and at understanding IRC in Switzerland, taking into account a maximum of information in order to minimize the prediction uncertainty. The predictive maps will be used as a decision-support tool for 222Rn risk management. The construction of these models is based on different data-driven statistical methods, in combination with geographical information systems (GIS).

In a first phase we performed univariate analyses of IRC for different variables, namely detector type, building category, foundation, year of construction, average outdoor temperature during measurement, altitude and lithology. All variables showed significant associations with IRC. Buildings constructed after 1900 showed significantly lower IRC than earlier constructions, and we observed a further drop of IRC after 1970. In addition, we found an association of IRC with altitude. With regard to lithology, we observed the lowest IRC in sedimentary rocks (excluding carbonates) and sediments, and the highest IRC in the Jura carbonates and igneous rock. The IRC data were systematically analyzed for potential bias due to spatially unbalanced sampling of measurements. In order to facilitate the modeling and the interpretation of the influence of geology on IRC, we developed an algorithm based on k-medoids clustering which permits the definition of geological classes that are coherent in terms of IRC. We also performed a soil gas 222Rn concentration (SRC) measurement campaign in order to determine the predictive power of SRC with respect to IRC, and found that the use of SRC for IRC prediction is limited.

The second part of the project was dedicated to predictive mapping of IRC using models which take into account the multidimensionality of the process of 222Rn entry into buildings. We used kernel regression and ensemble regression trees for this purpose, and could explain up to 33% of the variance of the log-transformed IRC across Switzerland. This is a good performance compared to former attempts at IRC modeling in Switzerland. As predictor variables we considered geographical coordinates, altitude, outdoor temperature, building type, foundation, year of construction and detector type. Ensemble regression trees such as random forests make it possible to determine the role of each IRC predictor in a multidimensional setting. We found spatial information such as geology, altitude and coordinates to have a stronger influence on IRC than building-related variables such as foundation type, building type and year of construction. Based on kernel estimation we developed an approach to determine the local probability of IRC exceeding 300 Bq/m3. In addition, we developed a confidence index that provides an estimate of the uncertainty of the map.
All methods allow the easy creation of tailor-made maps for different building characteristics. Our work is an essential step towards a 222Rn risk assessment which accounts at the same time for different architectural situations as well as geological and geographical conditions. For communicating the 222Rn hazard to the population, we recommend using the probability map based on kernel estimation. This communication could, for example, be implemented via a web interface where users specify the characteristics and coordinates of their home in order to obtain the probability of exceeding a given IRC, together with a corresponding index of confidence. Given the health effects of 222Rn, our results have the potential to substantially improve the estimation of the effective dose from 222Rn delivered to the Swiss population.
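As a hedged sketch of the ensemble-tree modelling described above, the following Python code fits a random forest to synthetic IRC-like data and estimates a local exceedance probability for the 300 Bq/m3 threshold; the predictors mimic those listed in the abstract, but every value is simulated (the thesis itself used kernel estimation for the exceedance map).

```python
# Sketch of the ensemble-tree modelling on synthetic data. Real predictors
# included coordinates, altitude, temperature and building attributes; the
# values generated here are purely illustrative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier

rng = np.random.default_rng(0)
n = 2000
X = np.column_stack([
    rng.uniform(0, 100, n),       # x coordinate (km)
    rng.uniform(0, 100, n),       # y coordinate (km)
    rng.uniform(300, 2000, n),    # altitude (m)
    rng.integers(1900, 2020, n),  # year of construction
])
# Synthetic log-IRC with altitude and spatial effects plus noise.
log_irc = 4.0 + 0.001 * X[:, 2] + 0.3 * np.sin(X[:, 0] / 15) + rng.normal(0, 0.8, n)

reg = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, log_irc)
print("feature importances:", reg.feature_importances_)

# Local probability of exceeding 300 Bq/m3, here via a forest classifier
# (the thesis used kernel estimation for this step).
exceeds = (np.exp(log_irc) > 300).astype(int)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, exceeds)
print("P(IRC > 300) at a test point:", clf.predict_proba([[50, 50, 1200, 1950]])[0, 1])
```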
Abstract:
Automatic environmental monitoring networks, supported by wireless communication technologies, nowadays provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Particularly useful for risk assessment and decision making are spatial maps of hazard-related parameters produced from point observations and available auxiliary information. The purpose of this article is to present and explore appropriate tools for processing large amounts of available data and producing predictions at fine spatial scales. These are the algorithms of machine learning, which are aimed at non-parametric, robust modelling of non-linear dependencies from empirical data. The computational efficiency of these data-driven methods allows the prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation is encountered particularly in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topographies of mountainous regions, meteorological processes are highly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The methodology is illustrated with the mapping of temperatures (including Föhn and temperature-inversion situations) from measurements taken by the Swiss meteorological monitoring network. The range of methods used in the study includes data-driven feature selection, support vector algorithms and artificial neural networks.
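A minimal sketch of the topo-climatic mapping idea, assuming scikit-learn's support vector regression as the learner and entirely synthetic station data: temperatures are predicted at unmeasured locations from coordinates and DEM-derived features.

```python
# Sketch of data-driven topo-climatic mapping: predict temperature at
# unmeasured locations from DEM-derived features. Data are synthetic.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n = 500
x, y = rng.uniform(0, 200, n), rng.uniform(0, 200, n)  # station coordinates (km)
altitude = rng.uniform(300, 3000, n)                   # m, from a DEM
slope = rng.uniform(0, 40, n)                          # degrees, from a DEM

# Synthetic station temperatures: a lapse rate of roughly -6.5 degC/km plus
# noise; an inversion situation would locally reverse this relation.
temp = 15.0 - 6.5 * altitude / 1000 + rng.normal(0, 1.0, n)

X = np.column_stack([x, y, altitude, slope])
model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.2)).fit(X, temp)

# Predict at one hypothetical grid cell of the map.
print(model.predict([[100.0, 80.0, 1500.0, 12.0]]))
```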
Abstract:
PURPOSE: According to estimates, around 230 people die each year as a result of radon exposure in Switzerland. This public health concern makes reliable indoor radon prediction and mapping methods necessary in order to improve risk communication to the public. The aim of this study was to develop an automated method to classify lithological units according to their radon characteristics, and to develop mapping and predictive tools in order to improve local radon prediction. METHOD: About 240 000 indoor radon concentration (IRC) measurements in about 150 000 buildings were available for our analysis. The automated classification of lithological units was based on k-medoids clustering via pairwise Kolmogorov distances between the IRC distributions of lithological units. For IRC mapping and prediction we used random forests and Bayesian additive regression trees (BART). RESULTS: The automated classification groups lithological units well in terms of their IRC characteristics. In particular, the IRC differences in metamorphic rocks such as gneiss are well revealed by this method. The maps produced by random forests soundly represent the regional differences of IRC in Switzerland and improve the spatial detail compared to existing approaches. We could explain 33% of the variation in the IRC data with random forests. Additionally, the variable importances evaluated by random forests show that building characteristics are less important predictors of IRC than spatial/geological influences. BART could explain 29% of the IRC variability and produced maps that indicate the prediction uncertainty. CONCLUSION: Ensemble regression trees are a powerful tool for modelling and understanding the multidimensional influences on IRC. Automatic clustering of lithological units complements this method by facilitating the interpretation of the radon properties of rock types. This study provides an important element for radon risk communication. Future approaches should consider taking into account further variables, such as soil gas radon measurements, as well as more detailed geological information.
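A simplified sketch of the classification step, on synthetic data: pairwise Kolmogorov-Smirnov distances between per-unit IRC samples are computed with scipy, and k medoids are then chosen by exhaustive search (the study's actual k-medoids algorithm and data are not reproduced).

```python
# Sketch: group lithological units by pairwise Kolmogorov-Smirnov distance
# between their (synthetic) log-IRC samples, then pick k medoids exhaustively.
from itertools import combinations
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(2)
# Hypothetical units: synthetic log-IRC samples per lithology.
units = {
    "carbonates": rng.normal(5.6, 0.7, 300),
    "gneiss":     rng.normal(5.4, 0.8, 300),
    "sediments":  rng.normal(4.6, 0.6, 300),
    "molasse":    rng.normal(4.7, 0.6, 300),
}
names = list(units)
m = len(names)

# Pairwise KS distances between the units' IRC distributions.
D = np.zeros((m, m))
for i, j in combinations(range(m), 2):
    D[i, j] = D[j, i] = ks_2samp(units[names[i]], units[names[j]]).statistic

# Exhaustive k-medoid search: choose the k units minimizing the total
# distance of every unit to its nearest medoid (feasible as m is tiny).
k = 2
best = min(combinations(range(m), k),
           key=lambda meds: D[:, list(meds)].min(axis=1).sum())

labels = {names[i]: names[best[D[i, list(best)].argmin()]] for i in range(m)}
print("medoids:", [names[i] for i in best])
print("assignment:", labels)
```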
Abstract:
During the past decades, testing has matured from an ad-hoc activity into an integral part of the development process. The benefits of testing are obvious for modern communication systems, which operate in heterogeneous environments amongst devices from various manufacturers. The increased demand for testing also creates demand for tools and technologies that support and automate testing activities. This thesis discusses the applicability of visualization techniques in the result-analysis part of the testing process. In particular, the primary focus of this work is the visualization of test execution logs produced by a TTCN-3 test system. TTCN-3 is an internationally standardized test specification and implementation language. The TTCN-3 standard suite includes a specification of a test logging interface and a graphical presentation format, but no immediate relationship between them. This thesis presents a technique for mapping log events to the graphical presentation format, along with a concrete implementation that is integrated with the Eclipse Platform and the OpenTTCN Tester toolchain. The results of this work indicate that for the majority of the log events, a visual representation may be derived from the TTCN-3 standard suite. The remaining events were analysed, and three categories relevant to either log analysis or the implementation of the visualization tool were identified: events indicating the insertion of something into the incoming queue of a port, events indicating a mismatch, and events describing the control flow during execution. The applicability of the results is limited to the domain of TTCN-3, but the developed mapping and the implementation may be utilized with any TTCN-3 tool that is able to produce the execution log in the standardized XML format.
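As an illustration of the kind of mapping the thesis develops, the sketch below parses a toy execution log and assigns each event type a graphical symbol; the XML element names, attributes and symbol names are invented placeholders, not the standardized TTCN-3 logging schema.

```python
# Schematic mapping of TTCN-3 log events to graphical symbols. The XML
# elements and attributes below are invented placeholders, not the
# standardized TTCN-3 logging schema.
import xml.etree.ElementTree as ET

LOG = """<log>
  <event type="send" component="mtc" port="p1"/>
  <event type="enqueue" component="ptc1" port="p1"/>
  <event type="mismatch" component="ptc1" port="p1"/>
  <event type="verdict" component="mtc" value="fail"/>
</log>"""

# Event type -> symbol in a sequence-diagram-like view.
SYMBOLS = {
    "send": "message arrow (out)",
    "enqueue": "queue insertion marker",
    "mismatch": "mismatch highlight",
    "verdict": "verdict box",
}

for event in ET.fromstring(LOG).iter("event"):
    kind = event.get("type")
    symbol = SYMBOLS.get(kind, "plain text annotation")  # fallback for unmapped events
    print(f"{event.get('component')}: {kind} -> {symbol}")
```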
Abstract:
The ongoing global financial crisis has demonstrated the importance of a system-wide, or macroprudential, approach to safeguarding financial stability. An essential part of macroprudential oversight concerns the early identification and assessment of risks and vulnerabilities that may eventually lead to a systemic financial crisis. Such tools are crucial, as they allow early policy actions to decrease or prevent the further build-up of risks, or to otherwise enhance the shock-absorption capacity of the financial system. In the literature, three types of systemic risk can be identified: i) the build-up of widespread imbalances, ii) exogenous aggregate shocks, and iii) contagion. Accordingly, these systemic risks are matched by three categories of analytical methods for decision support: i) early-warning models, ii) macro stress-testing, and iii) contagion models. Stimulated by the prolonged global financial crisis, today's toolbox of analytical methods includes a wide range of innovative solutions to the two tasks of risk identification and risk assessment. Yet, the literature lacks a focus on the task of risk communication. This thesis discusses macroprudential oversight from the viewpoint of all three tasks: within analytical tools for risk identification and risk assessment, the focus is on a tight integration of means for risk communication. Data and dimension reduction methods, and their combinations, hold promise for representing multivariate data structures in easily understandable formats. The overall task of this thesis is to represent high-dimensional data concerning financial entities on low-dimensional displays. The low-dimensional representations have two subtasks: i) to function as a display for individual data concerning entities and their time series, and ii) to serve as a basis to which additional information can be linked. The final nuance of the task is, however, set by the needs of the domain, the data and the methods. The following five questions comprise the subsequent steps addressed in this thesis: 1. What are the needs for macroprudential oversight? 2. What form do macroprudential data take? 3. Which data and dimension reduction methods hold the most promise for the task? 4. How should the methods be extended and enhanced for the task? 5. How should the methods and their extensions be applied to the task? Based upon the Self-Organizing Map (SOM), this thesis not only creates the Self-Organizing Financial Stability Map (SOFSM), but also lays out a general framework for mapping the state of financial stability. The thesis also introduces three extensions to the standard SOM for enhancing the visualization and extraction of information: i) fuzzifications, ii) transition probabilities, and iii) network analysis. Thus, the SOFSM functions as a display for risk identification, on top of which risk assessments can be illustrated. In addition, this thesis puts forward the Self-Organizing Time Map (SOTM) to provide means for visual dynamic clustering, which in the context of macroprudential oversight concerns the identification of cross-sectional changes in risks and vulnerabilities over time. Rather than automated analysis, the aim of these visual means for identifying and assessing risks is to support disciplined and structured judgmental analysis based upon policymakers' experience and domain intelligence, as well as external risk communication.
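As a minimal sketch of the SOM-based approach, the following code trains a small map on synthetic indicator data using the third-party minisom package (a stand-in; the thesis does not prescribe this library) and projects an observation onto the 2D display.

```python
# Sketch of a SOM-based financial stability map, using the third-party
# `minisom` package as a stand-in for the thesis's SOM toolchain.
# The macro-financial indicators below are synthetic, for illustration only.
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(3)
# Rows: country-quarter observations; columns: standardized indicators
# (e.g. credit-to-GDP gap, asset-price growth, leverage), here random.
data = rng.normal(0, 1, (400, 3))

som = MiniSom(6, 6, input_len=3, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(data, 1000)

# Project one observation onto the 2D map: its best-matching unit gives
# the region of the "stability map" where that economy currently sits.
print("best-matching unit:", som.winner(data[0]))
```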
Abstract:
This study discusses the evolution of an omni-channel model in managing customer experience. The purpose of this thesis is to expand the current academic literature available on omni-channel and to offer suggestions for omni-channel creation. This is done by studying the features of an omni-channel approach to engaging with customers, and through the sub-objectives of describing the process behind its initiation as well as the special features that communication service providers need to take into consideration. The theories used as background for this study relate to customer experience, channel management, omni-channel and, finally, change management. The empirical study of this thesis consists of seven expert interviews conducted in a case company between March and November 2014. One of the interviewees was the manager of an omni-channel development team, whilst the rest were in charge of managing the company's various customer channels. The interview data were organized and analyzed topically; themes related to the major theories on the subject were used to create linkages between theory and practice. The responses were also organized into two groups in order to map responses reflecting the company perspective and the customers' perspective. The findings of this study are that omni-channel is among the best tools for companies to respond to the challenges induced by changing customer needs and preferences as well as an intensifying competitive environment. The omni-channel model was found to promote excellent customer experience, and thus to be a source of competitive advantage and increased financial returns, by creating an omni-experience for the customer. Through the omni-experience, customers see all of their transactions with a company as representing one brand that provides ease and effortlessness in every encounter. The processes behind omni-channel formulation were identified as: proclaiming customer experience the most important strategic goal; mapping and establishing a unified brand experience in all (service) channels; and empowering front-line personnel as the gatekeepers of the omni-experience. Furthermore, the tools, measurements and supporting strategies must be in accordance with the omni-channel strategy, and the customer needs to become a partner in a two-way transaction with the firm. Based on these findings, a model for omni-channel creation is offered. Future research is needed, firstly, to further test these findings and expand the theoretical framework on omni-channel, which is quite scarce to date, and secondly, to increase the generalizability of the suggested model.
Abstract:
Companies' production systems must adapt to market demands. Value Stream Mapping (VSM) is a technique developed within Lean Production and oriented towards the redesign of such production systems. Although theoretical accounts of the technique have been published, along with successful practical case studies, there is a lack of analysis exploring in depth the applicability of the technique in production environments based on disconnected flow lines. Thus, the objective of this thesis is to evaluate the applicability of VSM in such environments. The research method adopted consisted of a multiple case study of six companies. The results confirm the practical validity of VSM for the redesign of production systems. Nevertheless, aspects requiring improvement and development are also identified for the technique to become the baseline reference.
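A worked toy example of the core VSM arithmetic may help: the value-added ratio compares value-adding process time with the total lead time along the stream; all figures below are invented.

```python
# Toy example of the central VSM metric: value-adding process time versus
# total lead time along a value stream. Figures are invented.
steps = [
    # (name, process time in seconds, upstream inventory wait in seconds)
    ("cutting",  39, 2 * 8 * 3600),  # two shifts of WIP waiting upstream
    ("welding",  46, 1 * 8 * 3600),
    ("assembly", 62, 4 * 8 * 3600),
]

process_time = sum(p for _, p, _ in steps)
lead_time = sum(p + w for _, p, w in steps)

print(f"value-adding time: {process_time} s")
print(f"total lead time:   {lead_time} s")
print(f"value-added ratio: {process_time / lead_time:.4%}")
```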
Abstract:
The themes of awareness and influence within the innovation diffusion process are addressed. The innovation diffusion process is typically represented as stages, yet awareness and influence are somewhat under-represented in the literature. Awareness and influence are situated within the contextual setting of individual actors, but also within broader institutional forces. Understanding how actors become aware of an innovation, and then how their opinion is influenced, is important for creating a more innovation-active UK construction sector. Social network analysis is proposed as one technique for mapping how awareness and influence occur and what they look like as a network. Empirical data are gathered using two modes of enquiry: a pilot study consisting of chartered professionals, and a case study of an organization as it attempted to diffuse an innovation. The analysis demonstrates significant variations across actors' awareness and influence networks. It is argued that social network analysis can complement other research methods in order to present a richer picture of how actors become aware of innovations and where they draw their influences regarding adopting innovations. To summarize the findings, a framework for understanding awareness and influence associated with innovation within the UK construction sector is presented. Finally, with the UK construction sector continually being encouraged to be innovative, understanding and managing an actor's awareness and influence network will be beneficial. The overarching conclusion thus describes the need not only to build research capacity in this area but also to push the boundaries of the research methods employed.
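As an illustrative sketch of how such awareness and influence networks can be analyzed, the code below builds a small invented influence network with networkx and ranks actors by eigenvector centrality, a common proxy for opinion leadership.

```python
# Illustrative sketch of an awareness/influence network analyzed with SNA.
# Actors and ties are invented for illustration only.
import networkx as nx

G = nx.Graph()
# Each tie: "actor A draws influence from actor B" within a project coalition.
G.add_edges_from([
    ("architect", "chartered_engineer"),
    ("architect", "trade_press"),
    ("contractor", "chartered_engineer"),
    ("contractor", "professional_institution"),
    ("client", "contractor"),
])

# Opinion leaders show up as high-centrality nodes in the influence network.
for actor, score in sorted(nx.eigenvector_centrality(G).items(),
                           key=lambda kv: -kv[1]):
    print(f"{actor}: {score:.3f}")
```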