926 results for Model-Data Integration and Data Assimilation
Abstract:
Despite much anecdotal and oftentimes empirical evidence that black and ethnic minority employees do not feel integrated into organisational life, and the implications of this lack of integration for their career progression, there is a dearth of research on the nature of the relationship black and ethnic minority employees have with their employing organisations. Additionally, research examining the relationship between diversity management and work outcomes has returned mixed findings. Scholars have attributed this to the lack of an empirically validated measure of workforce diversity management. Accordingly, I sought to address these gaps in the extant literature in a two-part study grounded in social exchange theory. In Study 1, I developed and validated a measure of workforce diversity management practices. Data obtained from a sample of ethnic minority employees from a cross section of organisations provided support for the validity of the scale. In Study 2, I proposed and tested a social-exchange-based model of the relationship between black and ethnic minority employees and their employing organisations, and assessed the implications of this relationship for their work outcomes. Specifically, I hypothesised: (i) perception of support for diversity, perception of overall justice, and developmental experiences (indicators of integration into organisational life) as mediators of the relationship between diversity management and social exchange with the organisation; (ii) the moderating influence of diversity climate on the relationship between diversity management and these indicators of integration; and (iii) the work outcomes of social exchange with the organisation, defined in terms of career satisfaction, turnover intention and strain. SEM results provide support for most of the hypothesised relationships. The findings of the study contribute to the literature on workforce diversity management in a number of ways. First, the development and validation of a diversity management practice scale constitutes a first step in resolving the difficulty in operationalising and measuring the diversity management construct. Second, the study explicates how and why diversity management practices influence a social exchange relationship with an employing organisation, and the implications of this relationship for the work outcomes of black and ethnic minority employees. My study's focus on employee work outcomes is an important corrective to the predominant focus on organisational-level outcomes of diversity management. Lastly, by focusing on ethno-racial diversity my research complements the extant research on such workforce diversity indicators as age and gender.
Abstract:
Despite concerted academic interest in the strategic decision-making process (SDMP) since the 1980s, a coherent body of theory capable of guiding practice has not materialised. This is because many prior studies focus on only a single process characteristic, often rationality or comprehensiveness, and have paid insufficient attention to context. To further develop theory, research is required which examines: (i) the influence of context from multiple theoretical perspectives (e.g. upper echelons, environmental determinism); (ii) different process characteristics from both synoptic formal (e.g. rationality) and political incremental (e.g. politics) perspectives; and (iii) the effects of context and process characteristics on a range of SDMP outcomes. Using data from 30 interviews and 357 questionnaires, this thesis addresses several opportunities for theory development by testing an integrative model which incorporates: (i) five SDMP characteristics representing both synoptic formal (procedural rationality, comprehensiveness, and behavioural integration) and political incremental (intuition and political behaviour) perspectives; (ii) four SDMP outcome variables: strategic decision (SD) quality, implementation success, commitment, and SD speed; and (iii) contextual variables from four theoretical perspectives: upper echelons, SD-specific characteristics, environmental determinism, and firm characteristics. The present study makes several substantial and original contributions to knowledge. First, it provides empirical evidence of the contextual boundary conditions under which intuition and political behaviour positively influence SDMP outcomes. Second, it establishes the predominance of the upper echelons perspective, with TMT variables explaining significantly more variance in SDMP characteristics than SD-specific characteristics, the external environment, and firm characteristics. A newly developed measure of top management team expertise also demonstrates highly significant direct and indirect effects on the SDMP. Finally, it is evident that SDMP characteristics and contextual variables influence a number of SDMP outcomes, not just overall SD quality but also implementation success, commitment, and SD speed.
Abstract:
Methods and software for the integration of databases (DBs) on the properties of inorganic materials and substances have been developed. The integration of the information systems is based on a combination of known approaches: EII (Enterprise Information Integration) and EAI (Enterprise Application Integration). The kernel of the integrated system is the metabase, a special database that stores data on the contents of the integrated DBs. The proposed methods have been applied to create an integrated system of DBs in the field of inorganic chemistry and materials science. An important feature of the developed integrated system is its ability to include DBs created with different DBMSs, running on essentially different computer platforms, Sun (the DB "Diagram") and Intel (the other DBs), and on diverse operating systems, Sun Solaris (the DB "Diagram") and Microsoft Windows Server (the other DBs).
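As a rough illustration of the metabase idea described above, the sketch below assumes a hypothetical relational schema; the table names, columns and sample entries are illustrative, not those of the actual system, which runs on Sun and Intel platforms rather than in Python:

import sqlite3

# Hypothetical metabase: one table describing each integrated DB, one table
# recording which substances/properties each DB covers.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE integrated_db (
    db_id      INTEGER PRIMARY KEY,
    name       TEXT,   -- e.g. 'Diagram'
    dbms       TEXT,   -- DBMS used by the source (illustrative)
    platform   TEXT,   -- e.g. 'Sun Solaris', 'Microsoft Windows Server'
    access_url TEXT
);
CREATE TABLE db_content (
    db_id     INTEGER REFERENCES integrated_db(db_id),
    substance TEXT,    -- chemical system covered
    property  TEXT     -- property class stored for that system
);
""")

# Illustrative entries only.
conn.execute("INSERT INTO integrated_db VALUES (1, 'Diagram', 'unspecified DBMS', 'Sun Solaris', 'http://example.org/diagram')")
conn.execute("INSERT INTO db_content VALUES (1, 'Ga-As', 'phase diagram')")

def sources_for(substance, prop):
    # The integration layer consults the metabase to decide which source DBs
    # can answer a request for a given substance/property pair before the
    # query is dispatched to those sources.
    rows = conn.execute(
        "SELECT i.name, i.access_url FROM db_content c "
        "JOIN integrated_db i ON i.db_id = c.db_id "
        "WHERE c.substance = ? AND c.property = ?",
        (substance, prop),
    )
    return rows.fetchall()

print(sources_for('Ga-As', 'phase diagram'))  # [('Diagram', 'http://example.org/diagram')]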
Abstract:
This paper considers the problem of low-dimensional visualisation of very high dimensional information sources for the purpose of situation awareness in the maritime environment. In response to the requirement for human decision-support aids that reduce information overload in the below-water maritime domain (specifically, for data amenable to inter-point relative similarity measures), we are investigating a preliminary prototype topographic visualisation model. The focus of the current paper is on the mathematical problem of exploiting a relative dissimilarity representation of signals in a visual informatics mapping model, driven by real-world sonar systems. A realistic noise model is explored and incorporated into non-linear and topographic visualisation algorithms building on the approach of [9]. Concepts are illustrated using a real-world dataset of 32 hydrophones monitoring a shallow-water environment in which targets are present and dynamic.
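As a hedged illustration of the kind of mapping involved (standard multidimensional scaling on a precomputed dissimilarity matrix, not the specific noise-aware topographic algorithm of the paper), the sketch below embeds pairwise signal dissimilarities into two dimensions; all data are synthetic stand-ins:

import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)

# Synthetic stand-in for pairwise dissimilarities between 32 hydrophone
# signals; in the paper these come from sonar data and a realistic noise model.
n = 32
signals = rng.normal(size=(n, 128))
diss = np.linalg.norm(signals[:, None, :] - signals[None, :, :], axis=-1)

# Non-linear embedding of the precomputed dissimilarities into 2-D for visual
# situation awareness; the topographic models discussed above replace this
# step with a latent-variable mapping that accounts for noise in the
# dissimilarities.
embedding = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = embedding.fit_transform(diss)
print(coords.shape)  # (32, 2)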
Abstract:
How are the statistics of global image contrast computed? We answered this using a contrast-matching task for checkerboard configurations of ‘battenberg’ micro-patterns, in which the contrasts and spatial spreads of interdigitated pairs of micro-patterns were adjusted independently. Test stimuli were 20 × 20 element arrays with various cluster widths, matched to standard patterns of uniform contrast. When one of the two interdigitated patterns had a much higher contrast than the other, it alone determined global pattern contrast, as in a max() operation. Crucially, however, the full matching functions had a curious intermediate region in which low-contrast additions of one pattern to intermediate contrasts of the other caused a paradoxical reduction in perceived global contrast. None of the following models predicted this: RMS, energy, linear sum, max, Legge and Foley. However, a gain-control model incorporating wide-field integration and suppression of nonlinear contrast responses predicted the results with no free parameters. This model was derived from experiments on summation of contrast at threshold and on masking and summation effects in dipper functions; those experiments were also inconsistent with the failed models above. Thus, we conclude that our contrast gain-control model (Meese & Summers, 2007) describes a fundamental operation in human contrast vision.
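As a schematic aid only (a generic contrast gain-control form with symbols chosen here for exposition, not the exact parameterisation of Meese & Summers, 2007), models of this family pool nonlinear contrast responses over space and divide by a wide-field suppressive pool:

R = \frac{\sum_i c_i^{\,p}}{z + \sum_j c_j^{\,q}}, \qquad p > q

where the c_i are local micro-pattern contrasts and z is a saturation constant. On this reading, adding low-contrast elements raises the suppressive denominator more than the excitatory numerator, which is one way such a model can produce the paradoxical reduction in perceived global contrast reported above.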
Abstract:
Using plant-level data from a global survey conducted in multiple time frames, the first begun in the late 1990s, this paper introduces measures of supply chain integration (SCI) and discusses the dynamic relationship between the level of integration and a set of internal and external performance measures. Specifically, data from Hungary, the Netherlands and the People's Republic of China are used in the analyses. The time frames considered range from the late 1990s to 2009, encompassing major changes and transitions. Our results seem to indicate that SCI has an underlying structure of four sets of indicators, namely: (1) delivery frequency from the supplier or to the customer; (2) sharing internal processes with suppliers; (3) sharing internal processes with buyers; and (4) joint facility location with partners. The differences between groups in terms of several performance measures proved to be small and mostly statistically insignificant; nevertheless, the ANOVA results suggest that, in this sample of companies, those having a joint location with their partners seem to outperform the others.
Abstract:
The mediator software architecture was designed to provide data integration and retrieval in distributed, heterogeneous environments. Since the initial conceptualization of this architecture, many new technologies have emerged that can facilitate its implementation. The purpose of this thesis was to show that a mediator framework supporting users of mobile devices could be implemented using common software technologies available today. In addition, the prototype was developed with a view to providing a better understanding of what a mediator is and to exposing issues that will have to be addressed in fuller, more robust designs. The prototype developed for this thesis was implemented using technologies including Java, XML, and the Simple Object Access Protocol (SOAP); SOAP was used for inter-process communication. In the end, it is expected that more data-intensive software applications will be possible in a world with ever-increasing demands for information.
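A minimal sketch of the mediator idea under discussion, assuming hypothetical wrapper and class names (the thesis prototype itself used Java, XML and SOAP rather than the Python below):

from dataclasses import dataclass
from typing import Protocol

class Source(Protocol):
    """Wrapper that translates a mediated query into a source-specific one."""
    def query(self, request: dict) -> list[dict]: ...

@dataclass
class Mediator:
    sources: list[Source]

    def query(self, request: dict) -> list[dict]:
        # Fan the request out to each heterogeneous source wrapper and merge
        # the per-source results into a single integrated answer.
        results: list[dict] = []
        for source in self.sources:
            results.extend(source.query(request))
        return results

# In the prototype described above, each wrapper would marshal the request as
# XML and invoke the remote source over SOAP, so that a mobile client only
# ever talks to the mediator's single query interface.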
Abstract:
In essay 1 we develop a new autoregressive conditional process to capture both the changes and the persistence of the intraday seasonal (U-shape) pattern of volatility. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant. Specification tests using the correlogram and cross-spectral analyses confirm the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information arrival component and a noise factor component. This decomposition differs from previous studies in that both the informational variance and the noise variance are time-varying. Furthermore, the covariance of the informational component and the noisy component is no longer restricted to be zero. The resultant measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and price changes caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure in the first essay. The resultant seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
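In symbols (notation chosen here for exposition; the essays define their own), the essay-2 decomposition described above allows both components to be time-varying and their covariance to be non-zero, with informativeness defined as the informational share of total return variance:

\mathrm{Var}(r_t) = \sigma_{I,t}^{2} + \sigma_{N,t}^{2} + 2\,\sigma_{IN,t},
\qquad \mathrm{Informativeness}_t = \frac{\sigma_{I,t}^{2}}{\mathrm{Var}(r_t)}

where \sigma_{I,t}^{2} is the variance attributed to information arrivals, \sigma_{N,t}^{2} is the noise variance, and \sigma_{IN,t} is their (unrestricted) covariance.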
Abstract:
In this thesis, research on tsunami remote sensing using Global Navigation Satellite System-Reflectometry (GNSS-R) delay-Doppler maps (DDMs) is presented. Firstly, a process for simulating GNSS-R DDMs of a tsunami-dominated sea surface is described. In this method, the bistatic scattering Zavorotny-Voronovich (Z-V) model, the sea surface mean square slope model of Cox and Munk, and the tsunami-induced wind perturbation model are employed. The feasibility of the Cox and Munk model under a tsunami scenario is examined by comparing the Cox and Munk model-based scattering coefficient with the Jason-1 measurement. A good consistency between these two results is obtained, with a correlation coefficient of 0.93. After confirming the applicability of the Cox and Munk model for a tsunami-dominated sea, this work provides simulations of the scattering coefficient distribution and the corresponding DDMs of a fixed region of interest before and during the tsunami. Furthermore, by subtracting the simulation results that are free of tsunami from those with the tsunami present, the tsunami-induced variations in scattering coefficients and DDMs can be clearly observed. Secondly, a scheme to detect tsunamis and estimate tsunami parameters from such tsunami-dominant sea surface DDMs is developed. As a first step, a procedure to determine tsunami-induced sea surface height anomalies (SSHAs) from DDMs is demonstrated and a tsunami detection precept is proposed. Subsequently, the tsunami parameters (wave amplitude, direction and speed of propagation, wavelength, and the tsunami source location) are estimated based upon the detected tsunami-induced SSHAs. In application, the sea surface scattering coefficients are unambiguously retrieved by employing the spatial integration approach (SIA) and the dual-antenna technique. Next, the effective wind speed distribution can be restored from the scattering coefficients. Assuming all DDMs are of a tsunami-dominated sea surface, the tsunami-induced SSHAs can be derived with knowledge of the background wind speed distribution. In addition, the SSHA distribution resulting from the tsunami-free DDM (which is supposed to be zero) is treated as an error map introduced during the overall retrieval stage and is utilized to prevent such errors from influencing subsequent SSHA results. In particular, a tsunami detection procedure is conducted to judge whether the SSHAs are truly tsunami-induced through a fitting process, which makes it possible to decrease the false alarm rate. After this step, tsunami parameter estimation proceeds based upon the fitted results of the tsunami detection procedure. Moreover, an additional method is proposed for estimating tsunami propagation velocity and is believed to be more desirable in real-world scenarios. The above-mentioned tsunami-dominated sea surface DDM simulation, tsunami detection precept and parameter estimation have been tested with simulated data based on the 2004 Sumatra-Andaman tsunami event.
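For orientation, a widely quoted clean-surface form of the Cox and Munk result (not necessarily the exact parameterisation used in the thesis) relates the total mean square slope linearly to the wind speed U (in m/s, measured at 12.5 m height):

\sigma_{\mathrm{mss}}^{2} \approx 0.003 + 5.12\times 10^{-3}\,U

so a tsunami-induced perturbation of the near-surface wind maps into a change in mean square slope, and hence in the bistatic scattering coefficient that the Z-V model turns into simulated DDMs.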
Abstract:
We compare a compilation of 220 sediment core δ13C data from the glacial Atlantic Ocean with three-dimensional ocean circulation simulations including a marine carbon cycle model. The carbon cycle model employs circulation fields which were derived from previous climate simulations. All sediment data have been thoroughly quality controlled, focusing on epibenthic foraminiferal species (such as Cibicidoides wuellerstorfi or Planulina ariminensis) to improve the comparability of model and sediment core carbon isotopes. The model captures the general δ13C pattern indicated by present-day water column data and Late Holocene sediment cores but underestimates intermediate and deep water values in the South Atlantic. The best agreement with glacial reconstructions is obtained for a model scenario with an altered freshwater balance in the Southern Ocean that mimics enhanced northward sea ice export and melting away from the zone of sea ice production. This results in a shoaled and weakened North Atlantic Deep Water flow and intensified Antarctic Bottom Water export, hence confirming previous reconstructions from paleoproxy records. Moreover, the modeled abyssal ocean is very cold and very saline, which is in line with other proxy data evidence.
Abstract:
Research on the perception of temporal order uses either temporal-order judgment (TOJ) tasks or synchrony judgment (SJ) tasks, in both of which two stimuli are presented with some temporal delay and observers must judge their temporal relation (order of presentation in TOJ tasks; simultaneity in SJ tasks). Results generally differ across tasks, raising concerns about whether they measure the same processes. We present a model including sensory and decisional parameters that places these tasks in a common framework and allows studying their implications for observed performance. TOJ tasks imply specific decisional components that explain the discrepancy of results obtained with TOJ and SJ tasks. The model is also tested against published data on audiovisual temporal-order judgments, and the fit is satisfactory, although model parameters are more accurately estimated with SJ tasks. Measures of latent point of subjective simultaneity and latent sensitivity are defined that are invariant across tasks, by isolating the sensory parameters governing observed performance, whereas decisional parameters vary across tasks and account for the observed differences between them. Our analyses concur with other evidence advising against the use of TOJ tasks in research on the perception of temporal order.
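Schematically (a generic decision-rule sketch for orientation, not the authors' exact formulation), both tasks can be driven by the same latent arrival-time difference D between the two stimuli, with task-specific decisional components:

\text{SJ: judge "synchronous" iff } -\delta < D < \delta,
\qquad \text{TOJ: judge "A first" iff } D < \tau

where the distribution of D carries the sensory parameters (and thus the latent point of subjective simultaneity and sensitivity), while resolution limits such as \delta, criteria such as \tau, and response errors are decisional parameters that may differ between tasks.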
Abstract:
This paper explores the dynamics of inter-sectoral technological integration by introducing the concept of bridging platform as a node of pervasive technologies, whose collective broad applicability may enhance the connection between ‘distant’ knowledge by offering a technological coupling. Using data on patents obtained from the CRIOS-PATSTAT database for four EU countries (Germany, UK, France and Italy), we provide empirical evidence that bridging platforms are likely to connect more effectively innovations across distant technological domains, fostering inter-sectoral technological integration and the development of original innovation. Public research organisations are also found to play a crucial role in terms of technological integration and original innovation due to their higher capacity to access and use bridging platforms within their innovation activities.
Abstract:
Transient simulations are widely used in studying past climate as they provide a better comparison with any existing proxy data. However, multi-millennial transient simulations using coupled climate models are usually computationally very expensive. As a result, several acceleration techniques are implemented when using numerical simulations to recreate past climate. In this study, we compare the results from transient simulations of the present and the last interglacial, with and without acceleration of the orbital forcing, using the comprehensive coupled climate model CCSM3 (Community Climate System Model 3). Our study shows that in low-latitude regions, the simulation of long-term variations in interglacial surface climate is not significantly affected by the use of the acceleration technique (with an acceleration factor of 10); hence, large-scale model-data comparison of surface variables is not hampered. However, in high-latitude regions where the surface climate has a direct connection to the deep ocean, e.g. in the Southern Ocean or the Nordic Seas, acceleration-induced biases in sea-surface temperature evolution may occur, with potential influence on the dynamics of the overlying atmosphere. The data provided here are from both accelerated and non-accelerated runs as decadal mean values.
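As a sketch of the acceleration idea only (the mapping below is illustrative and the start year is a hypothetical placeholder, not a value taken from the study), with an acceleration factor of 10 each simulated model year is forced with orbital parameters that advance by ten calendar years:

ACCELERATION_FACTOR = 10      # factor used in the study described above
START_YEAR_BP = 130_000       # hypothetical start of the forcing period (years before present)

def orbital_forcing_year(model_year: int,
                         accel: int = ACCELERATION_FACTOR,
                         start_bp: int = START_YEAR_BP) -> int:
    """Calendar year (BP) whose orbital parameters force a given model year.

    With accel = 10, the orbital forcing sweeps ten calendar years per model
    year, so slow components such as the deep ocean cannot keep pace; this is
    the source of the high-latitude biases discussed above.
    """
    return start_bp - accel * model_year

print(orbital_forcing_year(0), orbital_forcing_year(1000))  # 130000 120000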
Abstract:
Decision support systems have been widely used for years in companies to gain insights from internal data and thus make successful decisions. Lately, thanks to the increasing availability of open data, these systems are also integrating open data to enrich the decision-making process with external data. On the other hand, within an open-data scenario, decision support systems can also be useful for deciding which data should be opened, considering not only technical or legal constraints but also other requirements, such as the "reusing potential" of the data. In this talk, we focus on both issues: (i) open data for decision making, and (ii) decision making for opening data. We will first briefly comment on some research problems regarding the use of open data for decision making. Then, we will give an outline of a novel decision-making approach (based on how open data is actually being used in open-source projects hosted on GitHub) for supporting open data publication. Bio of the speaker: Jose-Norberto Mazón holds a PhD from the University of Alicante (Spain). He is head of the "Cátedra Telefónica" on Big Data and coordinator of the Computing degree at the University of Alicante. He is also a member of the WaKe research group at the University of Alicante. His research work focuses on open data management, data integration and business intelligence within "big data" scenarios, and their application to the tourism domain (smart tourism destinations). He has published his research in international journals such as Decision Support Systems, Information Sciences, Data & Knowledge Engineering and ACM Transactions on the Web. Finally, he is involved in the open data project at the University of Alicante, including its open data portal at http://datos.ua.es
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08