Abstract:
Human Resource (HR) systems and practices generally referred to as High Performance Work Practices (HPWPs) (Huselid, 1995) (sometimes termed High Commitment Work Practices or High Involvement Work Practices) have attracted much research attention in recent decades. Although many conceptualizations of the construct have been proposed, there is general agreement that HPWPs encompass a bundle or set of HR practices including sophisticated staffing, intensive training and development, incentive-based compensation, performance management, initiatives aimed at increasing employee participation and involvement, job safety and security, and work design (e.g. Pfeffer, 1998). It is argued that these practices directly and indirectly influence the extent to which employees' knowledge, skills, abilities, and other characteristics are utilized in the organization. Research spanning nearly 20 years has provided considerable empirical evidence for relationships between HPWPs and various measures of performance including increased productivity, improved customer service, and reduced turnover (e.g. Guthrie, 2001; Belt & Giles, 2009). With the exception of a few papers (e.g., Laursen & Foss, 2003), this literature appears to lack focus on how HPWPs influence or foster innovation-related attitudes and behaviours, extra-role behaviours, and performance. This situation exists despite the vast evidence demonstrating the importance of innovation, proactivity, and creativity in their various forms to individual, group, and organizational performance outcomes. Several pertinent issues arise when considering HPWPs and their relationship to innovation and performance outcomes. At a broad level is the issue of which HPWPs are related to which innovation-related variables. Another issue not well identified in research relates to employees' perceptions of HPWPs: does an employee actually perceive the HPWP-outcomes relationship? No matter how well HPWPs are designed, if they are not perceived and experienced by employees to be effective or worthwhile, then their likely success in achieving positive outcomes is limited. At another level, research needs to consider the mechanisms through which HPWPs influence innovation and performance. The research question here relates to what mediating variables are important to the success or failure of HPWPs in impacting innovative behaviours and attitudes, and what the potential process considerations are. These questions call for theory refinement and the development of more comprehensive models of the HPWP-innovation/performance relationship that include intermediate linkages and boundary conditions (Ferris, Hochwarter, Buckley, Harrell-Cook, & Frink, 1999). While there are many calls for this type of research to be made a high priority, to date researchers have made few inroads into answering these questions. This symposium brings together researchers from Australia, Europe, Asia and Africa to examine these various questions relating to the HPWP-innovation-performance relationship. Each paper discusses an HPWP and potential variables that can facilitate or hinder the effects of these practices on innovation- and performance-related outcomes. The first paper by Johnston and Becker explores HPWPs in relation to work design in a disaster response organization that shifts quickly from business as usual to rapid response. The researchers examine how the enactment of the organizational response is devolved to groups and individuals.
Moreover, they assess motivational characteristics that exist in dual work designs (normal operations and periods of disaster activation) and the implications for innovation. The second paper by Jørgensen reports the results of an investigation into training and development practices and innovative work behaviors (IWBs) in Danish organizations. Research on how to design and implement training and development initiatives to support IWBs and innovation in general is surprisingly scant and often vague. This research investigates the mechanisms by which training and development initiatives influence employee behaviors associated with innovation, and provides insights into how training and development can be used effectively by firms to attract and retain valuable human capital in knowledge-intensive firms. The next two papers in this symposium consider the role of employee perceptions of HPWPs and their relationships to innovation-related variables and performance. First, Bish and Newton examine perceptions of the characteristics and awareness of occupational health and safety (OHS) practices and their relationship to individual-level adaptability and proactivity in an Australian public service organization. The authors explore the role of perceived supportive and visionary leadership and its impact on the OHS policy-adaptability/proactivity relationship. The study highlights the positive main effects of awareness and characteristics of OHS policies, and of supportive and visionary leadership, on individual adaptability and proactivity. It also highlights the important moderating effects of leadership in the OHS policy-adaptability/proactivity relationship. Okhawere and Davis present a conceptual model developed for a Nigerian study in the safety-critical oil and gas industry that takes a multi-level approach to the HPWP-safety relationship. Adopting a social exchange perspective, they propose that at the organizational level, organizational climate for safety mediates the relationship between enacted HPWPs and organizational safety performance (prescribed and extra-role performance). At the individual level, the experience of HPWPs impacts individual behaviors and attitudes in organizations, here operationalized as safety knowledge, skills and motivation, and these influence individual safety performance. However, these latter relationships are moderated by organizational climate for safety. A positive organizational climate for safety strengthens the relationship between individual safety behaviors and attitudes and individual-level safety performance, therefore suggesting a cross-level boundary condition. The model includes both safety performance (behaviors) and organizational-level safety outcomes, operationalized as accidents, injuries, and fatalities. The final paper of this symposium by Zhang and Liu explores leader development and the relationship between transformational leadership and employee creativity and innovation in China. The authors further develop a model that incorporates the effects of extrinsic motivation (pay for performance: PFP) and employee collectivism in the leader-employee creativity relationship. The paper's contributions include the incorporation of a PFP effect on creativity as a moderator, rather than as a predictor as in most studies; the exploration of the PFP effect from both fairness and strength perspectives; and the advancement of knowledge on the impact of collectivism on the leader-employee creativity link.
Finally, this is the first study to examine three-way interaction effects among leader-member exchange (LMX), PFP, and collectivism, thus enriching our understanding of how to promote employee creativity. In conclusion, this symposium draws upon the findings of four empirical studies and one conceptual study to provide insight into how different variables facilitate or potentially hinder the influence of various HPWPs on innovation and performance. We will propose a number of questions for further consideration and discussion. The symposium will address the Conference Theme of 'Capitalism in Question' by highlighting how HPWPs can promote the financial health and performance of organizations while maintaining a high level of regard and respect for employees and organizational stakeholders. Furthermore, the focus on different countries and cultures explores the overall research question in relation to different modes or stages of development of capitalism.
Abstract:
In this rejoinder, we provide a response to the three commentaries written by Diamantopoulos, Howell, and Rigdon (all this issue) on our paper "The MIMIC Model and Formative Variables: Problems and Solutions" (also this issue). We contrast the approach taken in the latter paper (where we focus on clarifying the assumptions required to reject the formative MIMIC model) with a discussion of what assumptions would be necessary to accept the use of the formative MIMIC model as a viable approach. Importantly, we clarify the implications of entity realism and show how it is entirely logical that some theoretical constructs can be considered to have real existence independent of their indicators, and some cannot. We show how the formative model only logically holds when considering these 'unreal' entities. In doing so, we provide important counter-arguments to much of the criticism made in Diamantopoulos' commentary, and the distinction also helps clarify a number of issues in the commentaries of Howell and Rigdon (both of which generally agree with our original paper). We draw together these various threads to provide a set of conceptual tools researchers can use when thinking about the entities in their theoretical models.
Abstract:
In recent decades, a number of sustainable strategies and policies have been created to protect and preserve our water environments from the impacts of growing communities. The Australian approach, Water Sensitive Urban Design (WSUD), defined as the integration of urban planning and design with urban water cycle management, has made considerable advances in design guidelines since 2000. WSUD stormwater management systems (e.g. wetlands, bioretention systems, porous pavements), also known as Best Management Practices (BMPs) or Low Impact Development (LID), are slowly gaining popularity across Australia, the USA and Europe. There have also been significant improvements in how to model the performance of WSUD technologies (e.g. the MUSIC software). However, the implementation of these WSUD practices is still constrained mainly by issues of institutional capacity. Some of the key problems are associated with the limited awareness of urban planners and designers; in general, they have very little knowledge of these systems and their benefits to urban environments. At the same time, hydrological engineers need a better understanding of building codes and master plans. Land use regulations are equally as important as physical site conditions in determining opportunities and constraints for implementing WSUD techniques. There is a need for procedures that better link urban planners with WSUD engineering practices. Thus, this paper presents the development of a general framework for incorporating WSUD technologies into the site planning process. The study was applied at the lot scale in the Melbourne region, Australia. Results show the potential space available for fitting WSUD elements, according to building requirements and different types of housing densities. © 2011 WIT Press.
Abstract:
In this paper a full analytic model for pause intensity (PI), a no-reference metric for video quality assessment, is presented. The model is built upon the video playout buffer behavior at the client side and also encompasses the characteristics of a TCP network. Video streaming via TCP produces impairments in play continuity, which are not typically reflected in current objective metrics such as PSNR and SSIM. Recently the buffer underrun frequency/probability has been used to characterize the buffer behavior and as a measurement for performance optimization. But we show, using subjective testing, that underrun frequency cannot reflect viewers' quality of experience for TCP-based streaming. We also demonstrate that PI is a comprehensive metric made up of a combination of phenomena observed in the playout buffer. The analytical model in this work is verified with simulations carried out in ns-2, showing that the two results are closely matched. The effectiveness of the PI metric has also been confirmed by subjective testing on a range of video clips, where PI values exhibit a good correlation with viewers' opinion scores. © 2012 IEEE.
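The abstract does not spell out how PI aggregates pause events; the sketch below is a hedged illustration only, assuming a score that grows with both the total stalled time and the number of stalls over a playback session. The function name and weighting are assumptions, not the paper's analytic formulation.

```python
# Hypothetical pause-intensity-style metric for a playout buffer trace.
# The weighting of stall count vs. stall duration is an assumption for
# illustration; it is not the paper's exact formulation.

def pause_intensity(pauses, session_length_s):
    """pauses: list of (start_s, duration_s); returns a score in [0, 1]."""
    if session_length_s <= 0:
        raise ValueError("session length must be positive")
    stalled = sum(duration for _, duration in pauses)
    duration_term = stalled / session_length_s        # fraction of time stalled
    frequency_term = len(pauses) / session_length_s   # stalls per second
    # Many short stalls can be as disruptive as one long stall, so the
    # duration term is scaled up by the stall frequency.
    return min(1.0, duration_term * (1.0 + frequency_term))

# Example: three stalls totalling 6 s in a 120 s session
print(pause_intensity([(10, 2), (45, 3), (90, 1)], 120))  # 0.05125
```

Under this sketch, two sessions with the same total stalled time but different stall counts receive different scores, which is the property that distinguishes PI-style metrics from a bare underrun frequency.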
Abstract:
Starting from the database of Operophtera brumata L. records collected between 1973 and 2000 by the Light Trap Network in Hungary, we introduce a simple theta-logistic population dynamical model based on endogenous and exogenous factors only. We create an indicator set from which we can choose the elements that most effectively improve the fitting results. Then we extend the basic simple model with additive climatic factors. The parameter optimization is based on minimizing the root mean square error. The best model is chosen according to the Akaike Information Criterion. Finally, we run the calibrated extended model with daily outputs of the regional climate model RegCM3.1, taking 1961-1990 as the reference period and 2021-2050 and 2071-2100 as future prediction periods. The results for the three time intervals are fitted with Beta distributions and compared statistically. The expected changes are discussed.
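As a rough illustration of the fitting pipeline described above (not the authors' code), the following sketch fits a theta-logistic map to a toy abundance series by minimizing RMSE and computes a least-squares AIC; the data, starting values, and parameter names are placeholders.

```python
# Hedged sketch: theta-logistic map fitted by minimising RMSE, with AIC
# for model selection. All numbers below are illustrative, not the study's.
import numpy as np
from scipy.optimize import minimize

def theta_logistic(n0, r, K, theta, steps):
    """Iterate N[t+1] = N[t] * exp(r * (1 - (N[t]/K)**theta))."""
    n = np.empty(steps)
    n[0] = n0
    for t in range(steps - 1):
        n[t + 1] = n[t] * np.exp(r * (1.0 - (n[t] / K) ** theta))
    return n

def rmse(params, observed):
    r, K, theta = params
    if K <= 0 or theta <= 0:          # keep the search in a sane region
        return np.inf
    predicted = theta_logistic(observed[0], r, K, theta, len(observed))
    return np.sqrt(np.mean((predicted - observed) ** 2))

observed = np.array([120., 150., 210., 260., 240., 250., 230., 260.])  # toy counts
fit = minimize(rmse, x0=[0.5, 250.0, 1.0], args=(observed,), method="Nelder-Mead")
r_hat, K_hat, theta_hat = fit.x
# AIC for a least-squares fit (up to an additive constant): n*ln(RSS/n) + 2k
n, k = len(observed), 3
rss = n * fit.fun ** 2
aic = n * np.log(rss / n) + 2 * k
print(r_hat, K_hat, theta_hat, aic)
```

Extending the basic model with additive climatic factors would, under this sketch, add terms (and parameters k) inside the exponent, with AIC arbitrating whether the extra parameters pay for themselves.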
Abstract:
The search-experience-credence framework from the economics of information, the human-environment relations models from environmental psychology, and the consumer evaluation process from services marketing provide a conceptual basis for testing the model of "Pre-purchase Information Utilization in Service Physical Environments." The model addresses the effects of informational signs, as a dimension of the service physical environment, on consumers' perceptions (perceived veracity and perceived performance risk), emotions (pleasure) and behavior (willingness to buy). The informational signs provide attribute quality information (search and experience) through non-personal sources of information (simulated word-of-mouth and non-personal advocate sources). This dissertation examines: (1) the hypothesized relationships addressed in the model of "Pre-purchase Information Utilization in Service Physical Environments" among informational signs, perceived veracity, perceived performance risk, pleasure, and willingness to buy, and (2) the effects of attribute quality information and sources of information on consumers' perceived veracity and perceived performance risk. This research is the first in-depth study of the role and effects of information in service physical environments. Using a 2 x 2 between-subjects experimental procedure, undergraduate students were exposed to informational signs in a simulated service physical environment. The service physical environments were simulated through color photographic slides. The results of the study suggest that: (1) the relationship between informational signs and willingness to buy is mediated by perceived veracity, perceived performance risk and pleasure, (2) experience attribute information shows higher perceived veracity and lower perceived performance risk when compared to search attribute information, and (3) information provided through simulated word-of-mouth shows higher perceived veracity and lower perceived performance risk when compared to information provided through non-personal advocate sources.
Abstract:
The objective of this study was to develop a model to predict the transport and fate of gasoline components of environmental concern in the Miami River by mathematically simulating the movement of dissolved benzene, toluene, xylene (BTX), and methyl tertiary-butyl ether (MTBE) originating from minor gasoline spills in the inter-tidal zone of the river. Computer codes were based on mathematical algorithms that acknowledge the role of advective and dispersive physical phenomena along the river and the prevailing phase transformations of BTX and MTBE. Phase transformations included volatilization and settling. The model used a finite-difference scheme under steady-state conditions, with a set of numerical equations that was solved by two numerical methods: Gauss-Seidel and Jacobi iterations. A numerical validation process was conducted by comparing the results from both methods with analytical and numerical reference solutions. Since similar trends were achieved after the numerical validation process, it was concluded that the computer codes were algorithmically correct. The Gauss-Seidel iteration yielded a faster convergence rate than the Jacobi iteration, and was therefore selected to further develop the computer program and software. The model was then analyzed for its sensitivity. It was found to be very sensitive to wind speed but not to sediment settling velocity. Computer software was developed with the model code embedded. The software provides two major user-friendly visual forms, one to interface with the database files and the other to execute and present the graphical and tabulated results. For all predicted concentrations of BTX and MTBE, the maximum concentrations were over an order of magnitude lower than current drinking water standards. It should be pointed out, however, that concentrations below these standards, although not harmful to humans, may be very harmful to organisms at various trophic levels of the Miami River ecosystem and associated waters. This computer model can be used for the rapid assessment and management of the effects of minor gasoline spills on inter-tidal riverine water quality.
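To make the solver comparison concrete, here is a minimal sketch (not the study's code) applying Jacobi and Gauss-Seidel sweeps to a 1-D steady advection-dispersion equation discretized with central differences; the grid size, coefficients, and boundary values are illustrative assumptions.

```python
# Hedged sketch: Jacobi vs. Gauss-Seidel on u*dC/dx = D*d2C/dx2 with
# fixed boundary concentrations, discretized by central differences.
import numpy as np

def solve(method, n=50, u=0.5, D=1.0, dx=1.0, c0=1.0, cn=0.0,
          tol=1e-8, max_iter=20000):
    # Central differences give a*C[i-1] + b*C[i] + c*C[i+1] = 0
    a = D / dx**2 + u / (2 * dx)
    b = -2 * D / dx**2
    c = D / dx**2 - u / (2 * dx)
    C = np.zeros(n)
    C[0], C[-1] = c0, cn
    for it in range(max_iter):
        old = C.copy()
        src = old if method == "jacobi" else C  # Gauss-Seidel reuses fresh values
        for i in range(1, n - 1):
            C[i] = -(a * src[i - 1] + c * src[i + 1]) / b
        if np.max(np.abs(C - old)) < tol:
            return C, it
    return C, max_iter

for m in ("jacobi", "gauss-seidel"):
    _, iters = solve(m)
    print(m, "converged in", iters, "sweeps")
```

The only difference between the two methods is whether each sweep reads the previous iterate (Jacobi) or the values already updated in place (Gauss-Seidel); the latter typically converges in roughly half the sweeps here, consistent with the study's observation.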
Abstract:
The main objective of this work is to develop a quasi three-dimensional numerical model to simulate stony debris flows, considering a continuum fluid phase composed of water and fine sediments, and a non-continuum phase including large particles such as pebbles and boulders. Large particles are treated in a Lagrangian frame of reference using the Discrete Element Method, while the fluid phase is based on the Eulerian approach, using the Finite Element Method to solve the depth-averaged Navier-Stokes equations in two horizontal dimensions. The particles' equations of motion are in three dimensions. The model simulates particle-particle and wall-particle collisions, taking into account that the particles are immersed in a fluid. Bingham and Cross rheological models are used for the continuum phase. Both formulations provide very stable results, even in the range of very low shear rates, and the Bingham formulation is better able to simulate the stopping stage of the fluid when the applied shear stresses are low. Results of numerical simulations have been compared with data from laboratory experiments on a flume-fan prototype. Results show that the model is capable of simulating the motion of large particles moving in the fluid flow, handling dense particulate flows and avoiding overlap among particles. An application to simulate the debris flow events that occurred in Northern Venezuela in 1999 shows that the model could replicate the main boulder accumulation areas that were surveyed by the USGS. The uniqueness of this research lies in the integration of mud flow and stony debris movement in a single modeling tool that can be used for planning and management of debris flow prone areas.
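For reference, a brief sketch of the two rheological closures named above, written as apparent viscosity versus shear rate. The parameter values are illustrative assumptions, and a regularized (Papanastasiou-type) Bingham form is used so the apparent viscosity stays finite as the shear rate approaches zero, consistent with the stability at very low shear rates reported above.

```python
# Hedged sketch of Bingham (regularised) and Cross rheologies; parameters
# are illustrative, not taken from the paper.
import numpy as np

def bingham_papanastasiou(gamma_dot, tau_y=50.0, mu_p=0.1, m=100.0):
    """Apparent viscosity mu_p + tau_y*(1 - exp(-m*g))/g; finite as g -> 0."""
    g = np.maximum(gamma_dot, 1e-12)
    return mu_p + tau_y * (1.0 - np.exp(-m * g)) / g

def cross(gamma_dot, mu_0=1000.0, mu_inf=0.1, K=1.0, n=1.0):
    """Cross model: viscosity falls from mu_0 to mu_inf as shear rate grows."""
    return mu_inf + (mu_0 - mu_inf) / (1.0 + (K * gamma_dot) ** n)

for g in np.logspace(-4, 2, 7):  # shear rates from 1e-4 to 1e2 s^-1
    print(f"{g:9.4f}  bingham={bingham_papanastasiou(g):10.2f}  "
          f"cross={cross(g):10.2f}")
```

The very high apparent viscosity of the Bingham form at low shear rates is what lets such a closure mimic the stopping stage of the flow when applied stresses fall below the yield stress.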
Abstract:
I conducted this study to provide insights toward a deeper understanding of the association between culture and writing by building, assessing, and refining a conceptual model of second language writing. To do this, I examined culture and coherence, as well as the relationship between them, through a mixed methods research design. Coherence has been an important and complex concept in ESL/EFL writing. I studied the concept of coherence in the research context of contrastive rhetoric, comparing the coherence quality of argumentative essays written by undergraduates in Mainland China and their U.S. peers. In order to analyze the complex concept of coherence, I synthesized five linguistic theories of coherence: Halliday and Hasan's cohesion theory, Carroll's theory of coherence, Enkvist's theory of coherence, Topical Structure Analysis, and Toulmin's Model. Based upon the synthesis, 16 variables were generated. Across these 16 variables, a Hotelling's T² test was conducted to assess differences in argumentative coherence between essays written by the two groups of participants. To complement the statistical analysis, I conducted 30 interviews with the writers in the studies. Participants' responses were analyzed with open and axial coding. By analyzing the empirical data, I refined the conceptual model, adding more categories and establishing associations among them. The study found that U.S. students made use of more pronominal reference, while Chinese students adopted more lexical devices of reiteration and extended paralleling progression. The interview data implied that this difference may be associated with differences in linguistic features and rhetorical conventions between Chinese and English. As far as Toulmin's Model is concerned, Chinese students scored higher on data than their U.S. peers. According to the interview data, this may be because Toulmin's Model, modified as three elements of argument, has long and widely been taught in Chinese writing instruction, while U.S. interview participants said that they were not taught to write essays according to Toulmin's Model. Implications were generated from the process of textual data analysis and the formulation of a structural model defining coherence, aimed at informing writing instruction, assessment, peer review, and self-revision.
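For readers unfamiliar with the statistic, below is a hedged sketch of a two-sample Hotelling's T² test of the kind described, run on random placeholder data rather than the study's 16 coherence variables.

```python
# Hedged sketch of a two-sample Hotelling's T^2 test with pooled covariance;
# the data are random placeholders, not the study's essay measurements.
import numpy as np
from scipy.stats import f

def hotelling_t2(X, Y):
    """Return (T2, p-value) for two samples of shape (n, p)."""
    n1, p = X.shape
    n2, _ = Y.shape
    diff = X.mean(axis=0) - Y.mean(axis=0)
    S_pooled = (((n1 - 1) * np.cov(X, rowvar=False) +
                 (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2))
    T2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(S_pooled, diff)
    # Convert to an F statistic with (p, n1 + n2 - p - 1) degrees of freedom
    F_stat = T2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))
    return T2, f.sf(F_stat, p, n1 + n2 - p - 1)

rng = np.random.default_rng(0)
group_cn = rng.normal(size=(40, 16))          # placeholder: 40 essays, 16 variables
group_us = rng.normal(0.2, 1.0, size=(40, 16))
print(hotelling_t2(group_cn, group_us))
```

A single multivariate test of this kind controls the overall error rate across all 16 variables at once, which is why it is preferred here to 16 separate t-tests.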
Abstract:
This paper explores the connection between leadership behaviors and employee engagement to build a proposed conceptual model. A conceptual link between employee needs (Herzberg, 1959; Maslow, 1970), emotional intelligence (Goleman, 1998), and transformational leadership (Bass, 1985) is discussed.
Abstract:
Conceptual database design is an unusually difficult and error-prone task for novice designers. This study examined how two training approaches, rule-based and pattern-based, might improve performance on database design tasks. A rule-based approach prescribes a sequence of rules for modeling conceptual constructs, and the action to be taken at various stages while developing a conceptual model. A pattern-based approach presents data modeling structures that occur frequently in practice, and prescribes guidelines on how to recognize and use these structures. This study describes the conceptual framework, experimental design, and results of a laboratory experiment that employed novice designers to compare the effectiveness of the two training approaches (between-subjects) at three levels of task complexity (within-subjects). Results indicate an interaction effect between treatment and task complexity: the rule-based approach was significantly better in the low-complexity and high-complexity cases, while there was no statistical difference in the medium-complexity case. Designer performance fell significantly as complexity increased. Overall, though the rule-based approach was not significantly superior to the pattern-based approach in all instances, it outperformed the pattern-based approach at two out of three complexity levels. The primary contributions of the study are (1) the operationalization of the complexity construct to a degree not addressed in previous studies; (2) the development of a pattern-based instructional approach to database design; and (3) the finding that the effectiveness of a particular training approach may depend on the complexity of the task.
Abstract:
Ensemble stream modeling and data cleaning are sensor information processing systems that have different training and testing methods by which their goals are cross-validated. This research examines a mechanism that seeks to extract novel patterns by generating ensembles from data. The main goal of label-less stream processing is to process the sensed events so as to eliminate uncorrelated noise and choose the most likely model without overfitting, thus obtaining higher model confidence. Higher quality streams can be realized by combining many short streams into an ensemble of the desired quality. The framework for the investigation is an existing data mining tool. First, to accommodate feature extraction for events such as bush or natural forest fires, we take the burnt area (BA*), a sensed ground truth obtained from logs, as our target variable. Even though this is an obvious model choice, the results are disappointing, for two reasons: first, the histogram of fire activity is highly skewed; second, the measured sensor parameters are highly correlated. Since using non-descriptive features does not yield good results, we resort to temporal features. By doing so we carefully eliminate the averaging effects; the resulting histogram is more satisfactory, and conceptual knowledge is learned from the sensor streams. Second is the process of feature induction by cross-validating attributes with single or multiple target variables to minimize training error. We use the F-measure score, which combines precision and recall, to determine the false alarm rate of fire events. The multi-target data-cleaning trees use the information purity of the target leaf nodes to learn higher-order features. A sensitive variance measure such as an F-test is performed during each node's split to select the best attribute. The ensemble stream model approach proved to improve when complicated features were used with a simpler tree classifier. The ensemble framework for data cleaning, and the enhancements to quantify the quality of fitness (30% spatial, 10% temporal, and 90% mobility reduction) of sensors, led to the formation of streams for sensor-enabled applications, which further motivates the novelty of stream quality labeling and its importance in handling the vast number of real-time mobile streams generated today.
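As a small aside on the scoring used above, here is a minimal sketch of the (beta-weighted) F-measure, the harmonic mean of precision and recall; the event counts are made-up examples, not the study's results.

```python
# F-measure for fire-event detection: harmonic mean of precision and recall,
# penalising both false alarms (fp) and missed events (fn).
def f_measure(tp, fp, fn, beta=1.0):
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0:
        return 0.0
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Example: 80 detected fire events, 20 false alarms, 10 missed events
print(f_measure(tp=80, fp=20, fn=10))  # ~0.842
```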
Abstract:
Ocean acidification and the associated shifts in carbonate chemistry speciation induced by increasing levels of atmospheric carbon dioxide (CO2) have the potential to impact marine biota in various ways. The process of biogenic calcification, for instance, is usually shown to be negatively affected. In coccolithophores, an important group of pelagic calcifiers, changes in cellular calcification rates in response to changing ocean carbonate chemistry appear to differ among species. By applying a wider CO2 range, we show that Coccolithus braarudii, a species previously reported to be insensitive to seawater acidification, responds in terms of both calcification and photosynthesis, although at higher levels of CO2. Thus, the observed differences between species seem to be related to individual sensitivities, while the underlying mechanisms could be the same. On this basis we develop a conceptual model of coccolithophorid calcification and photosynthesis in response to CO2-induced changes in seawater carbonate chemistry speciation.
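A hedged sketch of the kind of optimum-curve response such a conceptual model implies: rates rise with CO2 while the substrate is limiting and decline once acidification becomes inhibitory, with species-specific sensitivities mapping onto different constants. The functional form and all parameter values below are illustrative assumptions, not the authors' fitted model.

```python
# Illustrative optimum-curve response of a physiological rate to CO2:
# a saturating uptake term damped by an inhibition term. Not the paper's model.
import numpy as np

def rate_response(co2, v_max, k_half, k_inhibit):
    substrate_term = co2 / (k_half + co2)        # CO2 limitation at low levels
    inhibition_term = np.exp(-co2 / k_inhibit)   # acidification stress at high levels
    return v_max * substrate_term * inhibition_term

co2_range = np.array([10., 20., 40., 80., 160., 320.])  # toy values, micromol/kg
print(np.round(rate_response(co2_range, v_max=1.0, k_half=30.0, k_inhibit=200.0), 3))
```

Under this sketch, a species reported as "insensitive" over a narrow CO2 range simply has its optimum and decline shifted to higher CO2 levels, which is the interpretation the abstract suggests for Coccolithus braarudii.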
Abstract:
The enterprise management approach provides a holistic view of organizations and their related information systems. In order to cope with globalization, virtualization, and a volatile competitive environment, traditional firms are seeking to reconstruct their organizational structures and establish new IS architectures so as to transform from single autonomous entities into more open enterprises supported by new Enterprise Resource Planning (ERP) systems. This paper reports on ERP engage-abilities within three different enterprise management patterns, based on the theoretical foundations of the "Dynamic Enterprise Reference Grid". An exploratory inductive study of Zoomlion, using the narrative research approach, has been conducted. This research also delivers a conceptual framework to demonstrate the adoption of ERP within the three enterprise management structures, and points to a new architectural type (ERPIII) for operating in the virtual enterprise paradigm. © 2010 Springer-Verlag.