879 results for "two dimensions"
Abstract:
This article proposes a framework to evaluate corporate environmental strategies. In the proposed framework, a company's environmental risks are analyzed along two dimensions. One dimension, the endogenous environmental risks, arises from the internal operations of the company. The other dimension, the exogenous environmental risks, is determined by the company's external world: its location, its ecological setting, and the demographic characteristics of the physical environment in which it operates. Four environmental management approaches are defined as a function of endogenous and exogenous environmental risks: reactive, proactive, strategic, and crisis preventive. The framework was applied in a survey of 141 company representatives in Hungary. A relationship was sought between the environmental management approaches, defined a priori on the basis of technology and location, and the companies' environmental management characteristics as described by senior managers. Variables that differentiated among the four environmental management approaches were identified and ranked. The study concludes that there is a relatively well-defined relationship between the environmental risks of companies and the nature of their environmental management approaches. Implementing a strategic environmental management approach may not be the best option for all companies, although there is growing pressure to do so.
Abstract:
The purpose of this research was to examine the influence of cultural dissimilarity on the relationship between multinationality and performance. Both direct and indirect effects were studied. In addition, the form of the multinationality-performance relationship was investigated. Five indicators of cultural dissimilarity were developed on the basis of Hofstede's cultural dimensions. Performance was measured along two dimensions: financial and operational. Multinationality was operationalized as the ratio of foreign sales to total sales. Secondary data were used for all variables in the study. The sample of firms comprised multinationals based in the United States from four global industries: chemicals, computers and office equipment, electrical and electronic goods, and drugs and pharmaceuticals. Regression analyses using pooled cross-section/time-series data indicated that the relationship between multinationality and performance is curvilinear. No direct effects of cultural dissimilarity on performance were found. However, the results show a moderating effect of cultural dissimilarity on the multinationality-performance relationship. The direction of this effect was positive for four of the five cultural dissimilarity measures.
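A hedged sketch of the model form implied by this abstract (synthetic data and assumed variable names, not the study's dataset): the curvilinear effect enters as a squared multinationality term and the moderating effect as an interaction with cultural dissimilarity.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "multinationality": rng.uniform(0, 1, n),       # foreign sales / total sales
    "cultural_dissimilarity": rng.normal(0, 1, n),  # Hofstede-based index (assumed)
})
# Synthetic performance with a curvilinear main effect and an interaction term
df["performance"] = (
    0.8 * df.multinationality
    - 0.6 * df.multinationality**2
    + 0.3 * df.multinationality * df.cultural_dissimilarity
    + rng.normal(0, 0.1, n)
)

# Pooled OLS with squared term (curvilinearity) and interaction (moderation)
model = smf.ols(
    "performance ~ multinationality + I(multinationality**2)"
    " + cultural_dissimilarity + multinationality:cultural_dissimilarity",
    data=df,
).fit()
print(model.summary())
```

A significant coefficient on the squared term indicates curvilinearity; a significant interaction coefficient indicates moderation by cultural dissimilarity.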
Abstract:
This dissertation introduced substance abuse into the Dynamic Vulnerability Formulation (DVF) and the social competence model to determine whether the relationship between schizophrenic symptomatology and coping ability in the DVF also applied to dually diagnosed schizophrenic clients or whether these variables needed to be modified. It compared the coping abilities of dually and singly diagnosed clients in day treatment and identified, examined, and assessed the relative influence of relevant mediating variables on two dimensions of coping ability of the dually diagnosed: coping skills and coping effort. These variables were: presence of negative and non-negative symptoms, duration of mental illness, type of substance used, and age of first substance use. A priori effect sizes based on previous empirical research were used to interpret the results related to the comparison of demographic, socioeconomic, and treatment characteristics between the singly and dually diagnosed study samples. The data suggested that the singly diagnosed group had higher coping skills than the dually diagnosed group, particularly in the areas of housing stability, work affect, and total social adjustment. The dually diagnosed group had lower scores on one aspect of coping effort, agency (self-efficacy). The data supported the presence of an inverse relationship between symptom severity and coping skills, particularly for the dually diagnosed group. The data did not support the presence of an inverse relationship between symptom severity and coping effort, but did suggest a positive relationship between symptom severity and one measure of coping effort, agency, for the dually diagnosed group. Regression equations using each summary measure of coping skills (social adjustment and role functioning) yielded statistically significant F-ratios. Thirty-six percent of the variance in social adjustment and thirty-one percent of the variance in role functioning were explained by the relevant variables. Negative and non-negative symptoms were the only significant predictors of social adjustment, and non-negative symptoms were the sole significant predictor of role functioning. The results of this study provided partial support for the use of the Dynamic Vulnerability Formulation with the dually diagnosed.
Abstract:
A novel modeling approach is applied to karst hydrology. Long-standing problems in karst hydrology and solute transport are addressed using Lattice Boltzmann methods (LBMs). These methods contrast with other modeling approaches that have been applied to karst hydrology. The motivation of this dissertation is to develop new computational models for solving ground water hydraulics and transport problems in karst aquifers, which are widespread around the globe. This research tests the viability of the LBM as a robust alternative numerical technique for solving large-scale hydrological problems. The LB models applied in this research are briefly reviewed, and implementation issues are discussed. The dissertation focuses on testing the LB models. The LBM is tested for two different types of inlet boundary conditions for solute transport in finite and effectively semi-infinite domains. The LBM solutions are verified against analytical solutions. Zero-diffusion transport and Taylor dispersion in slits are also simulated and compared against analytical solutions. These results demonstrate the LBM's flexibility as a solute transport solver. The LBM is applied to simulate solute transport and fluid flow in porous media traversed by larger conduits. A LBM-based macroscopic flow solver (Darcy's law-based) is linked with an anisotropic dispersion solver. Spatial breakthrough curves in one and two dimensions are fitted against the available analytical solutions. This provides a steady flow model with capabilities routinely found in ground water flow and transport models (e.g., the combination of MODFLOW and MT3D). However, the new LBM-based model retains the ability to solve inertial flows that are characteristic of karst aquifer conduits. Transient flows in a confined aquifer are solved using two different LBM approaches. The analogy between Fick's second law (the diffusion equation) and the transient ground water flow equation is used to solve for the transient head distribution. An altered-velocity flow solver with a source/sink term is applied to simulate a drawdown curve. Hydraulic parameters such as transmissivity and storage coefficient are linked with LB parameters. These capabilities complete the LBM's effective treatment of the types of processes that are simulated by standard ground water models. The LB model is verified against field data for drawdown in a confined aquifer.
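As a hedged illustration of the diffusion analogy mentioned above (a minimal sketch, not the dissertation's solver): a D2Q9 lattice Boltzmann relaxation scheme whose macroscopic variable obeys the diffusion equation, so the lattice diffusivity plays the role of the hydraulic diffusivity T/S (transmissivity over storage coefficient). Grid size, relaxation time, and periodic boundaries are illustrative assumptions.

```python
import numpy as np

# D2Q9 lattice velocities and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

nx, ny = 100, 100          # lattice size (assumed)
tau = 0.8                  # relaxation time (assumed)
D = (tau - 0.5) / 3.0      # lattice diffusivity, c_s^2 = 1/3; stands in for T/S

# Initial head: a unit pulse at the centre of the domain
h = np.zeros((nx, ny))
h[nx // 2, ny // 2] = 1.0
f = w[:, None, None] * h[None, :, :]   # start from equilibrium distributions

for step in range(500):
    h = f.sum(axis=0)                        # macroscopic head = sum of populations
    feq = w[:, None, None] * h[None, :, :]   # equilibrium toward the local head
    f -= (f - feq) / tau                     # BGK collision step
    for i, (cx, cy) in enumerate(c):         # streaming step (periodic boundaries)
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)

print(f"lattice diffusivity D = {D:.4f} (calibrate against T/S)")
print(f"total head conserved: {f.sum():.6f}")
```

Calibrating the lattice diffusivity D = c_s^2 (tau - 1/2) against the hydraulic diffusivity is one way the hydraulic parameters can be linked with LB parameters, as the abstract describes.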
Abstract:
Each disaster presents a unique set of characteristics that are hard to determine a priori. Disaster management tasks are therefore inherently uncertain, requiring knowledge sharing and quick decision making that involves coordination across different levels and collaborators. While there has been increasing interest among both researchers and practitioners in utilizing knowledge management to improve disaster management, little research has been reported about how to assess the dynamic nature of disaster management tasks, and what kinds of knowledge sharing are appropriate for different dimensions of task uncertainty characteristics. Using a combination of qualitative and quantitative methods, this research study developed dimensions, and corresponding measures, of the uncertain dynamic characteristics of disaster management tasks and tested the relationships between these dimensions and task performance through the moderating and mediating effects of knowledge sharing. The research conceptualized and assessed task uncertainty along three dimensions: novelty, unanalyzability, and significance; knowledge sharing along two dimensions: knowledge sharing purposes and knowledge sharing mechanisms; and task performance along two dimensions: task effectiveness and task efficiency. Analysis of survey data collected from Miami-Dade County emergency managers suggested that knowledge sharing purposes and knowledge sharing mechanisms moderate and mediate the relationship between uncertain dynamic disaster management tasks and task performance. Implications for research and practice, as well as directions for future research, are discussed.
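In schematic form (a hedged sketch of the standard specifications implied by the abstract, using generic symbols rather than the study's actual measures), the moderation and mediation models can be written as:

```latex
% Moderation: knowledge sharing (KS) moderates the effect of task uncertainty (U) on performance (P)
P = \beta_0 + \beta_1 U + \beta_2 \mathrm{KS} + \beta_3 (U \times \mathrm{KS}) + \varepsilon

% Mediation: task uncertainty affects knowledge sharing, which in turn affects performance
\mathrm{KS} = \alpha_0 + \alpha_1 U + \epsilon_1, \qquad
P = \gamma_0 + \gamma_1 U + \gamma_2 \mathrm{KS} + \epsilon_2
```

Moderation is indicated by a significant interaction coefficient (beta_3); mediation is supported when alpha_1 and gamma_2 are significant and the direct effect gamma_1 weakens once knowledge sharing is included.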
Abstract:
The theoretical construct of control has been defined as necessary (Etzioni, 1965), ubiquitous (Vickers, 1967), and on-going (E. Langer, 1983). Empirical measures, however, have not adequately given meaning to this potent construct, especially within complex organizations such as schools. Four stages of theory development and empirical testing of school building managerial control, using principals and teachers working within the nation's fourth largest district, are presented in this dissertation as follows: (1) a review and synthesis of social science theories of control across the literatures of organizational theory, political science, sociology, psychology, and philosophy; (2) a systematic analysis of school managerial activities performed at the building level within the context of curricular and instructional tasks; (3) the development of a survey questionnaire to measure school building managerial control; and (4) initial tests of construct validity including inter-item reliability statistics, principal components analyses, and multivariate tests of significance. The social science synthesis supported four managerial control processes: standards, information, assessment, and incentives. The systematic analysis of school managerial activities led to a further categorization between the structural frequency of behaviors and the discretionary qualities of behaviors across each of the control processes and the curricular and instructional tasks. Teacher survey responses (N = 486) revealed a significant difference between these two dimensions of control, structural frequency and discretionary qualities, for standards, information, and assessments, but not for incentives. The descriptive model of school managerial control suggests that (1) teachers perceive structural and discretionary managerial behaviors under information and incentives more clearly than activities representing standards or assessments, (2) standards are primarily structural while assessments are primarily qualitative, (3) teacher satisfaction is most closely related to the equitable distribution of incentives, (4) each of the structural managerial behaviors has a qualitative effect on teachers, and (5) certain qualities of managerial behaviors are perceived by teachers as distinctly discretionary, apart from school structure. The variables of teacher tenure and school effectiveness showed significant effects on school managerial control processes, while instructional levels (elementary, junior, and senior) and individual school differences were not significant for the construct of school managerial control.
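A hedged sketch (synthetic item scores, not the study's survey data) of the two construct-validity checks named above, inter-item reliability (Cronbach's alpha) and a principal components analysis over a block of survey items:

```python
import numpy as np

rng = np.random.default_rng(1)
n_teachers, n_items = 486, 6            # sample size from the abstract; item count assumed
latent = rng.normal(0, 1, n_teachers)   # a single underlying control-process factor
items = latent[:, None] + rng.normal(0, 0.8, (n_teachers, n_items))  # simulated item scores

# Inter-item reliability: Cronbach's alpha
k = n_items
item_vars = items.var(axis=0, ddof=1)
total_var = items.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_vars.sum() / total_var)

# Principal components of the item correlation matrix
corr = np.corrcoef(items, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]          # eigenvalues, largest first
explained = eigvals / eigvals.sum()

print(f"Cronbach's alpha: {alpha:.2f}")
print(f"variance explained by first component: {explained[0]:.2f}")
```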
Abstract:
This thesis presents a certification method for semantic web service compositions that aims to statically ensure their functional correctness. The certification method encompasses two dimensions of verification, termed the base and functional dimensions. The base dimension concerns the correct application of each semantic web service in the composition, i.e., it ensures that every service invocation in the composition complies with the respective service definition. Certification of this dimension exploits the semantic compatibility between the invocation arguments and the formal parameters of the semantic web service. The functional dimension aims to ensure that the composition satisfies a given specification expressed in the form of preconditions and postconditions. This dimension is formalized by a Hoare logic based calculus, from whose deductive system partial correctness specifications involving compositions of semantic web services can be derived. Our work also exploits a fragment of description logic, namely ALC, to express the partial correctness specifications. To operationalize the proposed certification method, we developed a supporting environment for defining semantic web service compositions and conducting the certification process. The certification method was experimentally evaluated by applying it to three proofs of concept, which allowed a broad evaluation of the method.
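As an illustration of the calculus described above (a hedged sketch of standard Hoare-logic rules with ALC concepts as assertions, not the thesis's exact formulation), a partial correctness specification is a triple over a composition s, and the sequencing and consequence rules read:

```latex
\{P\}\; s \;\{Q\} \quad \text{with } P, Q \text{ ALC concept descriptions}

\frac{\{P\}\, s_1 \,\{R\} \qquad \{R\}\, s_2 \,\{Q\}}
     {\{P\}\, s_1 ; s_2 \,\{Q\}}
\qquad
\frac{P' \sqsubseteq P \qquad \{P\}\, s \,\{Q\} \qquad Q \sqsubseteq Q'}
     {\{P'\}\, s \,\{Q'\}}
```

In this sketch, ALC subsumption plays the role that logical implication plays in classical Hoare logic, so the side conditions of the consequence rule can be discharged by a description-logic reasoner.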
Abstract:
Research project presented to obtain the degree of Master in Sport and Exercise Psychology.
Abstract:
Although trapped ion technology is well-suited for quantum information science, scalability of the system remains one of the main challenges. One of the challenges associated with scaling the ion trap quantum computer is the ability to individually manipulate an increasing number of qubits. Using micro-mirrors fabricated with micro-electromechanical systems (MEMS) technology, laser beams are focused on individual ions in a linear chain and the focal point is steered in two dimensions. Multiple single qubit gates are demonstrated on trapped 171Yb+ qubits, and the gate performance is characterized using quantum state tomography. The system features negligible crosstalk to neighboring ions (< 3e-4) and switching speeds comparable to typical single qubit gate times (< 2 µs). In a separate experiment, photons scattered from the 171Yb+ ion are coupled into an optical fiber with 63% efficiency using a high numerical aperture lens (0.6 NA). The coupled photons are directed to superconducting nanowire single photon detectors (SNSPDs), which provide a higher detector efficiency (69%) compared to traditional photomultiplier tubes (35%). The total system photon collection efficiency is increased from 2.2% to 3.4%, which allows for fast state detection of the qubit. For a detection beam intensity of 11 mW/cm², the average detection time is 23.7 µs with 99.885(7)% detection fidelity. The technologies demonstrated in this thesis can be integrated to form a single quantum register with all of the necessary resources to perform local gates as well as high fidelity readout, and to provide a photon link to other systems.
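For context on why the high numerical aperture matters (a standard geometric estimate assuming isotropic emission; this figure is not quoted in the abstract, and further optical losses apply), the fraction of scattered photons intercepted by a lens of numerical aperture NA = 0.6 is:

```latex
\frac{\Omega}{4\pi} = \frac{1 - \cos\theta}{2}
                    = \frac{1 - \sqrt{1 - \mathrm{NA}^2}}{2}
                    = \frac{1 - \sqrt{1 - 0.36}}{2}
                    = \frac{1 - 0.8}{2}
                    = 0.10
```

Roughly 10% of emitted photons enter the lens; the 63% fiber coupling, the 69% detector efficiency, and other optical losses then multiply onto this geometric fraction to give the quoted total collection efficiency.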
Abstract:
Modern software applications are becoming more dependent on database management systems (DBMSs). DBMSs are usually used as black boxes by software developers. For example, Object-Relational Mapping (ORM) is one of the most popular database abstraction approaches that developers use nowadays. Using ORM, objects in Object-Oriented languages are mapped to records in the database, and object manipulations are automatically translated to SQL queries. As a result of such conceptual abstraction, developers do not need deep knowledge of databases; however, all too often this abstraction leads to inefficient and incorrect database access code. Thus, this thesis proposes a series of approaches to improve the performance of database-centric software applications that are implemented using ORM. Our approaches focus on troubleshooting and detecting inefficient (i.e., performance problems) database accesses in the source code, and we rank the detected problems based on their severity. We first conduct an empirical study on the maintenance of ORM code in both open source and industrial applications. We find that ORM performance-related configurations are rarely tuned in practice, and there is a need for tools that can help improve/tune the performance of ORM-based applications. Thus, we propose approaches along two dimensions to help developers improve the performance of ORM-based applications: 1) helping developers write more performant ORM code; and 2) helping developers configure ORM configurations. To provide tooling support to developers, we first propose static analysis approaches to detect performance anti-patterns in the source code. We automatically rank the detected anti-pattern instances according to their performance impacts. Our study finds that by resolving the detected anti-patterns, the application performance can be improved by 34% on average. We then discuss our experience and lessons learned when integrating our anti-pattern detection tool into industrial practice. We hope our experience can help improve the industrial adoption of future research tools. However, as static analysis approaches are prone to false positives and lack runtime information, we also propose dynamic analysis approaches to further help developers improve the performance of their database access code. We propose automated approaches to detect redundant data access anti-patterns in the database access code, and our study finds that resolving such redundant data access anti-patterns can improve application performance by an average of 17%. Finally, we propose an automated approach to tune performance-related ORM configurations using both static and dynamic analysis. Our study shows that our approach can help improve application throughput by 27--138%. Through our case studies on real-world applications, we show that all of our proposed approaches can provide valuable support to developers and help improve application performance significantly.
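As an illustration of the kind of database-access anti-pattern such static and dynamic analyses target (a hedged sketch; the thesis studies Java ORM frameworks, and this analogous example uses Python/SQLAlchemy purely for illustration), the classic N+1 query problem and its eager-loading fix:

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship, selectinload

Base = declarative_base()

class Author(Base):
    __tablename__ = "authors"
    id = Column(Integer, primary_key=True)
    name = Column(String)
    books = relationship("Book", back_populates="author")

class Book(Base):
    __tablename__ = "books"
    id = Column(Integer, primary_key=True)
    title = Column(String)
    author_id = Column(Integer, ForeignKey("authors.id"))
    author = relationship("Author", back_populates="books")

engine = create_engine("sqlite://")   # in-memory database for the sketch
Base.metadata.create_all(engine)

with Session(engine) as session:
    # Anti-pattern: lazy loading inside the loop issues one extra SELECT
    # per author (N+1 queries in total).
    for author in session.query(Author).all():
        print(author.name, [b.title for b in author.books])

    # Fix: eager-load the relationship so the books arrive in one extra query.
    for author in session.query(Author).options(selectinload(Author.books)).all():
        print(author.name, [b.title for b in author.books])
```

Resolving this pattern replaces N per-author queries with a single additional query, which is the kind of severity-ranked fix the proposed detectors are meant to surface.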
Abstract:
In this article we analyze the 2014 Debate on the State of the Nation. The methodology consists of coding the speeches of the prime minister, Mariano Rajoy (PP), and the then opposition leader, Alfredo Perez Rubalcaba (PSOE), by extracting word clouds, branched maps, and word trees that reveal the most common concepts and premises. This preliminary analysis along two dimensions, quantitative and qualitative, makes the subsequent discourse analysis much easier and more viable; there we focus on the different types of arguments in the communicative act: claim/solution, circumstantial premises, goal premises, value premises, means-goal premises, and alternative options/addressing alternative options.
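As a minimal sketch of the quantitative step (illustrative only; the article's coding was done with dedicated qualitative-analysis tooling, and the file name and truncated stopword list below are assumptions), the term frequencies underlying a word cloud can be extracted as follows:

```python
import re
from collections import Counter

# Hypothetical transcript file of one of the speeches
speech_text = open("rajoy_2014_speech.txt", encoding="utf-8").read()

# Assumed, truncated Spanish stopword list
stopwords = {"de", "la", "que", "el", "y", "en", "a", "los", "se", "del", "las", "un", "por", "con"}

tokens = re.findall(r"\w+", speech_text.lower())
frequencies = Counter(t for t in tokens if t not in stopwords and len(t) > 3)
print(frequencies.most_common(20))   # the raw data a word cloud visualises
```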
Abstract:
The postwar development of the Intelligence Services in Japan has been based on two contrasting models, the centralized model of the USA and the collegial model of the UK, neither of which has been fully developed. This has led to clashes of institutional competencies and poor anticipation of threats to national security. The problem of opposing models has been partially overcome along two dimensions: externally, through cooperation with the US Intelligence Service under the Treaty of Mutual Cooperation and Security; and internally, through the pre-eminence in the national sphere of the Department of Public Safety. However, the emergence of a new global communicative dimension requires a remodeling of this dual model from a communicative viewpoint, given the increasing capacity of individual actors to determine the dynamics of international events. This article examines these challenges for the Intelligence Services of Japan and proposes a reform based on this new global communicative dimension.
Abstract:
In contrast to the prevailing heterogeneity of most African countries, whose societies comprise numerous ethnic groups or different religions and cultures, Cabo Verde is defined as a nation-state that recognizes a collective identity, expressed in its language and in the identification of common cultural elements belonging to the same archipelagic space. The starting point for this study is the concern to understand the notion of Nation in this country, which suggests the idea of a society in which homogeneous factors predominate over heterogeneous ones or, ultimately, a mechanism of cohabitation between these two dimensions, which may at first appear antagonistic but which lend a particular significance to the debate on national identity. Throughout this article, we highlight the historical facts as well as the contribution of the main cultural movements, without ignoring the importance of the growing awareness of the idea of Nation.
Abstract:
For the current study, the authors examined the relationships among two dimensions of organizational climate and several indices of individual- and unit-level effectiveness. Specifically, the article proposes that an organization's service and training climate would be related to employee capabilities, operationalized in terms of frontline service capabilities and managerial support capabilities, and that such capabilities would be related to unit-level measures of employee turnover and sales growth. Using survey and operational data from 201 management and frontline staff members in 22 units of a national restaurant chain, the results from correlation and regression analyses generally supported the proposed relationships. This study replicates and extends previous research and provides a foundation for future conceptual development and empirical work in this research area.