970 results for Research paradigms
Abstract:
In recent years, research aimed at identifying and relating the antecedents and consequences of diffusing organizational practices/ideas has turned its attention to debating the international adoption and implementation of the Anglo-American model of corporate governance, i.e., a shareholder-value orientation (SVO). While financial economists characterize the adoption of an SVO as necessary and performance-enhancing, behavioral scientists have disputed such claims, invoking institutional contingencies that bear on the appropriateness of an SVO. Our study seeks to provide some resolution to the debate by developing an overarching socio-political perspective that links the antecedents and consequences of the adoption of the contested practice of SVO. We test our framework using extensive longitudinal data from 1992-2006 from the largest listed corporations in the Netherlands, and we find a negative relationship between SVO adoption and subsequent firm performance, although this effect is attenuated when accompanied by greater SVO-alignment among major owners and a firm’s visible commitment to an SVO. This study extends prior research on the diffusion of contested organizational practices that has taken a socio-political perspective by offering an original contingency perspective that addresses how and why the misaligned preferences of corporate owners will affect (i) a company’s inclination to espouse an SVO, and (ii) the performance consequences of such misalignment. This study suggests that when board members are considering the adoption of new ideas/practices (e.g., SVO), they should consider the contextual fit of the idea/practice with the firm’s owners and their interests.
Abstract:
In this invited paper I describe some personal views on the research field of conceptual modelling. I argue that the field has become entrenched in some “bad habits” that usually emerge in evolved paradigms and that we need to proactively pursue a dual research strategy incorporating new and different avenues that lead us to novel and impactful research contexts of conceptual modelling. I provide a framework that can guide this exploration and finish with some recommendations about how conceptual modelling research programs could proceed.
Abstract:
'Pars pro toto: Experimental Exhibition Design and Curatorial Paradigms' is situated within the ongoing debate over the conflation of art and curating, and the subsequent tension between artistic autonomy and curatorial intervention. This practice-led research project negotiates these polarities using a collaborative and discursive curatorial methodology in the creation of two exhibitions. Both exhibitions, one digital and one primarily physical, investigated how the temporary exhibition can operate as a site for provocation and how the suggested methodology facilitates the relationship between artist and curator within this paradigm, and outlined factors that assist in expanding the definition of the contemporary curatorial role.
Abstract:
All sound research commences with the selection of a research paradigm. The chosen paradigm shapes the researcher’s perspective of the world and its selection is a vital step in any study’s research design. There are different paradigms that IS researchers can choose from, among which the interpretive paradigm is growing in acceptance. Though interpretive research has emerged as an important strand in Information Systems (IS), guidelines on how to conduct and evaluate interpretive research have been scarce. Klein and Myers presented seven principles, illustrating each with examples drawn from three case studies. While these principles are much valued, novice researchers have little support on how to embed them in an overall research design, support that a detailed worked example could provide. This paper aims to address this gap and presents how Klein and Myers’s principles were applied within an example study that investigated shared services in the Malaysian Higher Education context. The example study adopted the interpretive paradigm as the approach best suited to its research questions and goals. Further details on the selection and application of Klein and Myers’s guidelines in the context of the shared services case study are presented in the paper.
Abstract:
Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census developed a sampling design for the Current Population Survey (CPS) in the 1940s. A significant factor was also that digital computers became available to statisticians. In the early 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem, both published in a 1774 memoir that is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which he depicted by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples, the idea being that the sample should be a miniature of the population; it is still prevailing. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, at the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics and introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are that samples are drawn repeatedly from the same population and that population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's model makes no assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory of double sampling, which gave statisticians at the U.S. Census Bureau the central idea for developing the complex survey design of the CPS. An important criterion was to have a method whose data-collection costs were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.
Abstract:
Code Division Multiple Access (CDMA) techniques have so far been applied to LAN problems by many investigators. An analytical study of well-known algorithms for the generation of orthogonal codes used in FO-CDMA systems, such as those for prime, quasi-prime, Optical Orthogonal and matrix codes, is presented. Algorithms for OOCs such as the Greedy, Modified Greedy and Accelerated Greedy algorithms are implemented, and many speed-up enhancements for these algorithms are suggested. A novel Synthetic Algorithm based on Difference Sets (SADS) is also proposed, and investigations are made to vectorise/parallelise SADS so that the source code can run on parallel machines. A new matrix for code families of OOCs with different seed code-words but having the same (n,w,lambda) set is formulated.
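To make the idea concrete, the following is a minimal sketch of a greedy construction of an (n, w, lambda) Optical Orthogonal Code, in the spirit of the Greedy algorithms named in the abstract. The brute-force candidate enumeration, parameter values and helper names are illustrative assumptions rather than the authors' implementation, and this greedy search is not guaranteed to reach the maximum code size.

```python
# Illustrative greedy construction of an (n, w, lambda) Optical Orthogonal Code.
# This is a sketch under assumed parameters, not the implementation studied above.
from itertools import combinations

def cyclic_correlation(a, b, n, shift):
    """Cyclic correlation of two support sets (positions of 1s) at a given shift."""
    shifted = {(pos + shift) % n for pos in b}
    return len(set(a) & shifted)

def satisfies_constraints(candidate, accepted, n, lam):
    # Auto-correlation: every non-zero cyclic shift of the candidate may overlap
    # itself in at most `lam` positions.
    for shift in range(1, n):
        if cyclic_correlation(candidate, candidate, n, shift) > lam:
            return False
    # Cross-correlation: every cyclic shift (including zero) against each
    # already accepted codeword may overlap in at most `lam` positions.
    for other in accepted:
        for shift in range(n):
            if cyclic_correlation(candidate, other, n, shift) > lam:
                return False
    return True

def greedy_ooc(n, w, lam):
    """Greedily collect weight-w codewords of length n meeting the (n, w, lam) bounds."""
    accepted = []
    for support in combinations(range(n), w):   # candidate codewords as support sets
        if satisfies_constraints(support, accepted, n, lam):
            accepted.append(support)
    return accepted

if __name__ == "__main__":
    # Small illustrative run; a greedy scan need not find an optimal code family.
    for code in greedy_ooc(13, 3, 1):
        print(code)
```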
Abstract:
Fusion of multi-sensor imaging data enables a synergetic interpretation of complementary information obtained by sensors of different spectral ranges. Multi-sensor data of diverse spectral, spatial and temporal resolutions require advanced numerical techniques for analysis and interpretation. This paper reviews ten advanced pixel-based image fusion techniques: Component substitution (COS), Local mean and variance matching, Modified IHS (Intensity Hue Saturation), Fast Fourier Transform-enhanced IHS, Laplacian Pyramid, Local regression, Smoothing filter (SF), Sparkle, SVHC and Synthetic Variable Ratio. The above techniques were tested on IKONOS data (panchromatic band at 1 m spatial resolution and four multispectral bands at 4 m spatial resolution). Evaluation of the fused results through various accuracy measures revealed that the SF and COS methods produce images closest to those the corresponding multispectral sensor would observe at the highest resolution level (1 m).
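As a concrete illustration of the component-substitution family reviewed above, the following is a minimal numpy sketch of IHS-style pan-sharpening. The array shapes, the mean-based intensity and the gain matching of the panchromatic band are simplifying assumptions, not the exact formulations compared in the paper.

```python
# Sketch of component-substitution (IHS-style) pan-sharpening under simplifying
# assumptions; not the paper's exact algorithms.
import numpy as np

def ihs_fusion(ms, pan):
    """Fuse multispectral bands (bands, H, W), already resampled to the
    panchromatic grid, with a panchromatic band (H, W)."""
    ms = ms.astype(np.float64)
    pan = pan.astype(np.float64)
    intensity = ms.mean(axis=0)                      # crude intensity component
    # Match the panchromatic band to the intensity's mean and spread so the
    # substitution does not shift the radiometry.
    pan_matched = (pan - pan.mean()) * (intensity.std() / (pan.std() + 1e-12)) + intensity.mean()
    detail = pan_matched - intensity                 # spatial detail injected into every band
    return ms + detail[None, :, :]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ms = rng.random((4, 64, 64))     # e.g. 4 multispectral bands upsampled from 4 m to 1 m
    pan = rng.random((64, 64))       # 1 m panchromatic band
    fused = ihs_fusion(ms, pan)
    print(fused.shape)               # (4, 64, 64)
```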
Abstract:
Grattan, J. (2005). Pollution and paradigms: Lessons from Icelandic volcanism for continental flood basalt studies. Lithos, 79, 343-353.
Abstract:
Semiconductor nanowires are pseudo 1-D structures in which the semiconducting material is confined to less than 100 nm in two dimensions. Semiconductor nanowires have a vast range of potential applications, including electronic (logic devices, diodes), photonic (lasers, photodetectors), biological (sensors, drug delivery), energy (batteries, solar cells, thermoelectric generators), and magnetic (spintronic, memory) devices. Semiconductor nanowires can be fabricated by a range of methods which can be categorised into one of two paradigms, bottom-up or top-down. Bottom-up processes can be defined as those in which structures are assembled from their sub-components in an additive fashion. Top-down fabrication strategies use sculpting or etching to carve structures from a larger piece of material in a subtractive fashion. This seminar will detail a number of novel routes to fabricate semiconductor nanowires by both bottom-up and top-down paradigms. Firstly, a novel bottom-up route to fabricate Ge nanowires with controlled diameter distributions in the sub-20 nm regime will be described. This route details nanowire synthesis and diameter control in the absence of a foreign seed metal catalyst. Additionally, a top-down route to nanowire array fabrication will be detailed, outlining the importance of surface chemistry in high-resolution electron beam lithography (EBL) using hydrogen silsesquioxane (HSQ) on Ge and Bi2Se3 surfaces. Finally, a process will be described for the directed self-assembly of a diblock copolymer (PS-b-PDMS) using an EBL-defined template. This section will also detail a route toward selective template sidewall wetting of either block in the PS-b-PDMS system, through tailored functionalisation of the template and substrate surfaces.
Abstract:
Periodic visual stimulation and analysis of the resulting steady-state visual evoked potentials were first introduced over 80 years ago as a means to study visual sensation and perception. From the first single-channel recordings of responses to modulated light to the present use of sophisticated digital displays composed of complex visual stimuli and high-density recording arrays, steady-state methods have been applied in a broad range of scientific and applied settings. The purpose of this article is to describe the fundamental stimulation paradigms for steady-state visual evoked potentials and to illustrate these principles through research findings across a range of applications in vision science.
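As a concrete illustration of the steady-state analysis idea, the sketch below generates a synthetic response to periodic stimulation and reads off its spectral amplitude at the stimulation frequency and its harmonics. The sampling rate, flicker frequency and synthetic signal are assumptions for illustration only.

```python
# Quantifying a steady-state response by its spectral amplitude at the
# stimulation frequency and harmonics; all parameters are assumed for illustration.
import numpy as np

fs = 500.0                 # sampling rate (Hz), assumed
f_stim = 7.5               # flicker frequency (Hz), assumed
t = np.arange(0, 10.0, 1.0 / fs)

# Synthetic "EEG": response at the stimulation frequency plus its 2nd harmonic,
# buried in broadband noise.
rng = np.random.default_rng(1)
signal = (1.0 * np.sin(2 * np.pi * f_stim * t)
          + 0.4 * np.sin(2 * np.pi * 2 * f_stim * t)
          + rng.normal(0.0, 1.0, t.size))

spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

for harmonic in (1, 2, 3):
    idx = np.argmin(np.abs(freqs - harmonic * f_stim))
    print(f"amplitude near {harmonic * f_stim:.1f} Hz: {spectrum[idx]:.3f}")
```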
Abstract:
Three paradigms for distributed-memory parallel computation that free the application programmer from the details of message passing are compared for an archetypal structured scientific computation -- a nonlinear, structured-grid partial differential equation boundary value problem -- using the same algorithm on the same hardware. All of the paradigms -- parallel languages represented by the Portland Group's HPF, (semi-)automated serial-to-parallel source-to-source translation represented by CAPTools from the University of Greenwich, and parallel libraries represented by Argonne's PETSc -- are found to be easy to use for this problem class, and all are reasonably effective in exploiting concurrency after a short learning curve. The level of involvement required by the application programmer under any paradigm includes specification of the data partitioning, corresponding to a geometrically simple decomposition of the domain of the PDE. Programming in SPMD style for the PETSc library requires writing only the routines that discretize the PDE and its Jacobian, managing subdomain-to-processor mappings (affine global-to-local index mappings), and interfacing to library solver routines. Programming for HPF requires a complete sequential implementation of the same algorithm as a starting point, introduction of concurrency through subdomain blocking (a task similar to the index mapping), and modest experimentation with rewriting loops to elucidate to the compiler the latent concurrency. Programming with CAPTools involves feeding the same sequential implementation to the CAPTools interactive parallelization system, and guiding the source-to-source code transformation by responding to various queries about quantities knowable only at runtime. Results representative of "the state of the practice" for a scaled sequence of structured grid problems are given on three of the most important contemporary high-performance platforms: the IBM SP, the SGI Origin 2000, and the CRAY T3E.
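As a concrete illustration of the data-partitioning step that the paper says every paradigm requires, the sketch below performs a 1-D block decomposition of a 2-D structured grid with ghost-row (halo) exchange in SPMD style, written with mpi4py. The grid size, the Jacobi sweep and all names are illustrative assumptions; the study itself uses HPF, CAPTools and PETSc rather than MPI called directly from Python.

```python
# SPMD sketch: 1-D block decomposition of a 2-D structured grid with halo exchange.
# Run with, e.g.:  mpiexec -n 4 python jacobi_blocks.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n = 256                                  # global grid is n x n, assumed divisible by size
local_rows = n // size
# Local block plus one ghost row above and below (an affine global-to-local index shift).
u = np.zeros((local_rows + 2, n))
up   = rank - 1 if rank > 0 else MPI.PROC_NULL
down = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for _ in range(100):
    # Halo exchange: send first interior row up / last interior row down.
    comm.Sendrecv(u[1, :].copy(),  dest=up,   recvbuf=u[-1, :], source=down)
    comm.Sendrecv(u[-2, :].copy(), dest=down, recvbuf=u[0, :],  source=up)
    if rank == 0:
        u[0, :] = 1.0                    # impose a Dirichlet condition on the top edge
    # One Jacobi sweep on the interior points of the local block.
    u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:])

local_norm = np.square(u[1:-1, :]).sum()
global_norm = comm.reduce(local_norm, op=MPI.SUM, root=0)
if rank == 0:
    print("||u||^2 =", global_norm)
```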
Abstract:
Thomas Kuhn’s concept of a normal science paradigm has been utilised and criticised across a range of social science fields. However, Kuhn’s aim was to argue that science progresses not in an incremental manner but through a series of paradigms that need a revolution in thought to shift from one to the next. This paper addresses Kuhn’s work focusing on the totality of his model, but recognising the ambiguities concerning paradigm shifts that have led to charges of relativism. To address this weakness an argument is advanced for a political economy analysis of the publication process and the development of critical accounting research centred on human emancipation. The paper concludes with some suggested research agendas particularly relevant to the Irish context.
Abstract:
The concept of space entered architectural history as late as 1893. Studies in art opened up the discussion, and space has been studied in various ways in architecture ever since. This article aims to instigate an additional reading of architectural history, one that is not organised around "isms" but based on theories of space in the 20th century. The objectives of the article are to bring the concept of space and its changing paradigms to the attention of architectural researchers, to introduce a conceptual framework to classify and clarify theories of space, and to enrich the discussion of 20th century architecture through theories that go beyond styles. The introduction of space in architecture will revolve around subject-object relationships, three-dimensionality and the senses. Modern space will be discussed through concepts such as empathy, perception, abstraction, and geometry. A scientific approach will follow to study the concept of place through environment, event, behavior, and design methods. Finally, the research will look at contemporary approaches related to digitally supported space via concepts like reality-virtuality, mediated experience, and relationships with machines.
Abstract:
Health care research includes many studies that combine quantitative and qualitative methods. In this paper, we revisit the quantitative-qualitative debate and review the arguments for and against using mixed-methods. In addition, we discuss the implications stemming from our view, that the paradigms upon which the methods are based have a different view of reality and therefore a different view of the phenomenon under study. Because the two paradigms do not study the same phenomena, quantitative and qualitative methods cannot be combined for cross-validation or triangulation purposes. However, they can be combined for complementary purposes. Future standards for mixed-methods research should clearly reflect this recommendation.
Abstract:
Despite an abundance of studies on hybridization and hybrid forms of organizing, scholarly work has failed to distinguish consistently between specific types of hybridity. As a consequence, the analytical category has become blurred and lacks conceptual clarity. Our paper discusses hybridity as the simultaneous appearance of institutional logics in organizational contexts, and differentiates the parallel co-existence of logics from transitional combinations (eventually leading to the replacement of a logic) and more robust combinations in the form of layering and blending. While blending refers to hybridity as an ‘amalgamate’ with original components that are no longer discernible, the notion of layering conceptualizes hybridity in a way that the various elements, or clusters thereof, are added on top of, or alongside, each other, similar to sediment layers in geology. We illustrate and substantiate such conceptual differentiation with an empirical study of the dynamics of public sector reform. In more detail, we examine the parliamentary discourse around two major reforms of the Austrian Federal Budget Law in 1986 and in 2007/2009 in order to trace administrative (reform) paradigms. Each of the three identified paradigms manifests a specific field-level logic with implications for the state and its administration: bureaucracy in Weberian-style Public Administration, market-capitalism in New Public Management, and democracy in New Public Governance. We find no indication of a parallel co-existence or transitional combination of logics, but hybridity in the form of robust combinations. We explore how new ideas fundamentally build on – and are made resonant with – the central bureaucratic logic in a way that suggests layering rather than blending. The conceptual findings presented in our article have implications for the literature on institutional analysis and institutional hybridity.