10 results for blackbox


Relevance:

20.00%

Publisher:

Abstract:

Although research on Implicit Leadership Theories (ILTs) has put great effort into determining which attributes define a leader prototype, little attention has been given to understanding the relative importance of each of these attributes in followers' categorization process. Since recognition-based leadership perceptions result from the match between followers' ILTs and the attributes they perceive in their actual leaders, understanding how specific prototypical leader attributes shape this impression-formation process is particularly relevant. In this study, we draw on socio-cognitive theories to explore how followers cognitively process information about a leader's attributes. Using Conjoint Analysis (CA), a technique that measures an individual's trade-offs when choosing among multi-attribute options, we conducted a series of four studies with a total of 879 participants. Our results demonstrate that the importance of attributes in forming leadership perceptions is rather heterogeneous, and that some attributes can enhance or spoil the importance of other prototypical attributes. Finally, by manipulating the leadership domain, we show that the weighting pattern of attributes is context dependent, as suggested by the connectionist approach to leadership categorization. Our findings also demonstrate that Conjoint Analysis can be a valuable tool for ILT research.
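As a rough illustration of the Conjoint Analysis idea described above, the sketch below estimates part-worth utilities and relative attribute importances from ratings of full-factorial leader profiles. The three attribute names, the profiles, and the ratings are all invented for the example; they are not the attributes or data used in the four studies.

```python
import numpy as np

# Hypothetical leader profiles: each row encodes presence (1) / absence (0)
# of three prototypical attributes (names are illustrative only).
attributes = ["intelligence", "sensitivity", "dynamism"]
profiles = np.array([
    [1, 1, 1], [1, 1, 0], [1, 0, 1], [1, 0, 0],
    [0, 1, 1], [0, 1, 0], [0, 0, 1], [0, 0, 0],
])

# Simulated "how much is this person a leader?" ratings, generated so that
# intelligence carries the largest weight (weights 4, 1, 2 plus a baseline).
ratings = profiles @ np.array([4.0, 1.0, 2.0]) + 2.0

# Part-worth utilities via ordinary least squares (intercept + attributes).
X = np.hstack([np.ones((len(profiles), 1)), profiles])
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
part_worths = coef[1:]

# Relative importance: each attribute's utility share of the total.
importance = np.abs(part_worths) / np.abs(part_worths).sum()
for name, w in zip(attributes, importance):
    print(f"{name}: {w:.2f}")
```

With a full-factorial design and noiseless ratings, OLS recovers the generating weights exactly, so the printed importances are 4/7, 1/7 and 2/7.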

Relevance:

10.00%

Publisher:

Abstract:

The book within which this chapter appears is published as a research reference book (not a coursework textbook) on Management Information Systems (MIS) for senior or graduate students in Chinese universities. It is hoped that this chapter, along with the others, will be helpful to MIS scholars and PhD/Masters research students in China who seek an understanding of several central Information Systems (IS) research topics and related issues. The subject of this chapter, 'Evaluating Information Systems', is broad, and cannot be addressed in its entirety in any depth within a single book chapter. The chapter proceeds from the truism that organizations have limited resources, and those resources need to be invested in a way that provides the greatest benefit to the organization. IT expenditure represents a substantial portion of any organization's investment budget, and IT-related innovations have broad organizational impacts. Evaluation of the impact of this major investment is essential to justify the expenditure both pre- and post-investment; evaluation is also important to prioritize possible improvements. The chapter (and most of the literature reviewed herein) admittedly assumes a black-box view of IS/IT, emphasizing measures of its consequences (e.g. for organizational performance or the economy) or perceptions of its quality from a user perspective. This reflects the MIS emphasis, a 'management' emphasis rather than a software engineering emphasis, where a software engineering emphasis might be on technical characteristics and technical performance. Though a black-box approach limits the diagnostic specificity of findings from a technical perspective, it offers many benefits. In addition to superior management information, these benefits may include economy of measurement and comparability of findings (e.g. see Part 4 on Benchmarking IS). The chapter does not purport to be a comprehensive treatment of the relevant literature.
It does, however, reflect many of the more influential works, and a representative range of important writings in the area. The author has been somewhat opportunistic in Part 2, employing a single journal, The Journal of Strategic Information Systems, to derive a classification of literature in the broader domain. Nonetheless, the arguments for this approach are believed to be sound, and the value from the exercise real. The chapter drills down from the general to the specific. It commences with a high-level overview of the general topic area, achieved in two parts: Part 1 addresses existing research in the more comprehensive IS research outlets (e.g. MISQ, JAIS, ISR, JMIS, ICIS), and Part 2 addresses existing research in a key specialist outlet (the Journal of Strategic Information Systems). Subsequently, in Part 3, the chapter narrows to focus on the sub-topic 'Information Systems Success Measurement', then drills deeper to become even more focused in Part 4 on 'Benchmarking Information Systems'. In other words, the chapter drills down from Parts 1 and 2, the value of IS, to Part 3, measuring IS success, to Part 4, benchmarking IS. While the commencing parts (1 and 2) are by definition broadly relevant to the chapter topic, the subsequent, more focused parts (3 and 4) admittedly reflect the author's more specific interests. Thus, the three chapter foci, the value of IS, measuring IS success, and benchmarking IS, are not mutually exclusive; rather, each subsequent focus is in most respects a subset of the former. Parts 1 and 2, 'the Value of IS', take a broad view, with much emphasis on the business value of IS, or the relationship between information technology and organizational performance. Part 3, 'Information System Success Measurement', focuses more specifically on the measures and constructs employed in empirical research into the drivers of IS success (ISS).
DeLone and McLean (1992) inventoried and rationalized disparate prior measures of ISS into six constructs: System Quality, Information Quality, Individual Impact, Organizational Impact, Satisfaction and Use, later suggesting a seventh construct, Service Quality (DeLone and McLean 2003). These six constructs have been used extensively, individually or in combination, as the dependent variable in research seeking to better understand the important antecedents or drivers of IS success. Part 3 reviews this body of work. Part 4, 'Benchmarking Information Systems', drills deeper again, focusing more specifically on a measure of the IS that can be used as a 'benchmark'. This section consolidates and extends the work of the author and his colleagues to derive a robust, validated IS-Impact measurement model for benchmarking contemporary Information Systems. Though IS-Impact, like ISS, has potential value in empirical, causal research, its design and validation have emphasized its role and value as a comparator: a measure that is simple, robust and generalizable, and which yields results that are as far as possible comparable across time, across stakeholders, and across differing systems and system contexts.

Relevance:

10.00%

Publisher:

Abstract:

Once known as Crabb's Creek, Katarapko Creek is a small anabranch of the Murray River, located between the towns of Berri and Loxton in the Riverland region of South Australia. Its 9,000-hectare grey clay floodplain is covered with black box, saltbush and lignum. The creek's horseshoe lagoons, marshes and islands are the traditional lands of the Meru peoples. They fished the creek and surrounding waterways and hunted the wetlands. The ebb and flow of water guided their travels and featured in their stories. The Meru have seen their land and the river change...

Relevance:

10.00%

Publisher:

Abstract:

© 2015 John P. Cunningham and Zoubin Ghahramani. Linear dimensionality reduction methods are a cornerstone of analyzing high-dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data features of interest, such as covariance, dynamical structure, correlation between data sets, input-output relationships, and margin between data classes. Methods have been developed with a variety of names and motivations in many fields, and perhaps as a result the connections between all these methods have not been highlighted. Here we survey methods from this disparate literature as optimization programs over matrix manifolds. We discuss principal component analysis, factor analysis, linear multidimensional scaling, Fisher's linear discriminant analysis, canonical correlations analysis, maximum autocorrelation factors, slow feature analysis, sufficient dimensionality reduction, undercomplete independent component analysis, linear regression, distance metric learning, and more. This optimization framework gives insight into some rarely discussed shortcomings of well-known methods, such as the suboptimality of certain eigenvector solutions. Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input the data and an objective to be optimized, and returns, as output, an optimal low-dimensional projection of the data. This simple optimization framework further allows straightforward generalizations and novel variants of classical methods, which we demonstrate here by creating an orthogonal-projection canonical correlations analysis. More broadly, this survey and generic solver suggest that linear dimensionality reduction can move toward becoming a black-box, objective-agnostic numerical technology.
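The generic-solver idea can be sketched in miniature: PCA written as trace maximization over orthonormal projections (the Stiefel manifold), solved by projected gradient ascent with a QR retraction, which for this objective reduces to orthogonal iteration. The data, step size, and iteration count below are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 samples in 5 dimensions with anisotropic covariance.
Y = rng.normal(size=(200, 5)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.1])
Y -= Y.mean(axis=0)
C = Y.T @ Y / len(Y)

d, r = C.shape[0], 2  # project 5-D data down to 2-D

def objective(M):
    # PCA objective: variance captured by the orthonormal projection M.
    return np.trace(M.T @ C @ M)

# Generic solver: gradient ascent over the Stiefel manifold of orthonormal
# d x r matrices, re-orthonormalizing after each step via a QR retraction.
M = np.linalg.qr(rng.normal(size=(d, r)))[0]
for _ in range(500):
    grad = 2 * C @ M                      # Euclidean gradient of trace(M^T C M)
    M, _ = np.linalg.qr(M + 0.1 * grad)   # take a step, retract to the manifold

# Reference: the classical top-r eigenvector solution.
best = np.linalg.eigvalsh(C)[-r:].sum()
print(objective(M), best)
```

The same loop accepts any differentiable objective in place of `objective`, which is the sense in which the solver is objective-agnostic; only the gradient line changes.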

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a scalable, statistical ‘black-box’ model for predicting the performance of parallel programs on multi-core non-uniform memory access (NUMA) systems. We derive a model with low overhead, by reducing data collection and model training time. The model can accurately predict the behaviour of parallel applications in response to changes in their concurrency, thread layout on NUMA nodes, and core voltage and frequency. We present a framework that applies the model to achieve significant energy and energy-delay-square (ED2) savings (9% and 25%, respectively) along with performance improvement (10% mean) on an actual 16-core NUMA system running realistic application workloads. Our prediction model proves substantially more accurate than previous efforts.
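A minimal sketch of the black-box idea, assuming a simple least-squares surrogate: the model sees only configurations (thread count, core frequency) and measured runtimes, never the program's internals. The feature transform, the synthetic measurements, and the fitted coefficients are invented for illustration and are far simpler than the paper's statistical model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented training data: (threads, frequency in GHz) -> measured runtime (s).
configs = np.array([[t, f] for t in (1, 2, 4, 8, 16)
                           for f in (1.2, 1.6, 2.0, 2.4)])
work = 100.0  # hypothetical total work; runtime ~ work / throughput
runtime = work / (configs[:, 0] * configs[:, 1]) + rng.normal(0, 0.05, len(configs))

# Black-box model: linear regression on the transformed feature
# 1/(threads * freq), chosen because throughput scales with both.
X = np.column_stack([np.ones(len(configs)),
                     1.0 / (configs[:, 0] * configs[:, 1])])
coef, *_ = np.linalg.lstsq(X, runtime, rcond=None)

def predict(threads, freq):
    """Predicted runtime for an arbitrary (threads, freq) configuration."""
    return coef[0] + coef[1] / (threads * freq)

print(predict(8, 2.0))
```

A framework like the one described would then search such predictions over candidate concurrency/frequency settings to pick energy-efficient configurations, rather than measuring every configuration directly.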

Relevance:

10.00%

Publisher:

Abstract:

Continuous casting is a casting process that produces steel slabs in a continuous manner, with steel being poured at the top of the caster and a steel strand emerging from the mould below. Molten steel is transferred from the AOD converter to the caster using a ladle. The ladle is designed to be strong and insulated, but complete insulation is never achieved: some heat is lost to the refractories by convection and conduction, and heat losses by radiation also occur. It is important to know the temperature of the melt during the process. For this reason, an online model was previously developed to simulate the steel and ladle wall temperatures during the ladle cycle. The model was developed as an ODE-based model using a grey-box modelling technique. The model's performance was acceptable, but it needed to be presented in a user-friendly way. The aim of this thesis work was to design a GUI that presents the steel and ladle wall temperatures calculated by the model and also allows the user to make adjustments to the model. This thesis work also discusses the sensitivity analysis of the different parameters involved and their effects on the different temperature estimates.
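A minimal grey-box sketch of the kind of ODE model described, assuming Newton-cooling-style heat flows from the steel to the ladle wall and from the wall to the surroundings: the structure is physically motivated, while the coefficients would be fitted empirically. All values here are invented, not taken from the thesis.

```python
# Grey-box sketch: coupled cooling ODEs for steel melt and ladle wall.
# k_sw and k_wa are the "grey" part -- physically motivated structure with
# empirically fitted constants (the values below are purely illustrative).

def simulate(t_steel=1600.0, t_wall=900.0, t_amb=25.0,
             k_sw=0.002, k_wa=0.001, dt=1.0, minutes=60):
    """Forward-Euler integration (time step dt in minutes) of
    dT_steel/dt = -k_sw * (T_steel - T_wall)
    dT_wall/dt  =  k_sw * (T_steel - T_wall) - k_wa * (T_wall - T_amb)."""
    history = [(t_steel, t_wall)]
    for _ in range(int(minutes / dt)):
        q_sw = k_sw * (t_steel - t_wall)   # steel -> wall heat flow
        q_wa = k_wa * (t_wall - t_amb)     # wall -> ambient heat flow
        t_steel -= q_sw * dt
        t_wall += (q_sw - q_wa) * dt
        history.append((t_steel, t_wall))
    return history

final_steel, final_wall = simulate()[-1]
print(final_steel, final_wall)
```

A GUI of the kind the thesis describes would plot such a trajectory and let the user adjust parameters like `k_sw` and `k_wa`, which is also where a sensitivity analysis of the temperature estimates comes in.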

Relevance:

10.00%

Publisher:

Abstract:

Splenomegaly, albeit variable, is a hallmark of malaria; yet the role of the spleen in Plasmodium infections remains vastly unknown. The implementation of imaging to study the spleen is rapidly advancing our knowledge of this so-called "blackbox" of the abdominal cavity. Not only has ex vivo imaging revealed the complex functional compartmentalization of the organ and its immune effector cells, but it has also allowed the observation of major structural remodeling during infections. In vivo imaging, on the other hand, has allowed quantitative measurements of the dynamic passage of the parasite at spatial and temporal resolution. Here, we review imaging techniques used for studying the malarious spleen, from optical microscopy to in vivo imaging, and discuss the bright prospects that evolving technologies hold for our present understanding of the role of this organ in infections caused by Plasmodium.

Relevance:

10.00%

Publisher:

Abstract:

This work proposes two methods for testing software systems: the first extracts test ideas from a model developed as a hierarchical Petri net, and the second validates the results after test execution using an OWL-S model. These processes increase the quality of the developed system by reducing the risk of insufficient coverage or incomplete testing of a feature. The first technique consists of five steps: i) evaluating the system and identifying its separable modules and entities; ii) surveying the states and transitions; iii) modelling the system (bottom-up); iv) validating the created model by evaluating the flow of each feature; and v) extracting the test cases using one of the three test coverage criteria presented. The second method is applied after the tests have been run and has five steps: i) first, an OWL (Web Ontology Language) model of the system is built containing all significant information about the application's business rules, identifying the classes, properties and axioms that govern it; ii) next, the initial status before execution is represented in the model by inserting the instances (individuals) present; iii) after the test cases have been executed, the model is updated by inserting (without deleting the existing instances) the instances that represent the application's new situation; iv) the next step is to use a reasoner to draw inferences over the OWL model, checking whether the model remains consistent, i.e. whether there are no errors in the application; v) finally, the instances of the initial status are compared with those of the final status, verifying whether elements were changed, created or deleted correctly. The proposed process is indicated mainly for black-box functional testing, but can easily be adapted to white-box testing.
Test cases similar to those that would be obtained by manual analysis were obtained, maintaining the same coverage of the system. The validation proved consistent with the expected results, and the ontological model proved easy and intuitive to maintain.
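Step v of the first method, extracting test cases from a state/transition model under a chosen coverage criterion, can be sketched as follows for an "all transitions" coverage. The login workflow modelled here is an invented example, not a model from this work.

```python
from collections import deque

# Invented state/transition model: state -> list of (event, next_state).
transitions = {
    "logged_out": [("enter_credentials", "authenticating")],
    "authenticating": [("valid", "logged_in"), ("invalid", "logged_out")],
    "logged_in": [("logout", "logged_out")],
}
all_edges = {(s, ev) for s, ts in transitions.items() for ev, _ in ts}

def extract_test_cases(start):
    """BFS over the model, emitting one event path per newly covered transition,
    until every transition has been exercised at least once."""
    covered, cases = set(), []
    queue = deque([(start, [])])
    while queue and covered != all_edges:
        state, path = queue.popleft()
        for event, nxt in transitions.get(state, []):
            if (state, event) in covered:
                continue
            covered.add((state, event))
            cases.append(path + [event])     # one test case per new transition
            queue.append((nxt, path + [event]))
    return cases

for case in extract_test_cases("logged_out"):
    print(" -> ".join(case))
```

Stronger criteria (e.g. covering transition pairs) would enumerate longer paths from the same model; the BFS skeleton stays the same.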

Relevance:

10.00%

Publisher:

Abstract:

This thesis contributes to social studies of finance and accounting (Vollmer, Mennicken, & Preda, 2009) and the practice theory literatures (Feldman & Orlikowski, 2011) by experimenting (Baxter & Chua, 2008) with concepts developed by Theodore Schatzki and demonstrating their relevance and usefulness in theorizing and explaining accounting and other organizational phenomena. Influenced by Schatzki, I have undertaken a sociological investigation of the practices, arrangements, and nexuses forming (part of) the social 'site' of private equity (PE). I have examined and explained the organization of practices within the PE industry. More specifically, I have sought to throw light on the practice organizations animating various PE practices. I have problematized a particular aspect of Schatzki's practice organization framework: 'general understanding', which has so far been poorly understood and taken for granted in the accounting literature. I have tried to further explore the concept to clarify important definitional issues surrounding its empirical application. In investigating the forms of accounting and control practices in PE firms and how they link with other practices forming part of the 'site', I have sought to explain how the 'situated functionality' of accounting is 'prefigured' by its 'dispersed' nature. In doing so, this thesis addresses the recent calls for research on accounting and control practices within financial services firms. This thesis contributes to the social studies of finance and accounting literature also by opening the blackbox of investment [e]valuation practices prevalent in the PE industry. I theorize the due diligence of PE funds as a complex of linked calculative practices and bring to the fore important aspects of the 'practical intelligibility' of the investment professionals undertaking investment evaluation.
I also identify and differentiate the ‘causal’ and ‘prefigurational’ relations between investment evaluation practices and the material entities ‘constituting’ those practices. Moreover, I demonstrate the role of practice memory in those practices. Finally, the thesis also contributes to the practice theory literature by identifying and attempting to clarify and/or improve the poorly defined and/or underdeveloped concepts of Schatzki’s ‘site’ ontology framework.

Relevance:

10.00%

Publisher:

Abstract:

In recent papers, Wied and his coauthors have introduced change-point procedures to detect and estimate structural breaks in the correlation between time series. To prove the asymptotic distribution of the test statistic and stopping time as well as the change-point estimation rate, they use an extended functional Delta method and assume nearly constant expectations and variances of the time series. In this thesis, we allow asymptotically infinitely many structural breaks in the means and variances of the time series. For this setting, we present test statistics and stopping times which are used to determine whether the correlation between two time series is, and stays, constant, respectively. Additionally, we consider estimates for change-points in the correlations. The employed nonparametric statistics depend on the means and variances. These (nuisance) parameters are replaced by estimates in the course of this thesis. We avoid assuming a fixed form for these estimates; rather, we use "blackbox" estimates, i.e. we derive results under assumptions that these estimates fulfill. These results are supplemented with examples. This thesis is organized in seven sections. In Section 1, we motivate the issue and present the mathematical model. In Section 2, we consider a posteriori and sequential testing procedures, and investigate convergence rates for change-point estimation, always assuming that the means and the variances of the time series are known. In the following sections, the assumptions of known means and variances are relaxed. In Section 3, we present the assumptions for the mean and variance estimates that we will use for the mean in Section 4, for the variance in Section 5, and for both parameters in Section 6. Finally, in Section 7, a simulation study illustrates the finite-sample behavior of some of the testing procedures and estimates.
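A hedged sketch of the kind of statistic involved: a CUSUM-type scan of the partial sums of products of the standardized series, whose largest deviation from a linear trend locates a change-point in the correlation. This corresponds to the simplest setting of Section 2, with constant (here globally estimated) means and variances; the simulated data, break location, and correlation levels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated pair of series whose correlation jumps from 0 to 0.9 at t = 500.
n, k_true = 1000, 500
x = rng.normal(size=n)
eps = rng.normal(size=n)
y = np.where(np.arange(n) < k_true, eps, 0.9 * x + np.sqrt(1 - 0.81) * eps)

# Standardize both series, then scan the partial sums of their products:
# under a constant correlation these partial sums grow linearly, so the
# largest deviation from the linear trend points at the change-point.
zx = (x - x.mean()) / x.std()
zy = (y - y.mean()) / y.std()
S = np.cumsum(zx * zy)
k = np.arange(1, n + 1)
D = np.abs(S - k * S[-1] / n)
k_hat = int(np.argmax(D)) + 1
print(k_hat)
```

A formal test would compare the appropriately scaled maximum of `D` against the quantiles of its limiting distribution; the thesis's contribution is to establish such results when the means and variances are themselves replaced by black-box estimates.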