452 results for set theory
Abstract:
The Theory of the Growth of The Firm by Edith Penrose, first published in 1959, is a seminal contribution to the field of management. Penrose's intention was to create a theory of firm growth which was logically consistent and empirically tractable (Buckley and Casson, 2007). Much attention, however, has been focused on her unintended contribution to the resource-based view (henceforth RBV) (e.g. Kor and Mahoney, 2004; Lockett and Thompson, 2004) rather than her firm growth theory. We feel that this is unfortunate because despite a rapidly growing body of empirical work, conceptual advancement in growth studies has been limited (Davidsson and Wiklund, 2000; Davidsson et al., 2006; Delmar, 1997; Storey, 1994). The growth literature frequently references Penrose's work, but little explicit testing of her ideas has been undertaken. This is surprising given that Penrose's work remains the most comprehensive theory of growth to date. One explanation is that she did not formally present her arguments, favouring verbal exposition over formalized models (Lockett, 2005; Lockett and Thompson, 2004). However, the central propositions and conclusions of her theory can be operationalized and empirically tested.
Abstract:
A value-shift began to influence global political thinking in the late 20th century, characterised by recognition of the need for environmentally, socially and culturally sustainable resource development. This shift entailed a move away from thinking of ‘nature’ and ‘culture’ as separate entities – the former existing to serve the latter – toward the possibility of embracing the intrinsic worth of the nonhuman world. Cultural landscape theory recognises ‘nature’ as at once both ‘natural’, and a ‘cultural’ construct. As such, it may offer a framework through which to progress in the quest for ‘sustainable development’. This study makes a contribution to this quest by asking whether contemporary developments in cultural landscape theory can contribute to rehabilitation strategies for Australian open-cut coal mining landscapes. The answer is ‘yes’. To answer the research question, a flexible, ‘emergent’ methodological approach has been used, resulting in the following outcomes. A thematic historical overview of landscape values and resource development in Australia post-1788, and a review of cultural landscape theory literature, contribute to the formation of a new theoretical framework: Reconnecting the Interrupted Landscape. This framework establishes a positive answer to the research question. It also suggests a method of application within the Australian open-cut coal mining landscape, a highly visible exemplar of the resource development landscape. This method is speculatively tested against the rehabilitation strategy of an operating open-cut coal mine, concluding with positive recommendations to the industry, and to government.
Abstract:
Research into hyperinsulinemic laminitis has progressed significantly in recent years with the use of the prolonged-euglycemic, hyperinsulinemic clamp (p-EHC). Previous investigations of laminitis pathophysiology have focused on digital vascular dysfunction, inflammation, altered glucose metabolism within the lamellae, and lamellar basement membrane breakdown by metalloproteinases. The etiopathogenesis of laminitis occurring in association with hyperinsulinemia is yet to be fully characterized, but it may not involve these mechanisms. Insulin stimulates cellular proliferation and can also affect other body systems, such as the insulin-like growth factor (IGF) system. Insulin-like growth factor-1 (IGF-1) is structurally homologous to insulin and, like insulin, binds with strong affinity to a specific tyrosine kinase receptor on the cell surface to produce its effects, which include promoting cell proliferation. Receptors for IGF-1 (IGF-1R) are present in the lamellar epidermis. An alternative theory for the pathogenesis of hyperinsulinemic laminitis is that uncontrolled cell proliferation, mediated through both the insulin receptor (InsR) and IGF-1R, leads to lengthening, weakening, and failure of the lamellae. An analysis of the proliferative activity of lamellar epidermal cells during the developmental and acute phases of hyperinsulinemic laminitis, and lamellar gene expression of the InsR and IGF-1R was undertaken.
Abstract:
Speaker diarization is the process of annotating an input audio signal with information that attributes temporal regions of the signal to their respective sources, which may include both speech and non-speech events. For speech regions, the diarization system also specifies the locations of speaker boundaries and assigns relative speaker labels to each homogeneous segment of speech. In short, speaker diarization systems effectively answer the question of ‘who spoke when’. There are several important applications for speaker diarization technology, such as facilitating speaker indexing systems to allow users to directly access the relevant segments of interest within a given audio, and assisting with other downstream processes such as summarizing and parsing. When combined with automatic speech recognition (ASR) systems, the metadata extracted from a speaker diarization system can provide complementary information for ASR transcripts including the location of speaker turns and relative speaker segment labels, making the transcripts more readable. Speaker diarization output can also be used to localize the instances of specific speakers to pool data for model adaptation, which in turn boosts transcription accuracies. Speaker diarization therefore plays an important role as a preliminary step in automatic transcription of audio data. The aim of this work is to improve the usefulness and practicality of speaker diarization technology, through the reduction of diarization error rates. In particular, this research is focused on the segmentation and clustering stages within a diarization system. Although particular emphasis is placed on the broadcast news audio domain and systems developed throughout this work are also trained and tested on broadcast news data, the techniques proposed in this dissertation are also applicable to other domains including telephone conversations and meeting audio.
Three main research themes were pursued: heuristic rules for speaker segmentation, modelling uncertainty in speaker model estimates, and modelling uncertainty in eigenvoice speaker modelling. The use of heuristic approaches for the speaker segmentation task was first investigated, with emphasis placed on minimizing missed boundary detections. A set of heuristic rules was proposed to govern the detection and heuristic selection of candidate speaker segment boundaries. A second pass, using the same heuristic algorithm with a smaller window, was also proposed with the aim of improving detection of boundaries around short speaker segments. Compared to single threshold based methods, the proposed heuristic approach was shown to provide improved segmentation performance, leading to a reduction in the overall diarization error rate. Methods to model the uncertainty in speaker model estimates were developed, to address the difficulties associated with making segmentation and clustering decisions with limited data in the speaker segments. The Bayes factor, derived specifically for multivariate Gaussian speaker modelling, was introduced to account for the uncertainty of the speaker model estimates. The use of the Bayes factor also enabled the incorporation of prior information regarding the audio to aid segmentation and clustering decisions. The idea of modelling uncertainty in speaker model estimates was also extended to the eigenvoice speaker modelling framework for the speaker clustering task. Building on the application of Bayesian approaches to the speaker diarization problem, the proposed approach takes into account the uncertainty associated with the explicit estimation of the speaker factors. The proposed decision criteria, based on Bayesian theory, were shown to generally outperform their non-Bayesian counterparts.
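The boundary decision this abstract describes can be illustrated with a simplified likelihood-ratio (BIC-style) check between a one-speaker and a two-speaker hypothesis at a candidate boundary. This is a minimal sketch under multivariate Gaussian modelling, not the thesis's exact Bayes factor derivation; the function names and penalty weight are illustrative assumptions.

```python
import numpy as np

def gaussian_loglik(X):
    # Approximate log-likelihood of the data under a single
    # full-covariance Gaussian fitted to X (regularized for stability).
    n, d = X.shape
    cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(d)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * n * (d * np.log(2 * np.pi) + logdet + d)

def bic_boundary_score(X, t, penalty=2.0):
    # Delta-BIC at candidate boundary t: a positive score favours
    # splitting X into two speaker segments X[:t] and X[t:].
    n, d = X.shape
    k = d + d * (d + 1) / 2  # free parameters of one Gaussian
    ll_split = gaussian_loglik(X[:t]) + gaussian_loglik(X[t:])
    ll_full = gaussian_loglik(X)
    return ll_split - ll_full - 0.5 * penalty * k * np.log(n)
```

In a sliding-window segmenter, this score would be evaluated at each candidate boundary and peaks above zero would be retained as speaker-change points.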
Abstract:
We report a comprehensive theoretical study of the reaction of methane with the Fe4 cluster. This Letter gains insight into the mechanism of the reaction and indicates that the Fe4 cluster has a strong catalytic effect on the activation of methane. In detail, the results show that the cleavage of the first C–H bond is both an energetically and kinetically favourable process, and that the breaking of the second C–H bond is the rate-determining step. Moreover, our Letter demonstrates that the cluster size of iron can not only determine the catalytic activity towards methane but also control the product selectivity.
Abstract:
The generation of a correlation matrix from a large set of long gene sequences is a common requirement in many bioinformatics problems such as phylogenetic analysis. The generation is not only computationally intensive but also requires significant memory resources as, typically, only a few gene sequences can be stored simultaneously in primary memory. The standard practice in such computation is to use frequent input/output (I/O) operations. Therefore, minimizing the number of these operations will yield much faster run-times. This paper develops an approach for the faster and scalable computing of large-size correlation matrices through the full use of available memory and a reduced number of I/O operations. The approach is scalable in the sense that the same algorithms can be executed on different computing platforms with different amounts of memory and can be applied to different problems with different correlation matrix sizes. The significant performance improvement of the approach over the existing approaches is demonstrated through benchmark examples.
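The block-wise strategy described above can be sketched as a tiled computation in which only two row-blocks need to be resident at a time; here the per-block I/O is simulated by slicing an in-memory array, and the function name and block size are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def blocked_correlation(X, block=256):
    # Pearson correlation matrix of the rows of X, computed tile by tile.
    # Each (i, j) tile only needs two row-blocks in memory, which is the
    # property that lets large problems fit a limited-memory machine.
    n, _ = X.shape
    Xc = X - X.mean(axis=1, keepdims=True)          # centre each row
    Xc /= np.linalg.norm(Xc, axis=1, keepdims=True)  # unit-normalize rows
    C = np.empty((n, n))
    for i in range(0, n, block):
        for j in range(i, n, block):
            tile = Xc[i:i + block] @ Xc[j:j + block].T
            C[i:i + block, j:j + block] = tile
            C[j:j + block, i:i + block] = tile.T     # exploit symmetry
    return C
```

Because correlation is symmetric, only the upper triangle of tiles is computed, roughly halving both the arithmetic and the number of block reads.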
Abstract:
Creative productivity emerges from human interactions (Hartley, 2009, p. 214). In an era when life is lived in rather than with media (Deuze, this issue), this productivity is widely distributed among ephemeral social networks mediated through the internet. Understanding the underlying dynamics of these networks of human interaction is an exciting and challenging task that requires us to come up with new ways of thinking and theorizing. For example, inducting theory from case studies that are designed to show the exceptional dynamics present within single settings can be augmented today by large-scale data generation and collections that provide new analytic opportunities to research the diversity and complexity of human interaction. Large-scale data generation and collection is occurring across a wide range of individuals and organisations. This offers a massive field of analysis which internet companies and research labs in particular are keen on exploring. Lazer et al. (2009: 721) argue that such analytic potential is transformational for many if not most research fields, but that the use of such valuable data must neither remain confined to private companies and government agencies nor to a privileged set of academic researchers whose studies cannot be replicated nor critiqued. In fact, the analytic capacity to have data of such unprecedented scope and scale available not only requires us to analyse what is and could be done with it and by whom (1) but also what it is doing to us, our cultures and societies (2). Part (1) of such analysis is interested in dependencies and their implications. Part (2) of the enquiry embeds part (1) in a larger context that analyses the long-term, complex dynamics of networked human interaction. From the latter perspective we can treat specific phenomena and the methods used to analyse them as moments of evolution.
Abstract:
X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained by using the local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such a high-resolution data set, the typical data sizes of the CT images are on the order of gigabytes to tens of gigabytes; thus an extremely large number of calculations are required. To resolve this large memory problem, parallelization in OpenMP was used to optimally harness the shared memory infrastructure on cache coherent Non-Uniform Memory Access architecture machines such as the iVEC SGI Altix 3700Bx2 Supercomputer. We see adequate visualization of the results as an important element in this first pioneering study.
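The Hoshen-Kopelman cluster labelling mentioned above can be sketched for a 2-D binary pore grid using union-find label merging. This is a generic textbook version for illustration, not the study's 3-D parallel implementation; the function name and 4-connectivity choice are assumptions.

```python
import numpy as np

def hoshen_kopelman(grid):
    # Label connected pore clusters (4-connectivity) in a binary grid.
    # A raster scan assigns provisional labels; union-find merges labels
    # when a cell touches two differently-labelled neighbours.
    labels = np.zeros_like(grid, dtype=int)
    parent = [0]  # union-find parent array; index 0 means "no cluster"

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra
        return ra

    rows, cols = grid.shape
    for i in range(rows):
        for j in range(cols):
            if not grid[i, j]:
                continue
            up = labels[i - 1, j] if i > 0 else 0
            left = labels[i, j - 1] if j > 0 else 0
            if up and left:
                labels[i, j] = union(up, left)
            elif up or left:
                labels[i, j] = up or left
            else:
                parent.append(len(parent))   # open a new cluster label
                labels[i, j] = len(parent) - 1
    # Second pass: collapse label equivalences to canonical roots.
    for i in range(rows):
        for j in range(cols):
            if labels[i, j]:
                labels[i, j] = find(labels[i, j])
    return labels
```

A percolation check then reduces to asking whether any canonical label appears on both opposite faces of the grid.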
Abstract:
The practical number of charge carriers loaded is crucial to the evaluation of the capacity performance of carbon-based electrodes in service, and cannot be easily addressed experimentally. In this paper, we report a density functional theory study of charge carrier adsorption onto zigzag edge-shaped graphene nanoribbons (ZGNRs), both pristine and incorporating edge substitution with boron, nitrogen or oxygen atoms. All edge substitutions are found to be energetically favorable, especially in oxidized environments. The maximal loading of protons onto the substituted ZGNR edges obeys a rule of [8-n-1], where n is the number of valence electrons of the edge-site atom constituting the adsorption site. Hence, a maximum charge loading is achieved with boron substitution. This result correlates in a transparent manner with the electronic structure characteristics of the edge atom. The boron edge atom, characterized by the most empty p band, facilitates more than the other substitutional cases the accommodation of valence electrons transferred from the ribbon, induced by adsorption of protons. This result not only further confirms the possibility of enhancing charge storage performance of carbon-based electrochemical devices through chemical functionalization but also, more importantly, provides the physical rationale for further design strategies.
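The reported [8-n-1] loading rule is simple to state numerically; a hypothetical helper (the names and valence table are illustrative, taken from standard periodic-table values) makes the boron maximum explicit.

```python
# Valence electron counts of the edge-site atoms considered in the study.
VALENCE = {"B": 3, "C": 4, "N": 5, "O": 6}

def max_proton_loading(atom):
    # Maximal proton loading per edge site under the reported [8-n-1]
    # rule, where n is the valence electron count of the edge atom.
    return 8 - VALENCE[atom] - 1
```

Boron, with the fewest valence electrons (and hence the emptiest p band), gives the highest loading, consistent with the trend the abstract describes.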
Abstract:
Heteroatom doping on the edge of graphene may serve as an effective way to tune chemical activity of carbon-based electrodes with respect to charge carrier transfer in an aqueous environment. In a step towards developing mechanistic understanding of this phenomenon, we explore herein mechanisms of proton transfer from aqueous solution to pristine and doped graphene edges utilizing density functional theory. Atomic B-, N-, and O- doped edges as well as the native graphene are examined, displaying varying proton affinities and effective interaction ranges with the H3O+ charge carrier. Our study shows that the doped edges characterized by more dispersive orbitals, namely boron and nitrogen, demonstrate more energetically favourable charge carrier exchange compared with oxygen, which features more localized orbitals. Extended calculations are carried out to examine proton transfer from the hydronium ion in the presence of explicit water, with results indicating that the basic mechanistic features of the simpler model are unchanged.
Abstract:
Educational reforms currently being enacted in Kuwaiti Family and Consumer Sciences (FCS) in response to contemporary demands for increased student-centred teaching and learning are challenging for FCS teachers due to their limited experience with student-centred learning tools such as Graphic Organisers (GOs). To adopt these reforms, Kuwaiti teachers require a better understanding of and competency in promoting cognitive learning processes that will maximise student-centred learning approaches. This study followed the experiences of four Grade 6 FCS Kuwaiti teachers as they undertook a Professional Development (PD) program specifically designed to advance their understanding of the use of GOs and then as they implemented what they had learned in their Grade 6 FCS classroom. The PD program developed for this study was informed by Nasseh's competency PD model as well as Piaget's and Ausubel's cognitive theories. This model enabled an assessment and evaluation of the development of the teachers' competencies as an outcome of the PD program in terms of the adoption of GOs, in particular, and their capacity to use GOs to engage students in personalised, in-depth learning through critical thinking and understanding. The research revealed that the PD program was influential in reforming the teachers' learning, understanding of, and competency in, cognitive and visual theories of learning, so that they facilitated student-centred teaching and learning processes that enabled students to adopt and adapt GOs in constructivist learning. The implementation of five GOs - Flow Chart, Concept Maps, K-W-L Chart, Fishbone Diagram and Venn Diagram - as learning tools in classrooms was investigated to find if changes in pedagogical approach for supporting conceptual learning through cognitive information processing would reduce the cognitive work load of students and produce better learning approaches. The study, as evidenced by the participant teachers' responses and classroom observations, showed a marked increase in student interest, participation, critical thought and problem-solving skills as a result of using GOs, compared to using traditional teaching and learning methods. A theoretical model was developed from the study based on the premise that teachers' knowledge of the subject, pedagogy and student learning precedes the implementation of student-centred learning reform, that it plays an important role in the implementation of student-centred learning and that it brings about a change in teaching practice. The model affirmed that observed change in teaching practice included aspects of teachers' beliefs, as well as confidence and effect on the workplace and on student learning, including engagement, understanding, critical thinking and problem solving. The model assumed that change in teaching practice is inseparable from teachers' lifelong PD needs related to knowledge, understanding, skills and competency. These findings produced a set of preliminary guidelines for establishing student-centred constructivist strategies in Kuwaiti education while retaining Kuwait's cultural uniqueness.
Abstract:
This thematic issue on education and the politics of becoming focuses on how a Multiple Literacies Theory (MLT) plugs into practice in education. MLT does this by creating an assemblage between discourse, text, resonance and sensations. What does this produce? Becoming AND how one might live are the product of an assemblage (May, 2005; Semetsky, 2003). In this paper, MLT is the approach that explores the connection between educational theory and practice through the lens of an empirical study of multilingual children acquiring multiple writing systems simultaneously. The introduction explicates discourse, text, resonance, sensation and becoming. The second section introduces certain Deleuzian concepts that plug into MLT. The third section serves as an introduction to MLT. The fourth section is devoted to the study by way of a rhizoanalysis. Finally, drawing on the concept of the rhizome, this article exits with potential lines of flight opened by MLT. These are becomings which highlight the significance of this work in terms of transforming not only how literacies are conceptualized, especially in minority language contexts, but also how one might live.
Abstract:
This project was a step forward in developing and evaluating a novel, mathematical model that can deduce the meaning of words based on their use in language. This model can be applied to a wide range of natural language applications, including the information seeking process most of us undertake on a daily basis.
Abstract:
Boards of directors are key governance mechanisms in organizations and fulfill two main tasks: monitoring managers and firm performance, and providing advice and access to resources. In spite of a wealth of research, much remains unknown about how boards attend to the two tasks. This study investigates whether organizational (firm profitability) and environmental factors (industry regulation) affect board task performance. The data combine CEOs' responses to a questionnaire and archival data from a sample of large Italian firms. Findings show that past firm performance is negatively associated with board monitoring and advice tasks; greater industry regulation enhances perceived board task performance; and board monitoring and advice tasks tend to reinforce each other, despite their theoretical and practical distinction.
Abstract:
This research develops a new framework to be used as a tool for analysing and designing walkable communities. The literature review recognises the work of other researchers, combining their findings with the theory of activity nodes, and considers how a framework may be used on a more global basis. The methodology develops a set of criteria through the analysis of noted successful case studies, and this is then tested against an area with very low walking rates in Brisbane, Australia. Results of the study suggest that, as well as the accepted criteria of connectivity, accessibility, safety, security, and path quality, further criteria in the form of planning hierarchy, activity nodes and climate mitigation could be added to allow the framework to cover a broader context. Of particular note is the development of the nodal approach, which allows simple and effective analysis of existing conditions, and may also prove effective as a tool for planning and design of walkable communities.