842 results for normal theory methods
Abstract:
This book gives a general view of sequence analysis, the statistical study of successions of states or events. It includes innovative contributions on life course studies, transitions into and out of employment, contemporaneous and historical careers, and political trajectories. The approach presented in this book is now central to the life-course perspective and the study of social processes more generally. This volume promotes dialogue between approaches to sequence analysis that developed separately, within traditions that differ across regions and disciplines. It includes the latest developments in sequential concepts, coding, atypical datasets and time patterns, optimal matching and alternative algorithms, survey optimization, and visualization. Field studies include original sequential material related to parenting in 19th-century Belgium, higher education and work in Finland and Italy, family formation before and after German reunification, French Jews persecuted in occupied France, long-term trends in electoral participation, and regime democratization. Overall, the book reassesses the classical uses of sequences and promotes new ways of collecting, formatting, representing and processing them. The introduction provides basic sequential concepts and tools, as well as a history of the method. Chapters are presented in a way that is both accessible to the beginner and informative to the expert.
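Since optimal matching is the core technique the volume discusses, a minimal sketch may help orient the beginner. The dynamic-programming formulation below is standard; the state alphabet, cost values, and function name are illustrative assumptions, not taken from the book.

```python
# Optimal matching (OM) distance between two state sequences: the minimal
# total cost of insertions, deletions, and substitutions turning one into
# the other. Costs here are arbitrary placeholders for illustration.

def optimal_matching(a, b, indel=1.0, sub=2.0):
    """Edit distance between state sequences a and b via dynamic programming."""
    n, m = len(a), len(b)
    # d[i][j] = minimal cost to transform a[:i] into b[:j]
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel
    for j in range(1, m + 1):
        d[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0.0 if a[i - 1] == b[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + indel,     # delete a[i-1]
                          d[i][j - 1] + indel,     # insert b[j-1]
                          d[i - 1][j - 1] + cost)  # substitute or match
    return d[n][m]

# Two toy employment trajectories: S = studying, E = employed, U = unemployed
print(optimal_matching("SSEEUE", "SEEEEE"))
```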
Abstract:
A land classification method was designed for the Community of Madrid (CM), whose lands are suitable either for agricultural use or as natural spaces. The process started from an extensive previous CM study containing sets of land attributes with data for 122 land types, together with a minimum-requirements method providing a land quality classification (SQ) for each land type. Borrowing tools from Operations Research (OR) and Decision Science, the SQ has been complemented by an additive valuation method, which analyses a more restricted set of 13 representative attributes using Attribute Valuation Functions to obtain a quality index (QI), and by an original composite method, which uses a fuzzy set procedure to obtain a combined quality index (CQI) containing relevant information from both the SQ and the QI methods.
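A minimal sketch of the additive valuation step may clarify how a QI is built: each attribute is mapped to [0, 1] by a valuation function and the index is their weighted sum. The attribute names, value-function shapes, and weights below are hypothetical placeholders, not the study's 13 attributes.

```python
# Additive valuation: QI = sum of weight_i * value_i(attribute_i),
# with each value function scaled to [0, 1]. All numbers are invented.

def linear_value(x, worst, best):
    """Simple linear attribute valuation function clipped to [0, 1]."""
    v = (x - worst) / (best - worst)
    return max(0.0, min(1.0, v))

# Hypothetical attributes of one land unit: (measured value, worst, best, weight)
attributes = {
    "soil_depth_cm":  (60.0, 0.0, 120.0, 0.4),
    "slope_pct":      (8.0, 30.0, 0.0, 0.3),   # shallower slope is better
    "organic_matter": (2.5, 0.0, 5.0, 0.3),
}

qi = sum(w * linear_value(x, worst, best)
         for x, worst, best, w in attributes.values())
print(f"QI = {qi:.3f}")  # additive quality index in [0, 1]
```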
Abstract:
This article considers the place of qualitative research in psychoanalysis and child psychotherapy. It discusses why research methodology for many years occupied so small a place in these fields, and examines the cultural and social developments since the 1960s which have changed this situation, giving formal methods of research much greater significance. It reflects on the different pressures to develop formal research methods, which arise both from outside the psychoanalytic field, as a condition of its continued professional survival, and from within it, where the main aim is the development of fundamental psychoanalytic knowledge. It suggests that the conduct of mainly quantitative research into treatment outcomes is largely a response to these external pressures, whilst the main benefits to be gained from the development of qualitative research methods, such as Grounded Theory, lie in facilitating the knowledge-generating capacities and achievements of child psychotherapists themselves. The paper describes Grounded Theory methods and explains how they can be valuable in recognising hitherto unrecognised meanings and patterns as these are made visible in clinical practice. Finally, it briefly describes five different examples of completed doctoral studies, all of which have added significantly to the knowledge base of child psychotherapy and demonstrate how much can be accomplished using this method of research.
Abstract:
Book review: Organizations in Time, edited by R. Daniel Wadhwani and Marcelo Bucheli, Oxford University Press, 2014. The title of this edited volume is slightly misleading, as its various contributions explore the potential for more historical analysis in organization studies rather than addressing issues associated with time and organizing. Hopefully this will not distract from the important achievement of this volume—important especially for business historians—in further expanding and integrating business history into management and organization studies. The various contributions, elegantly tied together by R. Daniel Wadhwani and Marcelo Bucheli in their substantial introduction (which, by the way, presents a significant contribution in its own right), open up new sets of questions, especially in terms of future methodological and theoretical developments in the field. This book also reflects the changing institutional location of business historians, who increasingly make their careers in business schools rather than history departments, especially in Europe, reopening old questions of history as a social science. There have been several calls to teach more history in business education, such as the Carnegie Foundation report (2011), which found undergraduate business education too narrow in focus and highlighted the need to integrate more liberal arts teaching into the curriculum. However, in the contemporary research-driven environment of business and management schools, historical understanding is unlikely to permeate the curriculum if historical analysis cannot first deliver significant theoretical contributions. This is the central theme around which this edited volume revolves, and it marks a milestone in this ongoing debate. (In the spirit of full disclosure, I should add that even though I did not contribute to this volume, I have coauthored with several of its contributors and view this book as central to my current research practice.)
Abstract:
We consider the application of normal theory methods to the estimation and testing of a general type of multivariate regression model with errors-in-variables, in the case where various data sets are merged into a single analysis and the observable variables possibly deviate from normality. The various samples to be merged can differ in the set of observable variables available. We show that there is a convenient way to parameterize the model so that, despite the possible non-normality of the data, normal-theory methods yield correct inferences for the parameters of interest and for the goodness-of-fit test. The theory described encompasses both the functional and structural model cases, and can be implemented using standard software for structural equation models, such as LISREL, EQS, and LISCOMP, among others. An illustration with Monte Carlo data is presented.
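To make the normal-theory machinery concrete, here is a minimal sketch for a toy errors-in-variables model, not the paper's merged-samples setup: two noisy indicators of a latent regressor and an outcome, fitted by minimizing the normal-theory ML discrepancy between the sample and model-implied covariance matrices. All names, starting values, and the specific parameterization are assumptions for the illustration.

```python
# Toy structural model: x1 = xi + d1, x2 = xi + d2, y = beta*xi + e,
# with xi ~ N(0, phi). Fit by normal-theory ML on the covariance structure.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, beta_true = 2000, 1.5
xi = rng.normal(0.0, 1.0, n)                  # latent regressor, variance phi
x1 = xi + rng.normal(0.0, 0.5, n)             # indicator 1, error variance td1
x2 = xi + rng.normal(0.0, 0.5, n)             # indicator 2, error variance td2
y = beta_true * xi + rng.normal(0.0, 0.7, n)  # outcome, error variance te
S = np.cov(np.column_stack([x1, x2, y]), rowvar=False)

def implied_cov(theta):
    """Model-implied covariance matrix of (x1, x2, y)."""
    beta, phi, td1, td2, te = theta
    return np.array([
        [phi + td1,  phi,        beta * phi],
        [phi,        phi + td2,  beta * phi],
        [beta * phi, beta * phi, beta**2 * phi + te],
    ])

def ml_discrepancy(theta):
    """Normal-theory ML fit function, up to constants: log|Sigma| + tr(S Sigma^-1)."""
    Sigma = implied_cov(theta)
    sign, logdet = np.linalg.slogdet(Sigma)
    if sign <= 0:
        return np.inf  # reject non-positive-definite candidates
    return logdet + np.trace(S @ np.linalg.inv(Sigma))

res = minimize(ml_discrepancy, x0=[1.0, 1.0, 0.3, 0.3, 0.5], method="Nelder-Mead")
print("beta estimate:", res.x[0])  # should be near the true value 1.5
```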
Abstract:
A new cryptographic hash function, Whirlwind, is presented. We give the full specification and explain the design rationale. We show how the hash function can be implemented efficiently in software and report first performance numbers. A detailed analysis of the security against state-of-the-art cryptanalysis methods is also provided. In comparison to the algorithms submitted to the SHA-3 competition, Whirlwind takes recent developments in cryptanalysis into account by design. Even though software performance is not outstanding, it compares favourably with the 512-bit versions of SHA-3 candidates such as LANE or the original CubeHash proposal, and is roughly on par with ECHO and MD6.
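For readers wondering what lies behind "first performance numbers", a minimal throughput-measurement sketch follows. Whirlwind itself is not available in Python's standard library, so hashlib's SHA-512 stands in purely as an illustrative placeholder.

```python
# Measure one-shot hashing throughput in MB/s; the hash name and buffer
# size are illustrative choices, not the paper's benchmarking setup.
import hashlib
import time

def hash_throughput(hash_name="sha512", size_mb=64):
    """Return hashing throughput in MB/s for a buffer of size_mb megabytes."""
    data = bytes(size_mb * 1024 * 1024)  # zero-filled buffer is fine for timing
    start = time.perf_counter()
    h = hashlib.new(hash_name)
    h.update(data)
    h.digest()
    elapsed = time.perf_counter() - start
    return size_mb / elapsed

print(f"{hash_throughput():.1f} MB/s")
```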
Abstract:
Normal mixture models are being increasingly used to model the distributions of a wide variety of random phenomena and to cluster sets of continuous multivariate data. However, for a set of data containing a group or groups of observations with longer-than-normal tails or atypical observations, the use of normal components may unduly affect the fit of the mixture model. In this paper, we consider a more robust approach by modelling the data by a mixture of t distributions. The use of the ECM algorithm to fit this t mixture model is described, and examples of its use are given in the context of clustering multivariate data in the presence of atypical observations in the form of background noise.
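A minimal EM-style sketch for a one-dimensional two-component t mixture may illustrate why the t model is robust: outlying points receive small weights u. For simplicity the degrees of freedom nu are held fixed here, whereas the full ECM algorithm described in the paper also updates nu; all settings below are assumptions for the demo.

```python
# EM for a univariate two-component t mixture with fixed degrees of freedom.
import numpy as np
from scipy.stats import t as t_dist

rng = np.random.default_rng(1)
x = np.concatenate([rng.standard_t(5, 300) * 1.0 + 0.0,
                    rng.standard_t(5, 300) * 1.5 + 6.0])
nu, K = 5.0, 2
pi = np.full(K, 1.0 / K)            # mixing proportions
mu = np.array([x.min(), x.max()])   # crude initial locations
sigma = np.array([x.std(), x.std()])

for _ in range(200):
    # E-step: responsibilities tau and t robustness weights u
    dens = np.stack([pi[k] * t_dist.pdf(x, nu, loc=mu[k], scale=sigma[k])
                     for k in range(K)])
    tau = dens / dens.sum(axis=0)
    delta = ((x - mu[:, None]) / sigma[:, None]) ** 2
    u = (nu + 1.0) / (nu + delta)   # downweights outlying points
    # M-step: weighted updates of proportions, locations, scales
    pi = tau.mean(axis=1)
    mu = (tau * u * x).sum(axis=1) / (tau * u).sum(axis=1)
    sigma = np.sqrt((tau * u * (x - mu[:, None]) ** 2).sum(axis=1)
                    / tau.sum(axis=1))

print(mu)  # component locations, roughly 0 and 6
```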
Abstract:
The paper presents a theory for modeling flow in anisotropic, viscous rock. This theory was originally developed for the simulation of large-deformation processes, including the folding and kinking of multi-layered visco-elastic rock (Muhlhaus et al. [1,2]). The orientation of slip planes in the context of crystallographic slip is determined by the normal vector, the director, of these surfaces. The model is applied to simulate anisotropic mantle convection. We compare the evolution of flow patterns, Nusselt number, and director orientations for isotropic and anisotropic rheologies. In the simulations we utilize two different finite element methodologies: the Lagrangian Integration Point method of Moresi et al. [8] and an Eulerian formulation, which we implemented in the finite-element-based PDE solver Fastflo (www.cmis.csiro.au/Fastflo/). The reason for utilizing two different finite element codes was, firstly, to study the influence of an anisotropic power-law rheology, which is currently not implemented in the Lagrangian Integration Point scheme [8], and secondly, to study the numerical performance of the Eulerian (Fastflo) and Lagrangian integration schemes [8]. It turned out that, whereas in the Lagrangian method the Nusselt number versus time plot reached only a quasi-steady state, in which the Nusselt number oscillates around a steady-state value, the Eulerian scheme reaches exact steady states and produces a high degree of alignment (director orientation locally orthogonal to the velocity vector almost everywhere in the computational domain). In the simulations, emergent anisotropy was strongest, in terms of modulus contrast, in the up- and down-welling plumes. Mechanisms for anisotropic material behavior in the mantle dynamics context are discussed by Christensen [3]. The dominant mineral phases in the mantle generally do not exhibit strong elastic anisotropy, but they may still be oriented by the convective flow. Thus viscous anisotropy (the main focus of this paper) may or may not correlate with elastic or seismic anisotropy.
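As a small illustration of the alignment diagnostic mentioned above (directors locally orthogonal to the velocity), one could average the cosine of the angle between the two fields over the grid. The function name, field shapes, and the toy data are assumptions; no real convection output is used.

```python
# Mean |cos(angle)| between director n and velocity v on a 2-D grid:
# 0 means directors orthogonal to the flow everywhere (slip planes fully
# aligned with the velocity); 1 means directors parallel to it.
import numpy as np

def mean_alignment(nx, ny, vx, vy, eps=1e-12):
    dot = nx * vx + ny * vy
    norm = np.sqrt((nx**2 + ny**2) * (vx**2 + vy**2)) + eps
    return np.abs(dot / norm).mean()

# Toy check: directors everywhere orthogonal to a uniform horizontal flow
vx, vy = np.ones((64, 64)), np.zeros((64, 64))
nx, ny = np.zeros((64, 64)), np.ones((64, 64))
print(mean_alignment(nx, ny, vx, vy))  # ~0.0, i.e. a high degree of alignment
```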
Abstract:
Taking functional programming to its extremities in search of simplicity still requires integration with other development methods (e.g. formal methods). Induction is the key to deriving and verifying functional programs, but it can be simplified by packaging proofs with functions, particularly folds, on data structures. Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data structures as platonic combinators, the functions characteristic of the data. The link between the two simplifications is that platonic combinators are a kind of partially applied fold, which means that platonic combinators inherit fold-theoretic properties, with some apparent simplifications due to the platonic combinator representation. However, although observable behaviour within functional programming suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
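A minimal sketch of "data as platonic combinators" may help: a list is represented directly by its fold, so using the data is just applying it, with no interpreter over a concrete structure. The encoding below is the standard Church/fold encoding; the names are illustrative, not drawn from the paper.

```python
# A Church-encoded list IS its right fold: it takes (cons_op, nil_val)
# and replays the list's structure through them.
nil = lambda c, n: n

def cons(head, tail):
    return lambda c, n: c(head, tail(c, n))

xs = cons(1, cons(2, cons(3, nil)))         # the list [1, 2, 3] as a combinator

# "Using" the data is just application: the fold is already partially applied.
total = xs(lambda h, acc: h + acc, 0)       # sum -> 6
length = xs(lambda _, acc: acc + 1, 0)      # length -> 3
as_py = xs(lambda h, acc: [h] + acc, [])    # back to a Python list -> [1, 2, 3]
print(total, length, as_py)
```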
Abstract:
This note is motivated by some recent papers treating the problem of the existence of a solution for abstract differential equations with fractional derivatives. We show that the existence results in [Agarwal et al. (2009) [1], Belmekki and Benchohra (2010) [2], Darwish et al. (2009) [3], Hu et al. (2009) [4], Mophou and N'Guerekata (2009) [6,7], Mophou (2010) [8,9], Muslim (2009) [10], Pandey et al. (2009) [11], Rashid and El-Qaderi (2009) [12] and Tai and Wang (2009) [13]] are incorrect, since the variation of constants formulas considered there are not appropriate. In this note, we also present a different approach to treating a general class of abstract fractional differential equations.
Abstract:
Background and purpose: Tinnitus is a frequent disorder that is very difficult to treat, and there is compelling evidence that tinnitus is associated with functional alterations in the central nervous system. Targeted modulation of tinnitus-related cortical activity has been proposed as a promising new treatment approach. We aimed to investigate both the immediate and long-term effects of low-frequency (1 Hz) repetitive transcranial magnetic stimulation (rTMS) in patients with tinnitus and normal hearing. Methods: Using a parallel design, 20 patients were randomized to receive either active or placebo stimulation over the left temporoparietal cortex for five consecutive days. Treatment results were assessed using the Tinnitus Handicap Inventory. Ethyl cysteinate dimer single photon emission computed tomography (SPECT) imaging was performed before and 14 days after rTMS. Results: After active rTMS there was a significant improvement in the tinnitus score, as compared to sham rTMS, for up to 6 months after stimulation. SPECT measurements demonstrated a reduction of metabolic activity in the inferior left temporal lobe after active rTMS. Conclusion: These results support the potential of rTMS as a new therapeutic tool for the treatment of chronic tinnitus, demonstrating a significant reduction of tinnitus complaints over a period of at least 6 months and a significant reduction of neural activity in the inferior temporal cortex, even though the stimulation was applied over the superior temporal cortex.