525 results for Contingency


Relevance: 10.00%

Abstract:

The current U.S. health care system faces numerous environmental challenges. To compete and survive, health care organizations are developing strategies to lower costs and increase efficiency and quality. All of these strategies require rapid and precise decision making by top-level managers. The purpose of this study is to determine the relationship between the environment, made up of unfavorable market conditions and limited resources, and the work roles of top-level managers, specifically in the setting of academic medical centers. Managerial work roles are based on the ten work roles developed by Henry Mintzberg in his book The Nature of Managerial Work (1973). This research utilized an integrated conceptual framework, combining systems theory with role, attribution, and contingency theories, to illustrate that the four most frequently performed of Mintzberg's work roles are affected by the two environmental dimensions. The study sample consisted of 108 chief executive officers in academic medical centers throughout the United States. The methods included qualitative components, in the form of key informants and case studies, and a quantitative component, in the form of a survey questionnaire. Research analysis involved descriptive statistics, reliability tests, correlation, principal component, and multivariate analyses. Results indicated that under the market condition of increased revenue based on capitation, the work roles increased. In addition, under the environmental dimension of limited resources, the work roles increased when uncompensated care increased while Medicare and non-government funding decreased. Based on these results, a typology of health care managers in academic medical centers was created. Managers could be typed as a strategy-formulator, relationship-builder, or task-delegator. Managers who ascertained their types would therefore be able to use this knowledge to build on their strengths and develop their weaknesses. Furthermore, organizations could use the typology to identify appropriate roles and responsibilities of managers for their specific needs. Consequently, this research is a valuable tool for understanding the health care managerial behaviors that lead to improved decision making. At the same time, this could enhance satisfaction and performance and enable organizations to gain a competitive edge.

Relevance: 10.00%

Abstract:

This research is based on the premises that teams can be designed to optimize their performance, and that appropriate team coordination is a significant factor in team outcome performance. Contingency theory argues that the effectiveness of a team depends on the right fit of the team design factors to the particular job at hand. Therefore, organizations need computational tools capable of predicting the performance of different configurations of teams. This research created an agent-based model of teams called the Team Coordination Model (TCM). The TCM estimates the coordination load and performance of a team based on its composition, coordination mechanisms, and the job’s structural characteristics. The TCM can be used to determine the team design characteristics most likely to lead the team to optimal performance. The TCM is implemented as an agent-based discrete-event simulation application built using Java and the Cybele Pro agent architecture. The model implements the effect of individual team design factors on team processes, but the resulting performance emerges from the behavior of the agents. These team member agents use decision making, and explicit and implicit mechanisms, to coordinate the job. The model validation included the comparison of the TCM’s results with statistics from a real team and with the results predicted by the team performance literature. An illustrative 2^(6-1) fractional factorial experimental design demonstrates the application of the simulation model to the design of a team. The ANOVA results have been used to recommend the combination of levels of the experimental factors that optimizes the completion time for a team that runs sailboat races. This research's main contribution to the team modeling literature is a model capable of simulating teams working in complex job environments. The TCM implements a stochastic job structure model capable of capturing some of the complexity not captured by current models. In a stochastic job structure, the tasks required to complete the job change during the team's execution of the job. This research proposed three new types of dependencies between tasks required to model a job as a stochastic structure: the conditional sequential, single-conditional sequential, and merge dependencies.
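The 2^(6-1) fractional factorial design mentioned above can be sketched in a few lines. This is a generic half-fraction construction, not the TCM's actual experiment; the defining relation I = ABCDEF and the factor labels are assumptions chosen purely for illustration.

```python
from itertools import product

# Half-fraction 2^(6-1) design: vary five two-level factors freely (-1/+1)
# and set the sixth via the generator F = A*B*C*D*E (defining relation
# I = ABCDEF). Yields 32 runs instead of the 64 of the full 2^6 design.
def fractional_factorial_2_6_1():
    runs = []
    for a, b, c, d, e in product([-1, 1], repeat=5):
        f = a * b * c * d * e  # aliased generator
        runs.append((a, b, c, d, e, f))
    return runs

runs = fractional_factorial_2_6_1()
print(len(runs))  # 32
```

Effects estimated from such a design are aliased in pairs (each main effect with a five-factor interaction), which is why a half fraction of 32 runs can still support an ANOVA on the main effects.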

Relevance: 10.00%

Abstract:

A pilot study posits that conducting a number of literacy workshops with teenage mothers translates into a greater number of appropriate booksharing skills implemented while reading to the child. The results of one- and two-way ANOVAs and of a contingency table analysis with crosstabs are included.
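A contingency-table analysis of the kind reported here can be sketched as a chi-square test of independence. The counts below are invented for illustration only and are not the study's data.

```python
# Chi-square test of independence on a hypothetical 2x2 contingency table
# (rows: attended workshops yes/no; columns: used booksharing skills yes/no).
table = [[18, 7],
         [9, 16]]

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
grand = sum(row_totals)

chi2 = 0.0
for i, row in enumerate(table):
    for j, observed in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand
        chi2 += (observed - expected) ** 2 / expected

# With 1 degree of freedom, the 0.05 critical value is 3.84.
print(round(chi2, 2))  # 6.52
```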

Relevance: 10.00%

Abstract:

In their discussion - Participative Budgeting and Participant Motivation: A Review of the Literature - by Frederick J. Demicco, Assistant Professor, School of Hotel, Restaurant and Institutional Management, The Pennsylvania State University, and Steven J. Dempsey, Fulton F. Galer, and Martin Baker, Graduate Assistants, College of Business at Virginia Polytechnic Institute and State University, the authors initially observe: “In recent years behavioral literature has stressed the importance of participation in goal-setting by those most directly affected by those goals. The common postulate is that greater participation by employees in the various management functions, especially the planning function, will lead to improved motivation, performance, coordination, and functional behavior. The authors analyze this postulate as it relates to the budgeting process and discuss whether or not participative budgeting has a significant positive impact on the motivations of budget participants.” In defining the concept of budgeting, the authors offer: “Budgeting is usually viewed as encompassing the preparation and adoption of a detailed financial operating plan…” Furthering that statement, they explain that budgeting’s focus is to influence, in a positive way, how managers plan and coordinate the activities of a property so as to enhance their own performance; in essence, framing an organization within its described boundaries and realizing its established goals. The authors will have you know that to control the budget is to control operations. What kind of parallels can be drawn between the technical methods and procedures of budgeting and managerial behavior? “In an effort to answer this question, Ronen and Livingstone have suggested that a fourth objective of budgeting exists, that of motivation,” say the authors with attribution. 
“The managerial function of motivation is manipulative in nature.” Demicco, Dempsey, Galer, and Baker attempt to quantify motivation as a psychological premise using expectancy theory, chosen for its empirical support, intuitive appeal, and ease of application to the budgetary process. They also present House's Path-Goal model, essentially a mathematical formula designed to gauge motivation. You really need to see this. The views of Argyris are also explored in particular detail. Although the Argyris study was primarily aimed at manufacturing firms, and at the effects on line supervisors of the manufacturing budgets used to control and evaluate their performance, its application is relevant to the hospitality industry. As the title suggests, other notables in the field of behavioral motivation theory and participation are also referenced. “Behavioral theory has been moving away from models of purported general applicability toward contingency models that are suited for particular situations,” say the authors in closing. “It is conceivable that some time in the future, contingency models will make possible the tailoring of budget strategies to individual budget holder personalities.”
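The expectancy-theory calculus the authors discuss can be sketched in its general Vroom-style form, where motivational force is expectancy times the instrumentality-weighted valences of outcomes. This is a simplified stand-in for House's Path-Goal formula, and every number below is hypothetical.

```python
# Expectancy-theory sketch (Vroom's general form, not House's exact model):
# motivational force = expectancy * sum(instrumentality_i * valence_i).
# All ratings are hypothetical, on 0-1 scales.
def motivational_force(expectancy, outcomes):
    return expectancy * sum(instr * valence for instr, valence in outcomes)

# A budget holder who believes effort leads to meeting budget (0.8), and that
# meeting budget brings a bonus (instrumentality 0.9, valence 0.7) and
# recognition (instrumentality 0.6, valence 0.5):
mf = motivational_force(0.8, [(0.9, 0.7), (0.6, 0.5)])
print(round(mf, 3))  # 0.744
```

Raising any input (participation plausibly raises expectancy and valence) raises the computed force, which is the intuition behind linking participative budgeting to motivation.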

Relevance: 10.00%

Abstract:

In their discussion - Professionalism and Ethics in Hospitality - by James R. Keiser, Associate Professor, and John Swinton, Instructor, Hotel, Restaurant and Institutional Management, The Pennsylvania State University, Keiser and Swinton initially offer: “Referring to “the hospitality profession” necessitates thinking of the ethics of that profession and how ethics can be taught. The authors discuss what it means for the hospitality industry to be a profession.” The authors will have you know that a cursory nod to the term profession and/or professional is awarded to the hospitality industry at large, at least in an academic sense. Keiser and Swinton also want you to know that ethics and professionalism are distinct concepts; however, they are related. Their intangible nature does make them difficult, at best, to define, but ethics in contemporary hospitality has, to some degree, been charted and quantified. “We have left the caveat emptor era, and the common law, the Uniform Commercial Code, and a variety of local ordinances now dictate that the goods and services hospitality offers carry an implied warranty of merchantability,” the authors inform you. About the symbiotic relationship between ethics and professionalism, the authors say this: “The less precise a code of ethics, as a general rule, the fewer claims the group has to professional status.” The statement above may be considered a cornerstone principle. “However, the mere existence of an ethical code (or of professional status, for that matter) does not ensure ethical behavior in any group,” caution Keiser and Swinton. “Codes of ethics do not really define professionalism except as they adopt a group's special, arcane, exclusionary jargon. Worse, they can define the minimum, agreed-upon standards of conduct and thereby encourage ethical corner-cutting,” they further qualify the thought. 
In bridging academia, Keiser and Swinton say, “Equipped now with a sense of the ironies and ambiguities inherent in labeling any work "professional," we can turn to the problem of instilling in students a sense of what is professionally ethical. Students appear to welcome this kind of instruction, and while we would like to think their interest comes welling up from altruism and intellectual curiosity rather than drifting down as Watergate and malpractice fallout, our job is to teach, not to weigh the motives that bring us our students, and to provide a climate conducive to ethical behavior, not supply a separate answer for every contingency.” Keiser and Swinton illustrate their treatise on ethics via the hypothetical tale [stylized case study] of Cosmo Cuisiner, who manages the Phoenix, a large suburban restaurant. Cosmo is “…a typical restaurant manager faced with a series of stylized, over-simplified, but illustrative decisions, each with its own ethical skew for the students to analyze.” A shortened version of that case study is presented. Figure 1 outlines the State Restaurant Association Code of Ethics.

Relevance: 10.00%

Abstract:

This is an empirical study whose purpose was to examine the process of innovation adoption as an adaptive response by a public organization and its subunits existing under varying degrees of environmental uncertainty. Meshing organizational innovation research and contingency theory to form a theoretical framework, an exploratory case study design was undertaken in a large, metropolitan government located in an area with the fourth highest prevalence rate of HIV/AIDS in the country. A number of environmental and organizational factors were examined for their influence upon decision making in the adoption/non-adoption, as well as the implementation, of any number of AIDS-related policies, practices, and programs. The major findings of the study are as follows. For the county government itself (macro level), no AIDS-specific workplace policies have been adopted. AIDS activities (AIDS education, AIDS Task Force, AIDS Coordinator, etc.), adopted county-wide early in the epidemic, have all been abandoned. Worker infection rates, in the aggregate and throughout the epidemic, have been small. As a result, with co-worker conflict isolated and negligible, no increase in employee health care costs, no litigation regarding discrimination, and no major impact on workforce productivity, AIDS has basically become a non-issue at the strategic core of the organization. At the departmental level, policy adoption decisions varied widely. Here the predominant issue is occupational risk, both objective and perceived. As expected, more AIDS-related activities (policies, practices, and programs) were found in departments with workers known to have significant risk of exposure to the AIDS virus (fire rescue, medical examiner, police, etc.). AIDS-specific policies, in the form of OSHA's Bloodborne Pathogens Standard, took place primarily because they were legislatively mandated. Union participation varied widely, although not necessarily based upon worker risk. 
In several departments, the union was a primary factor bringing about adoption decisions. Additional factors were identified, including the organizational presence of AIDS expertise, the availability of slack resources, and the existence of a policy champion. Other variables, such as subunit size, centralization of decision making, and formalization, were not consistent factors explaining adoption decisions.

Relevance: 10.00%

Abstract:

This series of 5 single-subject studies used the operant conditioning paradigm to investigate, within the two-way influence process, how (a) contingent infant attention can reinforce maternal verbal behaviors during a period of mother-infant interaction and under subsequent experimental manipulation. Differential reinforcement was used to determine whether an infant attending to the mother (denoted by head-turns towards the image of the mother plus eye contact) increases (reinforces) the mother's verbal response (to a cue from the infant) upon which the infant behavior is contingent. There was also (b) an evaluation, during the contrived parent-infant interaction, of concurrent operant learning of infant vocal behavior via contingent verbal responding (reinforcement) implemented by the mother. Further, it was noted (c) whether or not the mother reported being aware that her responses were influenced by the infant's behavior. Findings showed that maternal verbal behaviors were reinforced by contingent infant attention, and that infant vocalizations were reinforced by contingent maternal verbal behaviors. No parent reported (1) being aware of the increase in their verbal responses reinforced during operant conditioning of parental behavior, nor of a decrease in those responses during the DRA reversal phase, or (2) noticing a contingency between the infant's and mother's responses. By binomial 1-tail tests, the verbal-behavior patterns of the 5 mothers were conditioned by infant reinforcement (p < 0.02) and, concurrently, the vocal-response patterns of the 5 infants were conditioned by maternal reinforcement (p < 0.02). A program of systematic empirical research on the determinants of concurrent conditioning within mother-child interaction may provide a way to evaluate the differential effectiveness of interventions aimed at improving parent-child interactions. The work conducted in the present study is one step in this direction.

Relevance: 10.00%

Abstract:

Investigation of the performance of engineering project organizations is critical for understanding and eliminating inefficiencies in today’s dynamic global markets. The existing theoretical frameworks consider project organizations as monolithic systems and attribute the performance of project organizations to the characteristics of their constituents. However, project organizations consist of complex interdependent networks of agents, information, and resources whose interactions give rise to emergent properties that affect the overall performance of project organizations. Yet, our understanding of emergent properties in project organizations and their impact on project performance is rather limited. This limitation is one of the major barriers toward the creation of integrated theories of performance assessment in project organizations. The objective of this paper is to investigate the emergent properties that affect the ability of project organizations to cope with uncertainty. Based on the theories of complex systems, we propose and test a novel framework in which the likelihood of performance variations in project organizations can be investigated based on the environment of uncertainty (i.e., static complexity, dynamic complexity, and external sources of disruption) as well as the emergent properties (i.e., absorptive capacity, adaptive capacity, and restorative capacity) of project organizations. The existence and significance of the different dimensions of the environment of uncertainty and the emergent properties in the proposed framework are tested based on the analysis of information collected from interviews with senior project managers in the construction industry. The outcomes of this study provide a novel theoretical lens for proactive bottom-up investigation of performance in project organizations at the interface of emergent properties and uncertainty.

Relevance: 10.00%

Abstract:

This study examines changes in the Cuban family in the United States produced by time, migration, and the rise of new generations. The thesis uses a data set extracted from the 5% Public Use Microdata Series (PUMS) of the U.S. Decennial Census of Population for the years 1970, 1980, and 1990. Contingency table analysis and comparison of means were used to examine various family-related variables. The analysis points to changes in the traditional Cuban family towards less traditional family arrangements. The multigenerational feature of the Cuban household has diminished as the elderly have become independent and are more likely to be living on their own. Although female labor participation remains high, the occupational patterns of the first generation of Cuban women have diversified, and a new trend has emerged for the second generation: a strong inclination toward white-collar occupations. Fertility rates remain low.
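A comparison of means of the kind used here can be sketched with Welch's t statistic, which does not assume equal group variances. The samples below are fabricated stand-ins, not PUMS data, and the variable and group labels are hypothetical.

```python
import statistics

# Welch's t statistic for a comparison of means between two groups --
# e.g., a family-related variable for first- vs second-generation respondents.
def welch_t(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    vx, vy = statistics.variance(x), statistics.variance(y)  # sample variances
    return (mx - my) / ((vx / len(x) + vy / len(y)) ** 0.5)

gen1 = [3.1, 2.8, 3.4, 3.0, 2.9, 3.3]  # hypothetical household sizes
gen2 = [2.4, 2.6, 2.2, 2.7, 2.5, 2.3]
print(round(welch_t(gen1, gen2), 2))
```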

Relevance: 10.00%

Abstract:

Social contingency is the ability to connect social stimuli, such as behaviors performed by oneself and those performed by others. Detecting social contingencies occurs by means of reciprocity through shared experiences with others. Reciprocity denotes a circumstance in which two individuals participate in a collaborative exchange, and is distinguished from an event in which two individuals engage in separate, unrelated activities. Specifically, reciprocity incorporates joint attention (JA), which occurs when two individuals simultaneously and visually attend to the same item. JA is facilitated by gazing and pointing, whereby one individual initiates the action and the second individual follows suit by, for example, gaze-following. However, little is known about the role the mother may play in the development of JA. The purpose of our study was to investigate social contingency between mothers and infants engaging in dyadic interactions. Thirty-three 12-month-old typically developing infants (M = 12.2, SD = .19; N = 19 males) were filmed for 10 minutes during free play with their mothers and toys provided by an experimenter. Reciprocity was measured by coding mother-infant interactions when a precise chain of events occurred: (1) the mother initiated a bid by introducing a toy/activity or request to the infant, (2) the infant accepted the bid/request by engaging in play with the given toy/activity, and (3) the mother persisted by continuing to engage in play with said toy/activity. We computed a Pearson correlation to assess the relation between the mothers’ initiations of JA and their infants’ responses to JA. We found a moderately positive correlation between the two variables (r = 0.37, p < .05). Our findings suggest that reciprocity during parent-infant dyads, an important component of social relationships, may serve as a scaffold for joint attention abilities, which have been linked to social and language development.
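The Pearson correlation reported above (r = 0.37) can be computed from paired per-dyad counts as follows; the counts here are hypothetical, not the study's data.

```python
import math

# Pearson correlation between per-dyad counts of mothers' initiations of
# joint attention and infants' responses. The paired counts are invented
# for illustration; the study reported r = 0.37 across 33 dyads.
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

initiations = [4, 7, 5, 9, 6, 8, 3, 7]
responses   = [2, 5, 3, 6, 5, 5, 2, 4]
print(round(pearson_r(initiations, responses), 2))  # 0.93
```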

Relevance: 10.00%

Abstract:

This dissertation addresses the memoirs of the Potiguar writer Luís da Câmara Cascudo (1898-1986) through an integrated reading of the four works that comprise them: O Tempo e Eu (1968), Pequeno Manual do Doente Aprendiz (1969), Na Ronda do Tempo (1971), and Ontem (1972). Produced under the contingency of the modernizing movement and urban reform, Cascudo's memoirs evoke the landscapes of old, populated by those who belonged to the romantic, provincial Natal that no longer exists but that still survives in the author's idealized memory and is (re)constructed by him through writing permeated with touches of imagination and a sense of nostalgia. To analyze how Cascudo's memoirist construction takes place, and to reflect on the role that memory plays in the (re)construction of a lost time and space, we draw on the studies of Maurice Halbwachs (2006) and Ecléa Bosi (1994). Within this theoretical framework, we seek, above all, to understand not only how Cascudo's lived experiences feed the matter of his memory, but also how they guide a writing that touches on history and the social frameworks of the past.

Relevance: 10.00%

Abstract:

Purpose: The purpose of this paper is to ascertain how today’s international marketers can perform better on the global scene by harnessing spontaneity. Design/methodology/approach: The authors draw on contingency theory to develop a model of the relationship between spontaneity and international marketing performance, and identify three potential moderators, namely, strategic planning, centralization, and market dynamism. The authors test the model via structural equation modeling with survey data from 197 UK exporters. Findings: The results indicate that spontaneity is beneficial to exporters in terms of enhancing profit performance. In addition, greater centralization and strategic planning strengthen the positive effects of spontaneity. However, market dynamism mitigates the positive effect of spontaneity on export performance (when customer needs are volatile, spontaneous decisions do not function as well in terms of ensuring success). Practical implications: Learning to be spontaneous when making export decisions appears to result in favorable outcomes for the export function. To harness spontaneity, export managers should look to develop company heuristics (increase centralization and strategic planning). Finally, if operating in dynamic export market environments, the role of spontaneity is weaker, so more conventional decision-making approaches should be adopted. Originality/value: The international marketing environment typically requires decisions to be flexible and fast. In this context, spontaneity could enable accelerated and responsive decision-making, allowing international marketers to realize superior performance. Yet, there is a lack of research on decision-making spontaneity and its potential for international marketing performance enhancement.
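The moderation effects reported above can be read off a model with an interaction term. The sketch below uses invented coefficients purely to illustrate how a negative spontaneity-by-dynamism interaction weakens spontaneity's marginal payoff; it is not the paper's estimated model.

```python
# Moderation sketch: y = b0 + b1*spontaneity + b2*dynamism
#                      + b3*(spontaneity * dynamism).
# A negative b3 means market dynamism weakens spontaneity's payoff.
# Coefficients are invented for illustration, not the paper's estimates.
b0, b1, b2, b3 = 1.0, 0.50, 0.10, -0.30

def predicted_performance(spontaneity, dynamism):
    return b0 + b1 * spontaneity + b2 * dynamism + b3 * spontaneity * dynamism

def marginal_effect_of_spontaneity(dynamism):
    return b1 + b3 * dynamism  # slope of performance in spontaneity

print(marginal_effect_of_spontaneity(0.0))  # stable market: full payoff
print(marginal_effect_of_spontaneity(1.0))  # dynamic market: weakened payoff
```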

Relevance: 10.00%

Abstract:

We developed and tested a team-level contingency model of innovation, integrating theories regarding work demands, team reflexivity (the extent to which teams collectively reflect upon their working methods and functioning), and team innovation. We argued that highly reflexive teams will be more innovative than teams low in reflexivity when facing a demanding work environment. The relationships between team reflexivity, a demanding work environment (i.e., quality of the physical work environment and workload), and team innovation were examined among 98 primary health care teams (PHCTs) in the UK, comprising 1,137 individuals. Results showed that team reflexivity is positively related to team innovation, and that there is an interaction between team reflexivity, team-level workload, and team innovation, such that when team-level workload is high, combined with a high level of team reflexivity, team innovation is also higher. The complementary interaction between team reflexivity, quality of the physical work environment, and team innovation showed that when the quality of the work environment is low, combined with a high level of team reflexivity, team innovation is also higher. These results are discussed in the context of the need for team reflexivity and team innovation among teams at work facing high work demands.

Relevance: 10.00%

Abstract:

Many modern applications fall into the category of "large-scale" statistical problems, in which both the number of observations n and the number of features or parameters p may be large. Many existing methods focus on point estimation, despite the continued relevance of uncertainty quantification in the sciences, where, despite the huge increases in n typically seen in many fields, the number of parameters to estimate often exceeds the sample size. Thus, the tendency in some areas of industry to dispense with traditional statistical analysis on the basis that "n = all" is of little relevance outside of certain narrow applications. The main result of the Big Data revolution in most fields has instead been to make computation much harder without reducing the importance of uncertainty quantification. Bayesian methods excel at uncertainty quantification, but often scale poorly relative to alternatives. This conflict between the statistical advantages of Bayesian procedures and their substantial computational disadvantages is perhaps the greatest challenge facing modern Bayesian statistics, and is the primary motivation for the work presented here.

Two general strategies for scaling Bayesian inference are considered. The first is the development of methods that lend themselves to faster computation, and the second is design and characterization of computational algorithms that scale better in n or p. In the first instance, the focus is on joint inference outside of the standard problem of multivariate continuous data that has been a major focus of previous theoretical work in this area. In the second area, we pursue strategies for improving the speed of Markov chain Monte Carlo algorithms, and characterizing their performance in large-scale settings. Throughout, the focus is on rigorous theoretical evaluation combined with empirical demonstrations of performance and concordance with the theory.

One topic we consider is modeling the joint distribution of multivariate categorical data, often summarized in a contingency table. Contingency table analysis routinely relies on log-linear models, with latent structure analysis providing a common alternative. Latent structure models lead to a reduced rank tensor factorization of the probability mass function for multivariate categorical data, while log-linear models achieve dimensionality reduction through sparsity. Little is known about the relationship between these notions of dimensionality reduction in the two paradigms. In Chapter 2, we derive several results relating the support of a log-linear model to nonnegative ranks of the associated probability tensor. Motivated by these findings, we propose a new collapsed Tucker class of tensor decompositions, which bridge existing PARAFAC and Tucker decompositions, providing a more flexible framework for parsimoniously characterizing multivariate categorical data. Taking a Bayesian approach to inference, we illustrate empirical advantages of the new decompositions.
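The latent structure (PARAFAC-style) factorization discussed above expresses the probability mass function of multivariate categorical data as a mixture over latent classes. A minimal sketch for three binary variables, with arbitrary illustrative parameters (not from the thesis):

```python
# Latent class (PARAFAC-style) factorization of a probability tensor:
# p(x1, x2, x3) = sum_h nu[h] * prod_j lam[j][h][x_j].
# Parameter values are arbitrary, chosen only to illustrate the structure.
nu = [0.6, 0.4]                  # latent class weights (sum to 1)
lam = [                          # lam[j][h][x] = P(x_j = x | class h)
    [[0.7, 0.3], [0.2, 0.8]],
    [[0.9, 0.1], [0.5, 0.5]],
    [[0.4, 0.6], [0.3, 0.7]],
]

def prob(x1, x2, x3):
    return sum(nu[h] * lam[0][h][x1] * lam[1][h][x2] * lam[2][h][x3]
               for h in range(len(nu)))

# The 2x2x2 tensor of cell probabilities is a valid pmf: it sums to 1,
# and its nonnegative rank is at most the number of latent classes (2).
total = sum(prob(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
print(round(total, 10))
```

A log-linear model would instead parameterize the log of each cell probability and achieve parsimony through sparsity in interaction terms; relating that support to the nonnegative rank of this tensor is the question Chapter 2 addresses.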

Latent class models for the joint distribution of multivariate categorical data, such as the PARAFAC decomposition, play an important role in the analysis of population structure. In this context, the number of latent classes is interpreted as the number of genetically distinct subpopulations of an organism, an important factor in the analysis of evolutionary processes and conservation status. Existing methods focus on point estimates of the number of subpopulations, and lack robust uncertainty quantification. Moreover, whether the number of latent classes in these models is even an identified parameter is an open question. In Chapter 3, we show that when the model is properly specified, the correct number of subpopulations can be recovered almost surely. We then propose an alternative method for estimating the number of latent subpopulations that provides good quantification of uncertainty, and provide a simple procedure for verifying that the proposed method is consistent for the number of subpopulations. The performance of the model in estimating the number of subpopulations and other common population structure inference problems is assessed in simulations and a real data application.

In contingency table analysis, sparse data is frequently encountered for even modest numbers of variables, resulting in non-existence of maximum likelihood estimates. A common solution is to obtain regularized estimates of the parameters of a log-linear model. Bayesian methods provide a coherent approach to regularization, but are often computationally intensive. Conjugate priors ease computational demands, but the conjugate Diaconis-Ylvisaker priors for the parameters of log-linear models do not give rise to closed form credible regions, complicating posterior inference. In Chapter 4 we derive the optimal Gaussian approximation to the posterior for log-linear models with Diaconis-Ylvisaker priors, and provide convergence rate and finite-sample bounds for the Kullback-Leibler divergence between the exact posterior and the optimal Gaussian approximation. We demonstrate empirically in simulations and a real data application that the approximation is highly accurate, even in relatively small samples. The proposed approximation provides a computationally scalable and principled approach to regularized estimation and approximate Bayesian inference for log-linear models.

Another challenging and somewhat non-standard joint modeling problem is inference on tail dependence in stochastic processes. In applications where extreme dependence is of interest, data are almost always time-indexed. Existing methods for inference and modeling in this setting often cluster extreme events or choose window sizes with the goal of preserving temporal information. In Chapter 5, we propose an alternative paradigm for inference on tail dependence in stochastic processes with arbitrary temporal dependence structure in the extremes, based on the idea that the information on strength of tail dependence and the temporal structure in this dependence are both encoded in waiting times between exceedances of high thresholds. We construct a class of time-indexed stochastic processes with tail dependence obtained by endowing the support points in de Haan's spectral representation of max-stable processes with velocities and lifetimes. We extend Smith's model to these max-stable velocity processes and obtain the distribution of waiting times between extreme events at multiple locations. Motivated by this result, a new definition of tail dependence is proposed that is a function of the distribution of waiting times between threshold exceedances, and an inferential framework is constructed for estimating the strength of extremal dependence and quantifying uncertainty in this paradigm. The method is applied to climatological, financial, and electrophysiology data.

The remainder of this thesis focuses on posterior computation by Markov chain Monte Carlo (MCMC), the dominant paradigm for posterior computation in Bayesian analysis. It has long been common to control computation time by making approximations to the Markov transition kernel, but comparatively little attention has been paid to convergence and estimation error in the resulting approximating Markov chains. In Chapter 6, we propose a framework for assessing when to use approximations in MCMC algorithms, and how much error in the transition kernel should be tolerated to obtain optimal estimation performance with respect to a specified loss function and computational budget. The results require only ergodicity of the exact kernel and control of the kernel approximation accuracy. The theoretical framework is applied to approximations based on random subsets of data, low-rank approximations of Gaussian processes, and a novel approximating Markov chain for discrete mixture models.
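As a toy illustration of the random-subsets-of-data setting (not the chapter's framework), the sketch below runs a random-walk Metropolis chain for the mean of a Gaussian model in which each accept/reject decision uses a log-likelihood ratio estimated from a random subset of the data, scaled by n/batch. Such an approximate kernel is generally biased, cheaper per iteration, and exactly the kind of transition-kernel approximation whose error/computation trade-off the chapter analyzes; all names and tuning constants here are illustrative.

```python
import numpy as np

def subsampled_mh(data, n_steps=2000, batch=500, step=0.05, seed=1):
    """Random-walk Metropolis for theta in a N(theta, 1) model (flat prior),
    where each acceptance ratio uses a log-likelihood difference estimated
    from a random subset of the data, scaled up by n/batch."""
    rng = np.random.default_rng(seed)
    n = len(data)

    def approx_log_ratio(prop, cur):
        # One shared subsample for both states keeps the estimated
        # ratio's variance manageable; the kernel is still approximate.
        sub = rng.choice(data, size=batch, replace=False)
        return -0.5 * (n / batch) * (np.sum((sub - prop) ** 2)
                                     - np.sum((sub - cur) ** 2))

    theta = 0.0
    chain = np.empty(n_steps)
    for t in range(n_steps):
        prop = theta + step * rng.standard_normal()
        if np.log(rng.random()) < approx_log_ratio(prop, theta):
            theta = prop
        chain[t] = theta
    return chain

data = np.random.default_rng(0).normal(2.0, 1.0, size=5000)
chain = subsampled_mh(data)   # concentrates near the data mean of ~2.0
```

Shrinking `batch` lowers the per-iteration cost but inflates the noise in the estimated acceptance ratio, degrading the kernel approximation; that tension is what the chapter's framework quantifies.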

Data augmentation Gibbs samplers are arguably the most popular class of algorithms for approximately sampling from the posterior distribution of the parameters of generalized linear models. The truncated normal and Polya-Gamma data augmentation samplers are standard examples for probit and logit links, respectively. Motivated by an important problem in quantitative advertising, in Chapter 7 we consider the application of these algorithms to modeling rare events. We show that when the sample size is large but the observed number of successes is small, these data augmentation samplers mix very slowly, with a spectral gap that converges to zero at a rate at least proportional to the reciprocal of the square root of the sample size, up to a log factor. In simulation studies, moderate sample sizes result in high autocorrelations and small effective sample sizes. Similar empirical results are observed for related data augmentation samplers for multinomial logit and probit models. When applied to a real quantitative advertising dataset, the data augmentation samplers mix very poorly. Conversely, Hamiltonian Monte Carlo and a type of independence-chain Metropolis algorithm show good mixing on the same dataset.
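The truncated-normal data augmentation sampler referred to above is the standard Albert-Chib scheme for probit models. A minimal intercept-only sketch, run on a rare-events dataset (5 successes in 1000 trials) with a lag-1 autocorrelation diagnostic, illustrates the slow mixing described; the toy dataset and function names are illustrative, not the chapter's experiments.

```python
import numpy as np
from scipy.stats import truncnorm

def probit_da_sampler(y, n_iter=1000, seed=0):
    """Albert-Chib truncated-normal data augmentation for an intercept-only
    probit model P(y_i = 1) = Phi(beta), with a flat prior on beta."""
    rng = np.random.default_rng(seed)
    n = len(y)
    beta = 0.0
    draws = np.empty(n_iter)
    lo = np.where(y == 1, 0.0, -np.inf)   # latent z_i > 0 when y_i = 1
    hi = np.where(y == 1, np.inf, 0.0)    # latent z_i < 0 when y_i = 0
    for t in range(n_iter):
        # sample latent utilities z_i ~ N(beta, 1) truncated by y_i
        z = truncnorm.rvs(lo - beta, hi - beta, loc=beta, scale=1.0,
                          size=n, random_state=rng)
        # conjugate update under a flat prior: beta | z ~ N(mean(z), 1/n)
        beta = rng.normal(z.mean(), 1.0 / np.sqrt(n))
        draws[t] = beta
    return draws

def lag1_autocorr(x):
    x = x - x.mean()
    return (x[:-1] @ x[1:]) / (x @ x)

y = np.zeros(1000)
y[:5] = 1.0                       # rare events: 5 successes out of 1000
draws = probit_da_sampler(y)
rho = lag1_autocorr(draws[500:])  # near 1 here, consistent with slow mixing
```

The per-step conditional standard deviation of beta is 1/sqrt(n), which in the rare-events regime is far smaller than the posterior spread, so the chain creeps: exactly the mechanism behind the vanishing spectral gap described above.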

Resumo:

How do infants learn word meanings? Research has established the impact of both parent and child behaviors on vocabulary development; however, the processes and mechanisms underlying these relationships are still not fully understood. Much of the existing literature focuses on direct paths to word learning, demonstrating that parent speech and child gesture use are powerful predictors of later vocabulary. However, an additional body of research indicates that these relationships do not always replicate, particularly when assessed in different populations, contexts, or developmental periods.

The current study examines the relationships between infant gesture, parent speech, and infant vocabulary over the course of the second year (10-22 months of age). Through the use of detailed coding of dyadic mother-child play interactions and a combination of quantitative and qualitative data analytic methods, the process of communicative development was explored. Findings reveal non-linear patterns of growth in both parent speech content and child gesture use. Analyses of contingency in dyadic interactions reveal that children are active contributors to communicative engagement through their use of gestures, shaping the type of input they receive from parents, which in turn influences child vocabulary acquisition. Recommendations for future studies and the use of nuanced methodologies to assess changes in the dynamic system of dyadic communication are discussed.