898 results for Bayesian shared component model
Abstract:
Given the current context of increasing complexity and organizational change, understanding organizational commitment has become fundamental to human resource management. This study investigates the influence of the Three-Component Model proposed by Meyer and Allen (1991) on the Voice behavioural strategy of the EVLN model proposed by Farrell and Rusbult (Farrell, 1983; Rusbult et al., 1982). It also examines whether this relationship is mediated by Global Job Satisfaction, as defined by Hackman and Oldham (1980). A questionnaire was administered to a probability sample of 144 employees of the company EGEAC, comprising Meyer and Allen's (1997) organizational commitment scale, the Voice scale of the EVLN behavioural strategies model of Farrell and Rusbult (Farrell, 1983; Rusbult et al., 1982), and Hackman and Oldham's (1980) global job satisfaction scale. The results show that the affective and normative commitment components determine the Voice behavioural strategy, and that this relationship is positively mediated by global job satisfaction. As a contribution to human resource management, this study demonstrates the importance of managing affect and responsibilities, given the preponderance of these components.
Abstract:
In the Eady model, where the meridional potential vorticity (PV) gradient is zero, perturbation energy growth can be partitioned cleanly into three mechanisms: (i) shear instability, (ii) resonance, and (iii) the Orr mechanism. Shear instability involves two-way interaction between Rossby edge waves on the ground and lid, resonance occurs as interior PV anomalies excite the edge waves, and the Orr mechanism involves only interior PV anomalies. These mechanisms have distinct implications for the structural and temporal linear evolution of perturbations. Here, a new framework is developed in which the same mechanisms can be distinguished for growth on basic states with nonzero interior PV gradients. It is further shown that the evolution from quite general initial conditions can be accurately described (peak error in perturbation total energy typically less than 10%) by a reduced system that involves only three Rossby wave components. Two of these are counterpropagating Rossby waves—that is, generalizations of the Rossby edge waves when the interior PV gradient is nonzero—whereas the other component depends on the structure of the initial condition and its PV is advected passively with the shear flow. In the cases considered, the three-component model outperforms approximate solutions based on truncating a modal or singular vector basis.
Abstract:
We present a procedure for estimating two quantities defining the spatial externality in discrete choice commonly referred to as 'the neighbourhood effect'. One quantity, the propensity for neighbours to make the same decision, reflects traditional preoccupations; the other quantity, the magnitude of the neighbourhood itself, is novel. Because both quantities have fundamental bearing on the magnitude of the spatial externality, it is desirable to have a robust algorithm for their estimation. Using recent advances in Bayesian estimation and model comparison, we devise such an algorithm and illustrate its application to a sample of northern-Filipino smallholders. We determine that a significant, positive neighbourhood effect exists; that, among the 12 geographical units comprising the sample, the neighbourhood spans a three-unit radius; and that policy prescriptions are significantly altered when calculations account for the spatial externality.
Abstract:
What happens when digital coordination practices are introduced into the institutionalized setting of an engineering project? This question is addressed through an interpretive study that examines how a shared digital model comes to be used in the late design stages of a major station refurbishment project. The paper contributes by mobilizing the idea of 'hybrid practices' to understand the diverse patterns of activity that emerge to manage digital coordination of design. It articulates how engineering and architecture professions develop different relationships with the shared model; the design team negotiates paper-based practices across organizational boundaries; and diverse practitioners probe the potential and limitations of the digital infrastructure. While different software packages and tools have become linked together into an integrated digital infrastructure, these emerging hybrid practices contrast with the interactions anticipated in practice and policy guidance, presenting new opportunities and challenges for managing project delivery. The study has implications for researchers working in the growing field of empirical work on engineering project organizations, as it shows the importance of considering, and suggests new ways to theorise, the introduction of digital coordination practices into these institutionalized settings.
Abstract:
A situation assessment uses reports from sensors to produce hypotheses about a situation at a level of aggregation that is of direct interest to a military commander. A low level of aggregation could mean forming tracks from reports, which is well documented in the tracking literature as track initiation and data association. This paper also discusses higher-level aggregation: assessing the membership of tracks in larger groups. Ideas used in joint tracking and identification are extended, using multi-entity Bayesian networks to model a number of static variables, of which the identity of a target is one. For higher-level aggregation, a scheme for hypothesis management is required. It is shown how an offline clustering of vehicles can be reduced to an assignment problem.
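The final reduction lends itself to a compact illustration. The sketch below is not from the paper; it only shows how track-to-group membership hypotheses could be selected by solving a linear assignment problem with SciPy, using a made-up cost matrix standing in for hypothetical negative log-likelihoods of membership.

```python
# Illustrative sketch only: selecting track-to-group assignments by
# solving a linear assignment problem. The cost matrix is made up and
# stands in for negative log-likelihoods of group membership.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n_tracks, n_groups = 5, 5
cost = rng.uniform(0.0, 10.0, size=(n_tracks, n_groups))  # hypothetical -log P(track i in group j)

rows, cols = linear_sum_assignment(cost)  # optimal one-to-one assignment
for i, j in zip(rows, cols):
    print(f"track {i} -> group {j} (cost {cost[i, j]:.2f})")
print(f"total cost: {cost[rows, cols].sum():.2f}")
```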
Abstract:
We present, pedagogically, the Bayesian approach to composed error models under alternative, hierarchical characterizations; demonstrate, briefly, the Bayesian approach to model comparison using recent advances in Markov Chain Monte Carlo (MCMC) methods; and illustrate, empirically, the value of these techniques to natural resource economics and coastal fisheries management, in particular. The Bayesian approach to fisheries efficiency analysis is interesting for at least three reasons. First, it is a robust and highly flexible alternative to the commonly applied frequentist procedures that dominate the literature. Second, the Bayesian approach is extremely simple to implement, requiring only a modest addition to most natural-resource economists' tool-kits. Third, despite its attractions, applications of Bayesian methodology in coastal fisheries management are few.
Abstract:
The concept of a slowest invariant manifold is investigated for the five-component model of Lorenz under conservative dynamics. It is shown that Lorenz's model is a two-degree-of-freedom canonical Hamiltonian system, consisting of a nonlinear vorticity-triad oscillator coupled to a linear gravity wave oscillator, whose solutions consist of regular and chaotic orbits. When either the Rossby number or the rotational Froude number is small, there is a formal separation of timescales, and one can speak of fast and slow motion. In the same regime, the coupling is weak, and the Kolmogorov–Arnold–Moser theorem is shown to apply. The chaotic orbits are inherently unbalanced and are confined to regions sandwiched between invariant tori consisting of quasi-periodic regular orbits. The regular orbits generally contain free fast motion, but a slowest invariant manifold may be geometrically defined as the set of all slow cores of invariant tori (defined by zero fast action) that are smoothly related to such cores in the uncoupled system. This slowest invariant manifold is not global; in fact, its structure is fractal, but it is of nearly full measure in the limit of weak coupling. It is also nonlinearly stable. As the coupling increases, the slowest invariant manifold shrinks until it disappears altogether. The results clarify previous definitions of a slowest invariant manifold and highlight the ambiguity in the definition of "slowness." An asymptotic procedure, analogous to standard initialization techniques, is found to yield nonzero free fast motion even when the core solutions contain none. A hierarchy of Hamiltonian balanced models preserving the symmetries in the original low-order model is formulated; these models are compared with classic balanced models, asymptotically initialized solutions of the full system, and the slowest invariant manifold defined by the core solutions. The analysis suggests that for sufficiently small Rossby or rotational Froude numbers, a stable slowest invariant manifold can be defined for this system, which has zero free gravity wave activity, but it cannot be defined everywhere. The implications of the results for more complex systems are discussed.
Abstract:
The main objective of this paper is to propose a novel setup that allows estimating separately the welfare costs of the uncertainty stemming from business-cycle fluctuations and from economic-growth variation, when the two types of shocks associated with them (respectively, transitory and permanent shocks) hit consumption simultaneously. Separating these welfare costs requires dealing with degenerate bivariate distributions. Lévy's Continuity Theorem and the Disintegration Theorem allow us to adequately define the one-dimensional limiting marginal distributions. Under Normality, we show that the parameters of the original marginal distributions are not affected, providing the means for calculating separately the welfare costs of business-cycle fluctuations and of economic-growth variation. Our empirical results show that, if we consider only transitory shocks, the welfare cost of business cycles is much smaller than previously thought; indeed, we found it to be negative: -0.03% of per-capita consumption. On the other hand, we found that the welfare cost of economic-growth variation is relatively large. Our estimate for reasonable preference-parameter values shows that it is 0.71% of consumption, or US$ 208.98 per person, per year.
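The paper's bivariate decomposition is not reproduced here; as a simple point of reference, the sketch below computes the textbook Lucas-style approximation to the welfare cost of purely transitory consumption fluctuations under CRRA utility, lambda ≈ (1/2)·gamma·sigma², with assumed, purely illustrative values for gamma and sigma.

```python
# Minimal illustration, not the paper's setup: Lucas-style approximation
# lambda ~= 0.5 * gamma * sigma^2 for the welfare cost of transitory
# consumption fluctuations under CRRA utility. Parameter values are assumed.
gamma = 2.0    # coefficient of relative risk aversion (assumed)
sigma = 0.015  # std. dev. of the transitory component of log consumption (assumed)

welfare_cost = 0.5 * gamma * sigma ** 2
print(f"approximate welfare cost: {100 * welfare_cost:.4f}% of consumption")
```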
Abstract:
This study analyzes the importance of common factors in the recent evolution of metal prices over the period from 1995 to 2013. To this end, cointegrated VAR models and a Bayesian dynamic factor model (DFM) are estimated. Given the effect of the financialization of commodities, the DFM can capture dynamic effects common to all commodities. In addition, panel data are used to exploit the heterogeneity across commodities over the analysis period. Our results show that the interest rate, the effective US dollar exchange rate, and consumption data have a permanent effect on commodity prices. We also find a significant common dynamic factor for most metal commodity prices, which has recently become more important in the evolution of commodity prices.
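As a rough illustration of the kind of common-factor extraction described above, the sketch below fits a one-factor dynamic factor model by maximum likelihood with statsmodels; it is a simplified, non-Bayesian stand-in for the DFM in the study and runs on simulated series rather than actual metal prices.

```python
# Rough sketch: a one-factor dynamic factor model estimated by maximum
# likelihood (a simplified stand-in for the Bayesian DFM described above).
# The series are simulated, not actual metal prices.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
T, n_series = 200, 4
factor = np.zeros(T)
for t in range(1, T):
    factor[t] = 0.8 * factor[t - 1] + rng.normal()  # simulated AR(1) common factor
loadings = rng.uniform(0.5, 1.5, size=n_series)
series = factor[:, None] * loadings + rng.normal(scale=0.5, size=(T, n_series))
data = pd.DataFrame(series, columns=[f"metal_{i}" for i in range(n_series)])

model = sm.tsa.DynamicFactor(data, k_factors=1, factor_order=1)
result = model.fit(disp=False)
print(result.summary())  # estimated loadings and the factor's AR coefficient
```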
Abstract:
In this paper, we decompose the variance of logarithmic monthly earnings of prime-age males into its permanent and transitory components, using a five-wave rotating panel from the Venezuelan "Encuesta de Hogares por Muestreo" from 1995 to 1997. As far as we know, this is the first time a variance components model has been estimated for a developing country. We test several specifications and find that an error components model with individual random effects and first-order serially correlated errors fits the data well. In the simplest model, around 22% of earnings variance is explained by the variance of the permanent component, 77% by purely stochastic variation, and the remaining 1% by serial correlation. These results contrast with studies from industrial countries, where the permanent component is predominant. The permanent component is usually interpreted as the result of productivity characteristics of individuals, whereas the transitory component is due to stochastic perturbations such as job and/or price instability, among others. Our findings may be due to the timing of the panel, which occurred precisely during the macroeconomic turmoil resulting from a severe financial crisis. The findings suggest that earnings instability is an important source of inequality in a region characterized by high inequality and macroeconomic instability.
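As a back-of-the-envelope illustration of such a decomposition (with assumed parameter values, not the paper's estimates), the variance shares implied by an error components model with an individual random effect and an AR(1) error can be computed directly:

```python
# Illustrative only: variance shares implied by y_it = mu_i + e_it with
# e_it = rho * e_{i,t-1} + u_it. All parameter values are assumed.
sigma2_mu = 0.11  # variance of the permanent individual effect (assumed)
sigma2_u = 0.38   # variance of the purely stochastic innovation (assumed)
rho = 0.05        # first-order serial correlation (assumed)

sigma2_e = sigma2_u / (1.0 - rho ** 2)  # stationary variance of the AR(1) error
serial_part = sigma2_e - sigma2_u       # extra variance induced by serial correlation
total = sigma2_mu + sigma2_u + serial_part

print(f"permanent share         : {sigma2_mu / total:.1%}")
print(f"purely stochastic share : {sigma2_u / total:.1%}")
print(f"serial correlation share: {serial_part / total:.1%}")
```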
Abstract:
Credit markets in emerging economies can be distinguished from those in advanced economies in many respects, including the collateral required for households to borrow. This work proposes a DSGE framework to analyze one peculiarity that characterizes the credit markets of some emerging markets: payroll-deducted personal loans. We add the possibility for households to contract long-term debt and compare two different types of credit constraints with one another, one based on housing and the other based on future income. We estimate the model for Brazil using a Bayesian technique. The model is able to solve a puzzle of the Brazilian economy: responses to monetary shocks at first appear to be strong but dissipate quickly. This occurs because income – and the amount available for loans – responds more rapidly to monetary shocks than housing prices. To smooth consumption, agents (borrowers) compensate for lower income and for borrowing by working more hours to repay loans and erase debt in a shorter time. Therefore, in addition to the income and substitution effects, workers consider the effects on their credit constraints when deciding how much labor to supply, which becomes an additional channel through which financial frictions affect the economy.
Abstract:
This work addresses the university-firm relationship, aiming to understand the model of shared R&D management in petroleum between Petrobras and UFRN. It is a case study that investigated whether the cooperation model established by the two institutions fosters innovation, generates technical-scientific knowledge, and contributes to coordination with other actors in the promotion of technological innovation. In addition to desk research, the data needed for the analysis were obtained by sending questionnaires to the coordinators of R&D projects at the company and the university; interviews were also conducted with subjects who have participated in the partnership from its inception to the present day. The case was analysed through the Resource-Based View and interorganizational networks theory. The data show that the research projects were aligned with strategic planning and that 29% of the R&D projects achieved their proposed objectives (of which 11% were incorporated into business processes); that technical and scientific knowledge was produced, comprising hundreds of national and international publications, theses, dissertations, eleven patents, and radical and incremental innovations; and that the partnership has also benefited academic processes through improved infrastructure at UFRN and a change in the "attitude" of the university (which now has national prominence in research and in training staff for the oil sector). As for the model, from a technical point of view, although it has some problems, it is appropriate. From a management viewpoint, the model is criticized for excessive bureaucracy. From the standpoint of strategic resource allocation, the legal framework needs to be reassessed, because it focuses only on higher education; given the new reality of the oil sector in Brazil, it should also reach secondary education, and for this it is desirable to bring local government into the partnership. Taken together, the evidence leads to the conclusion that the model constitutes an innovative organizational arrangement, here called Shared Management of R&D in petroleum between Petrobras and UFRN. The shared management model proves viable and offers a simple and effective way to manage partnerships between firms and science and technology institutions. It was created by contingencies arising from regulatory requirements and resource dependence. The partnership is the result of a process of Convergence, Construction and Evaluation, supported by the tripod of Simplicity, Systematization and Continuity, important factors for its consolidation. In practice, an organizational arrangement was built to manage an innovative university-industry partnership, defined by a dyadic relationship on two levels (institutional and technical, hence hybrid governance), by systematic quarterly meetings, and by a standardized financial contribution proportional to the advancement of the research. These features have established a point of interaction between the scientific and the technological-business dimensions, demystifying the notion that they are two worlds apart.
Abstract:
The development of robots has proven to be a very complex interdisciplinary research field. The predominant procedure in these developments over the last decades is based on the assumption that each robot is a fully personalized project, with hardware and software technologies directly embedded in robot parts with no level of abstraction. Although this methodology has brought countless benefits to robotics research, it has also imposed major drawbacks: (i) the difficulty of reusing hardware and software parts in new robots or new versions; (ii) the difficulty of comparing the performance of different robot parts; and (iii) the difficulty of adapting development needs, at the hardware and software levels, to the expertise of local groups. Large advances might be achieved, for example, if physical parts of a robot could be reused in a different robot constructed with other technologies by another researcher or group. This paper proposes a framework for robots, TORP (The Open Robot Project), that aims to put forward a standardization of all dimensions (electrical, mechanical and computational) of a shared robot development model. This architecture is based on the dissociation between the robot and its parts, and between the robot parts and their technologies. In this paper, the first specification for a TORP family and the first humanoid robot constructed following the TORP specification set are presented, as well as the advances proposed for their improvement.
Abstract:
Among the traits of economic importance in dairy cattle, those related to sexual precocity and herd longevity are essential to the success of the activity, because a cow's stayability in a herd is determined by her productive and reproductive life. In Brazil, there are few studies on the reproductive efficiency of Brown Swiss cows, and no study was found applying survival analysis methodology to this breed. Thus, in the first chapter of this study, the age at first calving of Brown Swiss heifers was analyzed as the time until the event, using the nonparametric Kaplan-Meier method and the gamma shared frailty model, under the survival analysis methodology. Survival and hazard rate curves associated with this event were estimated, and the influence of covariates on this time was identified. The mean and median times to first calving were 987.77 and 1,003 days, respectively, and the covariates significant by the Log-Rank test, through the Kaplan-Meier analysis, were birth season, calving year, sire (cow's father) and calving season. In the analysis with the frailty model, the breeding values and frailties of the sires (fathers) for calving were predicted by modeling the risk function of each cow as a function of birth season as a fixed covariate and sire as a random covariate; the frailty followed a gamma distribution. Sires with high, positive breeding values have high frailties, which means a shorter survival time of their daughters to the event, i.e., a reduction in their age at first calving. The second chapter aimed to evaluate the longevity of dairy cows using the nonparametric Kaplan-Meier method and the Cox and Weibull proportional hazards models. A total of 10,000 records of the longevity trait of Brown Swiss cows were simulated, involving their respective times until the occurrence of five consecutive calvings (the event), considered here as typical of a long-lived cow. The covariates considered in the database were age at first calving, herd and sire (cow's father). All covariates influenced the longevity of cows according to the Log-Rank and Wilcoxon tests. The mean and median times to the occurrence of the event were 2,436.285 and 2,437 days, respectively. Sires with higher breeding values also have a greater risk that their daughters reach the five consecutive calvings by 84 months.
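For readers unfamiliar with these tools, the sketch below shows the general shape of such an analysis in Python with the lifelines package: a Kaplan-Meier fit for time to first calving and a Cox proportional hazards fit with a single covariate. It uses simulated data and a plain Cox model, not the thesis dataset or its gamma shared frailty model.

```python
# Illustrative sketch (simulated data, plain Cox model rather than the
# gamma shared frailty model used in the thesis).
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "days_to_first_calving": rng.gamma(shape=20.0, scale=50.0, size=n),  # simulated times
    "observed": rng.binomial(1, 0.9, size=n),    # 1 = calving observed, 0 = censored
    "birth_season": rng.integers(0, 2, size=n),  # hypothetical 0/1 covariate
})

kmf = KaplanMeierFitter()
kmf.fit(df["days_to_first_calving"], event_observed=df["observed"])
print(kmf.median_survival_time_)  # median time to first calving

cph = CoxPHFitter()
cph.fit(df, duration_col="days_to_first_calving", event_col="observed")
cph.print_summary()
```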
Abstract:
The constant increase in the complexity of computer applications demands the development of more powerful hardware to support them. With processor operating frequencies reaching their limit, the most viable solution is the use of parallelism. The concept of MPSoCs (Multi-Processor Systems-on-Chip) builds on parallelism techniques and on the progressive growth in the number of transistors that can be integrated on a single chip. MPSoCs will eventually become a cheaper and faster alternative to supercomputers and clusters, and applications developed for these high-performance systems will migrate to computers equipped with MPSoCs containing dozens to hundreds of computation cores. In particular, applications in the area of oil and natural gas exploration are also characterized by the high processing capacity they require and would benefit greatly from these high-performance systems. This work evaluates a traditional and complex application of the oil and gas industry, known as reservoir simulation, developing a solution for integrated computational systems on a single chip with hundreds of functional units. To this end, since the STORM (MPSoC Directory-Based Platform) platform already provides a shared memory model, a new distributed memory model was developed, and a message-passing library following the MPI standard was also developed.
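As a flavour of the message-passing style such a library supports, the sketch below uses mpi4py to pass a made-up block of reservoir data between two ranks; it illustrates standard MPI point-to-point primitives, not the library developed in this work.

```python
# Minimal point-to-point message-passing sketch with mpi4py. This is a
# generic MPI example, not the library developed for the STORM platform.
# Run with, e.g.: mpiexec -n 2 python send_recv.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    payload = {"cell_block": 0, "pressure": [101.3, 99.8, 98.1]}  # made-up reservoir data
    comm.send(payload, dest=1, tag=11)
    print("rank 0 sent a block to rank 1")
elif rank == 1:
    data = comm.recv(source=0, tag=11)
    print(f"rank 1 received: {data}")
```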