965 results for Take Two Interactive, business strategy, video games
Abstract:
This study aims to estimate the time that different economic segments take to respond to changes in the interest rate. The investigation covered the period from January 1990 to June 1996, thus encompassing the Collor Plan and the first two years of the Real Plan. The purpose of measuring this time lag stems from the fact that, in Brazil, it has become customary to adopt as benchmarks for the response time to monetary policy those found in the United States, where society's reaction time to monetary policy has frequently been estimated at between 6 and 12 months. This study found that, in Brazil, depending on the segment (savings, consumption or production), the response time ranges from immediate to 2 or 3 months. To estimate the time lag, cross-correlations between variables pre-filtered by their respective ARIMA models were used.
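A minimal sketch of the lag-estimation approach described above: prewhiten each monthly series with its own ARIMA filter, then inspect the cross-correlation function of the residuals. The series, ARIMA orders and data below are illustrative assumptions, not the study's actual choices.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.stattools import ccf

def prewhitened_ccf(x: pd.Series, y: pd.Series, order=(1, 0, 0), max_lag=12):
    """Cross-correlate the ARIMA residuals of two monthly series."""
    rx = ARIMA(x, order=order).fit().resid
    ry = ARIMA(y, order=order).fit().resid
    return ccf(rx, ry)[:max_lag]          # correlations at lags 0..max_lag-1

# Synthetic example: interest rate vs. a consumption proxy, Jan/1990 - Jun/1996.
rng = np.random.default_rng(0)
rate = pd.Series(rng.normal(size=78))
consumption = pd.Series(np.roll(rate, 2) + rng.normal(scale=0.5, size=78))
lags = prewhitened_ccf(rate, consumption)
print("strongest response at lag (months):", int(np.argmax(np.abs(lags))))
```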
Abstract:
This dissertation analyzes the connection between the public debt market and monetary policy in Brazil. Based on a Vector Autoregression (VAR), two alternative proxies for inflation risk were used to show that positive shocks to inflation risk raise both the market's inflation expectations and the future rates of the Pré x DI swap. Next, based on the dynamic inconsistency model of Blanchard and Missale (1994) and using the Johansen methodology, it was found that an increase in future rates shortens the maturity of the public debt in the long run. The results lead to two conclusions: inflation risk 1) makes it harder for the government to place nominal (non-indexed) bonds in the market, producing a debt profile shorter than the ideal, and 2) makes monetary policy more costly.
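A minimal sketch of the two empirical steps described above, with synthetic data and illustrative column names standing in for the dissertation's actual series: (i) a VAR to trace the response of inflation expectations and swap rates to an inflation-risk shock, and (ii) a Johansen cointegration test between swap rates and debt maturity.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# Synthetic monthly data standing in for the dissertation's series.
rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    "inflation_risk": rng.normal(size=n).cumsum(),
    "infl_expectations": rng.normal(size=n).cumsum(),
    "swap_pre_di": rng.normal(size=n).cumsum(),
    "debt_maturity": rng.normal(size=n).cumsum(),
})

# Step 1: VAR to trace how expectations and swap rates react to a risk shock.
var_res = VAR(df[["inflation_risk", "infl_expectations", "swap_pre_di"]]).fit(maxlags=6, ic="aic")
irf = var_res.irf(12)                        # impulse responses over 12 months
irf.plot(impulse="inflation_risk")           # responses of all variables to a risk shock

# Step 2: Johansen cointegration test between swap rates and debt maturity.
jres = coint_johansen(df[["swap_pre_di", "debt_maturity"]], det_order=0, k_ar_diff=1)
print("trace statistics:", jres.lr1)
print("5% critical values:", jres.cvt[:, 1])
```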
Abstract:
The first essay, "Determinants of Credit Expansion in Brazil", analyzes the determinants of credit using an extensive bank-level panel dataset. The Brazilian economy experienced a major boost in leverage in the first decade of the 2000s as a result of a set of factors ranging from macroeconomic stability to the abundant liquidity in international financial markets before 2008, together with a set of deliberate decisions taken by President Lula's administration to expand credit, boost consumption and gain political support from the lower social strata. Our main findings are that credit expansion relied on the reduction of the monetary policy rate, that international financial markets are an important source of funds, and that payroll-guaranteed credit and investment-grade status positively affected credit supply. We were not able to confirm the importance of financial inclusion efforts. The importance of financial sector soundness indicators for credit conditions cannot be underestimated. These results raise questions about the sustainability of this expansion process and about financial stability in the future. The second essay, "Public Credit, Monetary Policy and Financial Stability", discusses the role of public credit. The supply of public credit in Brazil successfully served to relaunch the economy after the Lehman Brothers collapse. It was later turned into a driver of economic growth as well as a regulatory device to force private banks to reduce interest rates. We argue that the use of public funds to finance economic growth has three important drawbacks: it generates inflation, induces higher loan rates and may induce financial instability. An additional effect is that it prevents market-based credit solutions from emerging. This study contributes to the understanding of the costs and benefits of credit as a fiscal policy tool. The third essay, "Bayesian Forecasting of Interest Rates: Do Priors Matter?", discusses the choice of priors when forecasting short-term interest rates. Central banks that commit to an inflation-targeting monetary regime are bound to respond to spikes in inflation expectations and to a widening output gap in a clear and transparent way by abiding by a Taylor rule. There are various reports of central banks being more responsive to inflationary than to deflationary shocks, which makes the monetary policy response non-linear. Moreover, there is no guarantee that the coefficients remain stable over time. Central banks may also switch to a dual-target regime that weighs deviations of both inflation and the output gap. The estimation of a Taylor rule may therefore have to consider a non-linear model with time-varying parameters. This essay uses Bayesian forecasting methods to predict short-term interest rates. We take two different approaches: from a theoretical perspective we focus on an augmented version of the Taylor rule, including the real exchange rate, the credit-to-GDP ratio and the net public debt-to-GDP ratio; we also take an "atheoretic" approach based on the expectations theory of the term structure to model short-term interest rates. The selection of priors is particularly relevant for predictive accuracy, yet, ideally, forecasting models should require as little a priori expert insight as possible. We present recent developments in prior selection; in particular, we propose the use of hierarchical hyper-g priors for better forecasting in a framework that can easily be extended to other key macroeconomic indicators.
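For reference, a minimal sketch of the kind of augmented Taylor rule the third essay alludes to; the smoothing structure, symbols and extra regressors below are illustrative assumptions, not the essay's exact specification:

$$ i_t = \rho\, i_{t-1} + (1-\rho)\Big[ r^{*} + \pi_t + \phi_{\pi}(\pi_t - \pi^{*}) + \phi_{y}\,\tilde{y}_t + \phi_{q}\, q_t + \phi_{c}\,\tfrac{Credit_t}{GDP_t} + \phi_{b}\,\tfrac{Debt_t}{GDP_t} \Big] + \varepsilon_t $$

where $i_t$ is the short-term policy rate, $\pi_t$ inflation, $\tilde{y}_t$ the output gap, $q_t$ the real exchange rate, and the last two regressors the credit-to-GDP and net-public-debt-to-GDP ratios; the Bayesian exercise then amounts to placing priors (for instance, hierarchical hyper-g priors) on the $\phi$ coefficients.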
Abstract:
This paper explores the evolution of the cross-section income distribution in economies where endogenous neighborhood formation interacts with positive within-neighborhood feedback effects. We study an economy in which the economic success of adults is determined by the characteristics of the families in the neighborhood in which a person grows up. These feedbacks take two forms. First, the tax base of a neighborhood affects the level of education investment in offspring. Second, the effectiveness of education investment is affected by a neighborhood's income distribution, reflecting factors such as role model or labor market connection effects. Conditions are developed under which endogenous stratification, defined as the tendency for families with similar incomes to choose to form common communities, will occur. When families are allowed to choose their neighborhoods, wealthy families will have an incentive to segregate themselves from the rest of the population. This resulting stratification is supported by house price differences between rich and poor communities. Endogenous stratification can lead to pronounced intertemporal inequality as different families provide very different interaction environments for offspring. When the transformation of human capital into income exhibits constant returns to scale, cross-section income differences may also grow across time. As a result, endogenous stratification and neighborhood feedbacks can interact to produce long run inequality.
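As a rough illustration of the mechanism described above (a stylized reading under assumed functional forms, not the paper's actual equations), offspring income can be written as depending on neighborhood-financed education and on the local income distribution:

$$ y_{i,t+1} = h\big(e_{n,t},\ \mu_{n,t}\big)\,\xi_{i,t+1}, \qquad e_{n,t} = \tau\,\bar{y}_{n,t} $$

where $e_{n,t}$ is education investment financed out of neighborhood $n$'s tax base (average income $\bar{y}_{n,t}$ taxed at rate $\tau$), $\mu_{n,t}$ is the neighborhood income distribution capturing role-model and labor-market-connection effects, and $\xi_{i,t+1}$ is an idiosyncratic shock; when $h$ exhibits constant returns to scale in incomes, initial differences across stratified neighborhoods can grow over time.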
Abstract:
Transmedia storytelling is a phenomenon that has only recently been conceptualized theoretically (JENKINS, 2009), arising from fictional media products and later extended to other fields. This research aims to analyze how transmedia storytelling can be applied to journalism, taking a specific genre, the news report, as its basis. To that end, it draws on theoretical developments by Brazilian authors (FECHINE et al., 2011, 2012, 2013), grounded in television fiction, which produced concepts such as transmediation and transmedia content and deepened understanding and research in this area. From there, this study problematizes and appropriates this theoretical framework for the field of journalism, approaching journalistic production both at the level of discourse and at the level of its social practices (newsmaking). The empirical research takes two different paths. First, it analyzes a set of news reports in which transmediation occurs, in order to verify how the transmedia phenomenon, and transmedia storytelling in particular, is configured and what is specific to journalism. Then, it develops an investigation, based on the ethnographic method, of the production routine of the special reporting section of the Jornal do Commercio (Recife/PE), aiming to examine the conditions for transmediation in this setting and the practices that favor or hinder the use of transmedia storytelling. The results are then compared and related, with the goal of providing a multidimensional view of the phenomenon.
Abstract:
Preface
This study was prepared for the Government of Jamaica following the significant physical damage and economic losses that the country sustained as a result of the flood rains associated with the development of Hurricane Michelle. The Planning Institute of Jamaica (PIOJ) submitted a request to the Economic Commission for Latin America and the Caribbean (ECLAC) on 14 November 2001 for assistance in undertaking a social, environmental and economic impact assessment. ECLAC responded with haste and modified its work plan to accommodate the request. A request for training in the use of the ECLAC Methodology, to be delivered to personnel in Jamaica, was deferred until the first quarter of 2002, as it was impossible to mount such an initiative at short notice. This appraisal considers the consequences of the three instances of heavy rainfall that brought on the severe flooding and the loss of property and livelihoods. The study was prepared by three members of the ECLAC Natural Disaster Damage Assessment Team over a period of one week in order to comply with the request that it be presented to the Prime Minister on 3 December 2001. The team endeavoured to complete a workload that would normally take two weeks with a team of 15 members, working assiduously with data already prepared in preliminary form by the national emergency stakeholders. There is a need for training in disaster assessment, as evidenced by the data collected by the Jamaican officials engaged in the exercise. Their efforts in the future will be more focused and productive after they have received training in the use of the ECLAC Methodology. This study undertakes a sectoral analysis leading to an overall assessment of the damage. It appraises the macroeconomic and social effects and proposes some guidelines for action, including mitigating actions subsequent to the devastation caused by the weather system. The team is grateful to the Office of Disaster Preparedness and Emergency Management (ODPEM), the associated government ministries and agencies, the Statistical Institute of Jamaica (STATIN), the Planning Institute of Jamaica and the Inter-American Development Bank (IDB) for the assistance rendered to the team. Indeed, the team recommends that STATIN, which is poised to play a pivotal role in any disaster damage assessment, be taken on board in that regard. The direct and indirect damages have been assessed in accordance with the methodology developed by ECLAC (1). The results presented are based on the mission's estimates. The study incorporates the information made available to the team and the evidence collected in interviews and visits to affected locations. It is estimated that the magnitude of the losses exceeds the country's capacity to address reparations and mitigation without serious dislocation of its development trajectory. The government may wish to approach the international community for assistance in this regard. This appraisal is therefore designed to provide the government and the international community with guidelines for setting national and regional priorities in rehabilitation and reconstruction or resettlement programmes. A purely economic conception of the problem would be limited. A more integrated approach would have a human face and consider the alleviation of human suffering in the affected areas while attending to the economic and fiscal fallout of the disaster.
Questions of improved physical planning, watershed management, early warning, emergency response and structural preparedness for evacuation and sheltering the vulnerable population are seen as important considerations for the post-disaster phase. Special attention and priority should be placed on including sustainability and strengthened governance criteria in making social and productive investments, and on allocating resources to the reinforcing and retrofitting of vulnerable infrastructure, basic lifelines and services as part of the reconstruction and rehabilitation strategy. Jamaican society and the government have the opportunity to act with the benefit of revised paradigms, embarking on institutional, legal and structural reforms to reduce economic, social and environmental vulnerability. The history of flood devastation in the very areas of Portland and St. Mary shows a recurrence of flooding: accounts are available from the earliest records, pertaining to 1837, and recurrences in 1937, 1940, 1943 and 2001 indicate an ever-present probability of similar events. The Government may wish to consider the probable consequences of part of its population living in flood plains and address its position vis-à-vis land use and the probability of yet another recurrence of flood rains. (1) ECLAC/IDNDR, Manual for Estimating the Socio-Economic Effects of Natural Disasters, May 1999.
Abstract:
Graduate Program in History - FCLAS
Abstract:
In this work, we investigate the role that interactive components frequently used in the construction of educational computer interfaces play in the student's exploratory behavior and in the learning of mathematical concepts. The following components were selected for this research: the combo box and the text field. From an educational point of view, these components play distinct roles: the first guides the student's choices during an exploratory process, while the second offers no guidance. To compare the role of these components, we developed two interactive computer interfaces through which the student can explore the graphical behavior of a linear (first-degree) function. The two interfaces are identical except for the interactive component employed: one uses the combo box and the other the text field. Both exploratory behavior and performance on knowledge tests were evaluated from direct measurements recorded by the interfaces themselves. Exploratory behavior was evaluated through the number and type of student interactions with the interactive component; this record is one of the distinctive features of this research, since it allows some student behaviors to be observed during the interaction with the interface, and not only before and after it. Within the limitations of the data collection tool used in this research, learning was measured by comparing performance on knowledge tests applied before and after the students used the interactive components. In this context, significant differences in the role of each component in exploratory behavior and in learning were observed.
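A minimal sketch of the two interface variants described above, assuming a tkinter implementation (the study's actual interfaces are not specified): the same linear-function display is driven either by a combo box with preset coefficients or by a free text field, and every interaction is logged, mirroring the direct measurements mentioned in the abstract.

```python
import tkinter as tk
from tkinter import ttk
from datetime import datetime

log = []  # (timestamp, component, value) record of each student interaction

def record(component, value):
    log.append((datetime.now().isoformat(timespec="seconds"), component, value))
    print(f"y = {value} x + 1")   # stand-in for redrawing the graph of the linear function

root = tk.Tk()

# Variant 1: the combo box constrains exploration to preset slope values.
combo = ttk.Combobox(root, values=["-2", "-1", "0", "1", "2"], state="readonly")
combo.bind("<<ComboboxSelected>>", lambda e: record("combo_box", combo.get()))
combo.pack()

# Variant 2: the text field leaves the choice of slope entirely open.
entry = tk.Entry(root)
entry.bind("<Return>", lambda e: record("text_field", entry.get()))
entry.pack()

root.mainloop()
```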
Abstract:
The analysis of the ongoing debate within the European institutions on net neutrality made it possible to understand how web companies must interact not only with their own market, but also with an environment in which other actors define rules and influence social contexts. This environment is made up of the social, political and legal factors that interact with and influence markets and private agreements from the outside. It therefore comprises all the economic interactions that have a public actor as intermediary, in this case the European institutions, and in which, because of this presence, the nature of agreements does not respond strictly to economic logic but rather to political logic.
Abstract:
The speed of change that characterizes the market has drawn the attention of many firms to Business Analysis. The collection, management and analysis of data is bringing numerous benefits in terms of efficiency and competitive advantage, made possible by the concrete support that data gives to corporate strategy. This thesis proposes an application of Business Analytics in the field of human resources. Valuing Intellectual Capital is fundamental to improving a firm's competitiveness, thereby fostering the company's growth and development. Knowledge and skills can affect productivity, innovation capability, strategy and the firm's readiness to understand the resources and potential at its disposal, and they lead to an increase in competitive advantage. Through Social Network Analysis, corporate relationships can be studied to understand different aspects of internal communication within the firm. One of these is knowledge sharing, i.e. the sharing of knowledge and information within the organization, a topic of interest in the literature because of the growth potential that derives from the good use of this technique. The analysis focused on mapping and studying the flow of two of the main components of knowledge sharing: sharing best practices and sharing mistakes; in this specific case, the study focused on the sharing of process improvements and of problems or errors. Particular attention was also paid to informal relationships within the company, with the aim of identifying the correlation between extra-professional relationships in the workplace and the sharing of information and opportunities in a firm. The analysis of communication dynamics and the identification of the most central actors in the information flow make it possible to understand the opportunities for growth and development of the sharing network. The evaluation of relationships and the identification of key actors and connections provides a detailed picture of the situation within the company.
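A minimal sketch of the kind of Social Network Analysis described above, using networkx; the edge lists are invented for illustration and stand in for the "sharing process improvements" and "sharing problems/errors" networks.

```python
import networkx as nx

best_practices = [("Anna", "Bruno"), ("Bruno", "Carla"), ("Carla", "Anna"), ("Dario", "Carla")]
mistakes = [("Anna", "Carla"), ("Dario", "Bruno"), ("Elena", "Carla")]

for name, edges in [("best practices", best_practices), ("mistakes", mistakes)]:
    G = nx.DiGraph(edges)                       # who shares with whom
    degree = nx.degree_centrality(G)            # how connected each actor is
    betweenness = nx.betweenness_centrality(G)  # who brokers the information flow
    hub = max(betweenness, key=betweenness.get)
    print(f"{name}: most central actor = {hub}, degree centrality = {degree}")
```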
Abstract:
This thesis highlights the close relationship of dependence between television companies and sporting events, which broadcasters exploit to gain market share and thus strategic positions with respect to competitors. Between television companies and the sports sector there is therefore a decidedly complementary medium/content relationship: neither can do without the other. On the one hand, as shown, television broadcasters record high audience levels when airing sporting events, and this is the "sounding board" preferred by sponsors. On the other hand, sport has seen its dependence on the networks grow year after year, since the sale of television rights has become one of the main sources of income for sports clubs.
Why should companies invest in mHealth? Pros and cons of the new frontier of healthcare
Abstract:
This thesis deals with the new frontier of healthcare, mobile health or mHealth, analyzing the market situation, future forecasts, the potential advantages of this new market and the barriers that may limit its development. The work is complemented by an analysis based on the strategic theories learned during the degree courses.
Abstract:
In population studies, most current methods focus on identifying one outcome-related SNP at a time by testing for differences in genotype frequencies between disease and healthy groups or among different population groups. However, testing a great number of SNPs simultaneously raises a multiple-testing problem and will give false-positive results. Although this problem can be dealt with effectively through several approaches, such as Bonferroni correction, permutation testing and false discovery rates, patterns of joint effects of several genes, each with a weak effect, might not be detectable. With the availability of high-throughput genotyping technology, searching for multiple scattered SNPs over the whole genome and modeling their joint effect on the target variable has become possible. Exhaustive search of all SNP subsets is computationally infeasible for millions of SNPs in a genome-wide study. Several effective feature selection methods combined with classification functions have been proposed to search for an optimal SNP subset in large data sets where the number of feature SNPs far exceeds the number of observations.

In this study, we take two steps to achieve this goal. First, we selected 1000 SNPs through an effective filter method, and then we performed feature selection wrapped around a classifier to identify an optimal SNP subset for predicting disease. We also developed a novel classification method, the sequential information bottleneck method, wrapped inside different search algorithms to identify an optimal subset of SNPs for classifying the outcome variable. This new method was compared with classical linear discriminant analysis in terms of classification performance. Finally, we performed a chi-square test to examine the relationship between each SNP and disease from another point of view.

In general, our results show that filtering features using the harmonic mean of sensitivity and specificity (HMSS) through linear discriminant analysis (LDA) is better than using LDA training accuracy or mutual information in our study. Our results also demonstrate that an exhaustive search of a small subset, with one SNP, two SNPs or a 3-SNP subset based on the best 100 composite 2-SNPs, can find an optimal subset, and that further inclusion of more SNPs through a heuristic algorithm does not always increase the performance of SNP subsets. Although sequential forward floating selection can be applied to prevent the nesting effect of forward selection, it does not always outperform the latter, owing to overfitting from observing more complex subset states.

Our results also indicate that HMSS, as a criterion to evaluate the classification ability of a function, can be used on imbalanced data without modifying the original dataset, in contrast to classification accuracy. Our four studies suggest that the sequential information bottleneck (sIB), a new unsupervised technique, can be adopted to predict the outcome, and that its ability to detect the target status is superior to that of traditional LDA in this study.

From our results we can see that the best test probability-HMSS for predicting CVD, stroke, CAD and psoriasis through sIB is 0.59406, 0.641815, 0.645315 and 0.678658, respectively. In terms of group prediction accuracy, the highest test accuracy of sIB for diagnosing a normal status among controls can reach 0.708999, 0.863216, 0.639918 and 0.850275, respectively, in the four studies if the test accuracy among cases is required to be not less than 0.4. On the other hand, the highest test accuracy of sIB for diagnosing a disease among cases can reach 0.748644, 0.789916, 0.705701 and 0.749436, respectively, in the four studies if the test accuracy among controls is required to be at least 0.4.

A further genome-wide association study through the chi-square test shows that no significant SNPs are detected at the cut-off level 9.09451E-08 in the Framingham Heart Study of CVD. Study results in WTCCC can only detect two significant SNPs that are associated with CAD. In the genome-wide study of psoriasis, most of the top 20 SNP markers with impressive classification accuracy are also significantly associated with the disease through the chi-square test at the cut-off value 1.11E-07.

Although our classification methods can achieve high accuracy in the study, complete descriptions of those classification results (95% confidence intervals or statistical tests of differences) require more cost-effective methods or an efficient computing system, neither of which can currently be accomplished in our genome-wide study. We should also note that the purpose of this study is to identify subsets of SNPs with high prediction ability, and SNPs with good discriminant power are not necessarily causal markers for the disease.
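A minimal sketch of the filter-then-wrapper step described above, using scikit-learn stand-ins: LDA as the classifier, sequential forward selection as the wrapper, and the harmonic mean of sensitivity and specificity (HMSS) as the selection criterion. The genotype data and dimensions are synthetic placeholders, not the study's datasets.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.metrics import confusion_matrix, make_scorer

def hmss(y_true, y_pred):
    """Harmonic mean of sensitivity and specificity: 2*Se*Sp / (Se + Sp)."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
    se, sp = tp / (tp + fn), tn / (tn + fp)
    return 0.0 if se + sp == 0 else 2 * se * sp / (se + sp)

rng = np.random.default_rng(1)
X = rng.integers(0, 3, size=(200, 1000))   # 200 subjects x 1000 filtered SNPs (0/1/2 genotypes)
y = rng.integers(0, 2, size=200)           # case/control status

# Wrapper: greedily grow a small SNP subset that maximizes cross-validated HMSS.
selector = SequentialFeatureSelector(
    LinearDiscriminantAnalysis(),
    n_features_to_select=3,
    direction="forward",
    scoring=make_scorer(hmss),
    cv=5,
)
selector.fit(X, y)
print("selected SNP columns:", np.flatnonzero(selector.get_support()))
```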
Abstract:
This thesis concerns the general trend towards the digital transformation of business processes. This evolution, which involves the use of modern information technologies including Cloud Computing, Big Data Analytics and Mobile tools, is not free of pitfalls, which must be identified and addressed appropriately case by case. In particular, the thesis refers to a company case, that of the well-known Bologna-based company FAAC spa, and to its purchasing function. In the procurement area, the company feels the need to restructure and digitize the request-for-quotation (RFQ) process towards its suppliers, so that the purchasing function can concentrate on implementing corporate strategy rather than on day-to-day operations. This work therefore carries out a project to implement a dedicated e-procurement platform for managing RFQs. First, some examples of project management from the literature are analyzed and a model for managing this specific project is defined. The work then comprises: a phase defining the company's continuity objectives, an As-Is analysis of the processes, the definition of the project's specific objectives and of the KPIs for performance evaluation, the design of the software platform and, finally, some considerations on the risks of the implementation and on its alternatives.
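A minimal sketch of the kind of RFQ (request-for-quotation) data model such an e-procurement platform might manage; the entities, fields and statuses below are purely illustrative assumptions, not FAAC's actual design.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class RfqStatus(Enum):
    DRAFT = "draft"
    SENT = "sent to suppliers"
    QUOTED = "quotes received"
    AWARDED = "awarded"

@dataclass
class Quote:
    supplier: str
    unit_price: float
    lead_time_days: int

@dataclass
class Rfq:
    item_code: str
    quantity: int
    due_date: date
    status: RfqStatus = RfqStatus.DRAFT
    quotes: list = field(default_factory=list)

    def best_quote(self):
        """Example KPI input: cheapest quote among those received."""
        return min(self.quotes, key=lambda q: q.unit_price, default=None)

# Usage example with hypothetical values.
rfq = Rfq(item_code="GATE-MOTOR-001", quantity=500, due_date=date(2026, 1, 31))
rfq.quotes.append(Quote("Supplier A", 41.50, 30))
rfq.quotes.append(Quote("Supplier B", 39.90, 45))
print(rfq.best_quote())
```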