862 results for "horizons of expectation"
Abstract:
Scale mixtures of the skew-normal (SMSN) distribution form a class of asymmetric thick-tailed distributions that includes the skew-normal (SN) distribution as a special case. The main advantage of this class is that its members are easy to simulate and admit a convenient hierarchical representation, which facilitates implementation of the expectation-maximization (EM) algorithm for maximum-likelihood estimation. In this paper, we assume an SMSN distribution for the unobserved value of the covariates and a symmetric scale mixture of the normal distribution for the error term of the model. This provides a robust alternative for parameter estimation in multivariate measurement error models. Specific distributions examined include univariate and multivariate versions of the SN, skew-t, skew-slash and skew-contaminated normal distributions. The results and methods are applied to a real data set.
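As a minimal illustration of why such distributions are easy to simulate, the skew-normal special case admits the well-known stochastic representation Z = δ|U0| + sqrt(1 − δ²)·U1, with U0, U1 independent standard normals. The sketch below draws from it; the function name and parameters are illustrative, not taken from the paper:

```python
import math
import random

def rvs_skew_normal(delta, n, seed=0):
    """Draw n samples from a standard skew-normal via its
    stochastic representation Z = delta*|U0| + sqrt(1-delta^2)*U1,
    with U0, U1 independent N(0,1)."""
    rng = random.Random(seed)
    coef = math.sqrt(1.0 - delta * delta)
    return [delta * abs(rng.gauss(0, 1)) + coef * rng.gauss(0, 1)
            for _ in range(n)]

samples = rvs_skew_normal(delta=0.8, n=50000)
# The skew-normal mean is delta*sqrt(2/pi), so the sample mean
# should be close to 0.8*sqrt(2/pi) for a sample this large.
```

The same conditioning trick, with an extra mixing variable on the scale, yields the hierarchical representation that makes EM estimation tractable for the whole SMSN class.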
Abstract:
This thesis explores two aspects of mathematical reasoning: affect and gender. I started by looking at the reasoning of upper secondary students when solving tasks. This work revealed that, when not guided by an interviewer, algorithmic reasoning, based on memorised algorithms that may or may not be appropriate for the task, was predominant in the students' reasoning. Given this lack of mathematical grounding in students' reasoning, a second study looked at what grounds they had for different strategy choices and conclusions. This qualitative study suggested that beliefs about safety, expectation and motivation were important in the central decisions made during task solving. But are reasoning and beliefs gendered? The third study explored upper secondary school teachers' conceptions about gender and students' mathematical reasoning. In this study I found that teachers attributed gender symbols such as insecurity, use of standard methods and imitative reasoning to girls, while symbols such as multiple strategies (especially on the calculator), guessing and chance-taking were assigned to boys. In the fourth and final study I found that students, both male and female, shared their teachers' view of rather traditional femininities and masculinities. Remarkably, however, this result did not repeat itself when students were asked to reflect on their own behaviour: there were some discrepancies between the traits the students ascribed as gender-different and the traits they ascribed to themselves. Taken together, the thesis suggests that, contrary to these conceptions, girls and boys share many of the same core beliefs about mathematics, but much work is still needed if we are to create learning environments that provide better opportunities for students to develop beliefs that guide them towards well-grounded mathematical reasoning.
Abstract:
Random effect models have been widely applied in many fields of research. However, models with uncertain design matrices for the random effects have received little attention. In some applications with this problem, an expectation method has been used for simplicity; this method, however, ignores the extra uncertainty in the design matrix. A closed-form solution to this problem is generally difficult to attain. We therefore propose a two-step algorithm for estimating the parameters, especially the variance components of the model. The implementation is based on Monte Carlo approximation and a Newton-Raphson-based EM algorithm. As an example, a simulated genetics dataset was analyzed. The results showed that the proportion of the total variance explained by the random effects was accurately estimated, whereas it was severely underestimated by the expectation method. By introducing heuristic search and optimization methods, the algorithm could be extended to infer the 'model-based' best design matrix and the corresponding best estimates.
Abstract:
Recent investigations of various quantum-gravity theories have revealed a variety of possible mechanisms that lead to Lorentz violation. One of the more elegant of these mechanisms is spontaneous Lorentz symmetry breaking (SLSB), in which a vector or tensor field acquires a nonzero vacuum expectation value. As a consequence of this symmetry breaking, massless Nambu-Goldstone modes appear with properties similar to those of the photon in electromagnetism. This thesis considers the most general class of vector field theories that exhibit spontaneous Lorentz violation, known as bumblebee models, and examines their candidacy as potential alternative explanations of electromagnetism, offering the possibility that Einstein-Maxwell theory could emerge as a result of SLSB rather than of local U(1) gauge invariance. With this aim we employ Dirac's Hamiltonian constraint analysis to examine the constraint structures and degrees of freedom inherent in three candidate bumblebee models, each with a different potential function, and compare the results to those of electromagnetism. We find that none of these models shares a constraint structure similar to that of electromagnetism, and that the number of degrees of freedom of each model exceeds that of electromagnetism by at least two, pointing to the potential existence of massive modes or propagating ghost modes in the bumblebee theories.
Abstract:
Parametric term structure models have been successfully applied to numerous problems in fixed income markets, including pricing, hedging, risk management, and the study of monetary policy implications. Dynamic term structure models, in turn, equipped with stronger economic structure, have been mainly adopted to price derivatives and to explain empirical stylized facts. In this paper, we combine flavors of these two classes of models to test whether no-arbitrage affects forecasting. We construct cross-section (arbitrage-allowing) and arbitrage-free versions of a parametric polynomial model to analyze how well they predict out-of-sample interest rates. Based on U.S. Treasury yield data, we find that no-arbitrage restrictions significantly improve forecasts: arbitrage-free versions achieve smaller overall biases and root mean square errors for most maturities and forecasting horizons. Furthermore, a decomposition of forecasts into forward rates and holding return premia indicates that the superior performance of the no-arbitrage versions is due to a better identification of the bond risk premium.
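The two forecast-evaluation criteria used in this comparison, bias and root mean square error, can be sketched as follows; the function and the sample yields are hypothetical, for illustration only:

```python
import math

def bias_and_rmse(forecasts, realized):
    """Forecast-evaluation criteria: bias is the mean forecast error
    (forecast minus realized), RMSE is the root mean squared error."""
    errors = [f - r for f, r in zip(forecasts, realized)]
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rmse

# Hypothetical out-of-sample yields (in %) for one maturity/horizon:
b, r = bias_and_rmse([4.10, 4.25, 4.40], [4.00, 4.30, 4.35])
```

In the paper's exercise, these statistics would be computed per maturity and forecasting horizon, then compared across the cross-section and arbitrage-free model versions.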
Abstract:
Why did house prices fall in 2007-2009? This is the fundamental question for most Americans, and for those who lent them money. Most homeowners did not care why residential real estate prices rose; they assumed prices always rose, and that they should simply enjoy their good fortune. It was not until prices began to fall that people were left searching for answers. How much did regulation, or the lack thereof, contribute to the devastation? What effect did greed and unrealistic consumer expectations have on the real estate bubble? Using existing literature as well as face-to-face interviews with experienced leaders within the California real estate industry who experienced both the up and the down of the real estate cycle, the overarching purpose of this study is to investigate the opinions and beliefs of leaders and drivers within the real estate industry about the cause of the real estate bubble that burst sharply in 2008. Specifically, this project focuses on the opinions of real estate industry leaders who worked in the center of the subprime universe, located in Irvine, California, during 2004-2008. Comparing mainstream beliefs with the interviewees' accounts, it is fair to say that the main mainstream findings are reflected very well in the subjects' opinions. The thesis is divided into six chapters, starting with the introduction, followed by Chapter 2, "Literature Review". Chapter 3, "Research Methodology", is followed by Chapter 4, "Data Presentation". Finally, the results are discussed in Chapter 5, "Analysis and Discussion", and conclusions are drawn in Chapter 6.
Abstract:
This paper generates and organizes stylized facts related to the dynamics of self-employment activities in Brazil. The final purpose is to help the design of policies to assist micro-entrepreneurial units. The first part of the paper uses, as its main tool of analysis, transition data constructed from household surveys. The longitudinal information used covers three transition horizons: 1-month, 12-month and 5-year periods. Quantitative flow analysis assesses the main origins, destinations and various types of risk assumed by micro-entrepreneurial activities. Complementarily, logistic regressions provide evidence on the main characteristics and resources of micro-entrepreneurial units. In particular, we use the movements from self-employment to employer activities as measures of entrepreneurial success. We also use these transitions as measures of employment creation intensity within the self-employed segment. The second part of the paper explores various data sources. First, we analyze the life-cycle trajectories and determinants of self-employment, using cohort data constructed from the PME and qualitative data on financial and work-history factors related to the opening of small businesses, drawn from the informal firms survey implemented during 1994. Second, we apply a standard Mincerian wage equation approach to self-employment profits. This exercise attempts to capture the correlation patterns between micro-entrepreneurial performance and a variety of firm-level variables present in the 1994 Informal Survey.
Finally, we use a survey of the poor entrepreneurs of the Rocinha favela as a laboratory to study poor entrepreneurs' resources and behavior. In sum, the main questions pursued in the paper are: i) who are the Brazilian self-employed?; ii) in particular, what is the relative importance among the self-employed of subsistence activities versus activities with growth and capital accumulation potential?; iii) what are the main static and dynamic determinants of micro-entrepreneurial success?; iv) what is the degree of risk associated with micro-entrepreneurial activities in Brazil?; v) what is the life-cycle profile of self-employment?; vi) what are the main constraints on poor entrepreneurs' activities?
Abstract:
We develop and empirically test a continuous-time equilibrium model for the pricing of oil futures. The model provides a link between no-arbitrage models and expectation-oriented models, and highlights the role of inventories in the identification of different pricing regimes. In an empirical study, the hedging performance of our model is compared with that of five other one- and two-factor pricing models. The hedging problem considered is related to Metallgesellschaft's strategy of hedging long-term forward commitments with short-term futures. The results show that the downside risk distribution of our inventory-based model stochastically dominates those of the other models.
Abstract:
This study investigates the out-of-sample predictive power, one month ahead, of a Taylor-rule-based model for exchange rate forecasting. We review relevant work concluding that macroeconomic models can explain the short-run exchange rate, and we also present studies that are skeptical about the ability of macroeconomic variables to predict exchange rate movements. To contribute to the debate, this work presents its own evidence by implementing the model with the best predictive performance reported by Molodtsova and Papell (2009), the symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant. To this end, we use a sample of 14 currencies against the U.S. dollar, generating monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted a floating exchange rate regime and inflation targeting, but we choose currencies of both developed and developing countries. Our results corroborate the study of Rogoff and Stavrakeva (2008), finding that conclusions about exchange rate predictability depend on the statistical test adopted, so that robust and rigorous tests are needed for an adequate evaluation of the model. Having found that we cannot claim the implemented model yields more accurate forecasts than a random walk, we assess whether the model is at least capable of generating "rational", or "consistent", forecasts. For this we use the theoretical and instrumental framework defined and implemented by Cheung and Chinn (1998), and we conclude that the forecasts from the Taylor rule model are "inconsistent". Finally, we perform Granger causality tests to verify whether the lagged values of the returns predicted by the structural model explain the observed contemporaneous values.
We find that the fundamentals-based model is unable to anticipate realized returns.
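The random-walk benchmark central to this kind of evaluation can be sketched as a simple RMSE ratio, since the driftless random walk's one-month-ahead forecast is just the current rate. The data and function names below are hypothetical; a ratio below one would mean the model beats the random walk:

```python
import math

def rmse(errors):
    """Root mean square error of a list of forecast errors."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def against_random_walk(rates, model_forecasts):
    """Ratio of model RMSE to random-walk RMSE for one-step-ahead
    forecasts. rates[t+1] is the realization of the forecast made
    at time t; the random walk forecasts rates[t+1] as rates[t]."""
    model_err = [f - r for f, r in zip(model_forecasts, rates[1:])]
    rw_err = [rates[t] - rates[t + 1] for t in range(len(rates) - 1)]
    return rmse(model_err) / rmse(rw_err)

# Hypothetical exchange-rate series and two model forecasts for it:
ratio = against_random_walk([2.00, 2.10, 2.05], [2.08, 2.02])
```

Note that this raw ratio is exactly the kind of statistic the study cautions about: formal tests (e.g. of forecast accuracy or consistency) are needed before claiming predictability.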
Abstract:
The scalar sector of the simplest version of the 3-3-1 electroweak model is constructed with only three Higgs triplets. We show that a relation involving two of the constants of the model, two vacuum expectation values of the neutral scalars, and the mass of the doubly charged Higgs boson leads to important information concerning the signals of this scalar particle.
Abstract:
We describe the isolation and characterization of ten microsatellite loci from the red-winged tinamou (Rhynchotus rufescens) and also evaluate the cross-amplification of these loci, and of ten other loci previously developed for the great tinamou (Tinamus major), in other tinamous. Genetic variability was assessed using 24 individuals. Six loci were polymorphic, with a moderate to high number of alleles per locus (2-12 alleles) and expected heterozygosity (HE) ranging from 0.267 to 0.860. All loci conformed to the Hardy-Weinberg expectation, and linkage disequilibrium was not significant for any pair of loci. This battery of polymorphic loci showed a high paternity exclusion probability (0.986) and a low genetic identity probability (4.95 x 10(-5)), proving to be helpful for parentage tests and population analyses in the red-winged tinamou. Cross-amplification success was moderate: of the 160 locus/taxon combinations, 46 (28.75%) amplified successfully.
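The expected heterozygosity reported here follows the standard Hardy-Weinberg formula HE = 1 − Σ p_i², summed over the allele frequencies p_i at a locus. A minimal sketch, with hypothetical allele counts:

```python
def expected_heterozygosity(allele_counts):
    """Hardy-Weinberg expected heterozygosity at one locus:
    HE = 1 - sum(p_i^2), where p_i are allele frequencies."""
    total = sum(allele_counts)
    return 1.0 - sum((c / total) ** 2 for c in allele_counts)

# Hypothetical locus with four alleles observed 10, 6, 4 and 4 times
# among 24 gene copies (12 diploid individuals):
he = expected_heterozygosity([10, 6, 4, 4])
```

Values in the paper's reported range (0.267 to 0.860) indicate loci informative enough for the parentage and population analyses described.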
Abstract:
Although many glass-bearing horizons can be found in South American volcanic complexes or sedimentary series, relatively few tephra- and obsidian-bearing volcanic fields have been studied using the fission-track (FT) dating method. Among them, the volcanics located in the Sierra de Guamani (east of Quito, Ecuador) have been studied by several authors. Based upon their ages, these obsidians group into three clusters: (1) very young obsidians, ~0.2 Ma old; (2) intermediate-age obsidians, ~0.4 to ~0.8 Ma old; and (3) older obsidians, ~1.4 to ~1.6 Ma old. The FT method is also an efficient alternative technique for identifying the sources of prehistoric obsidian artefacts. Provenance studies carried out in South America have shown that the Sierra de Guamani obsidian occurrences were important sources of raw material for tool making during pre-Columbian times; glasses originating from these sources have been identified in sites distributed over relatively wide areas of Ecuador and Colombia. Only a few systematic studies of obsidians in other sectors have been carried out. Nevertheless, very singular glasses have been recognised in South America, such as Macusanite (Peru) and obsidian Quiron (Argentina), which are being proposed as additional reference materials for FT dating. Analyses of tephra beds interstratified with sedimentary deposits have demonstrated the performance of FT dating in tephrochronological studies. A remarkable example is the famous deposit outcropping at Farola Monte Hermoso, near Bahia Blanca (Buenos Aires Province), first described in the middle of the 19th century by Charles Darwin. Considering the large number of volcanic glasses recognised in volcanic complexes and sedimentary series, South America is a very promising region for the application of FT dating. The examples given above show that this technique may yield important results in different disciplinary fields.
Abstract:
Older adults face usability problems every day, and with increasing life expectancy these issues will become more and more frequent. Studying this group's capacities and limitations could help designers create systems that are more usable for everyone.
Abstract:
Background: Since establishing universal free access to antiretroviral therapy in 1996, the Brazilian Health System has increased the number of centers providing HIV/AIDS outpatient care from 33 to 540. There had been no formal monitoring of the quality of these services until a survey of 336 AIDS health centers across 7 Brazilian states was undertaken in 2002. Managers of the services were asked to assess their clinics according to parameters of service inputs and service delivery processes. This report analyzes the survey results and identifies predictors of the overall quality of service delivery. Methods: The survey involved completion of a multiple-choice questionnaire comprising 107 parameters of service inputs and processes of delivering care, with responses assessed according to their likely impact on service quality using a 3-point scale. K-means clustering was used to group the services according to their scored responses, and logistic regression analysis was performed to identify predictors of high service quality. Results: The questionnaire was completed by 95.8% (322) of the managers of the sites surveyed. Most sites scored about 50% of the benchmark expectation. K-means clustering identified four quality levels: 76 services (24%) were classed as level 1 (best), 53 (16%) as level 2 (medium), 113 (35%) as level 3 (poor), and 80 (25%) as level 4 (very poor). Parameters of service delivery processes were more important than those relating to service inputs in determining the quality classification. Predictors of quality services included larger care sites, specialization in HIV/AIDS, and location within large municipalities. Conclusion: The survey demonstrated highly variable levels of HIV/AIDS service quality across the sites. Many sites were found to have deficiencies in their service delivery processes that could benefit from quality improvement initiatives. These findings could have implications for how HIV/AIDS services are planned in Brazil to achieve quality standards, such as where service sites should be located and their size and staffing requirements. A set of service delivery indicators has been identified that could be used for routine monitoring of HIV/AIDS service delivery in Brazil (and potentially in other similar settings).
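The grouping step described in the survey can be illustrated with a minimal one-dimensional k-means (Lloyd's algorithm) sketch. The scores and function below are hypothetical, not the survey's actual implementation; k=4 mirrors the four quality levels:

```python
import random

def kmeans_1d(scores, k=4, iters=50, seed=0):
    """Minimal Lloyd's algorithm on one-dimensional quality scores:
    repeatedly assign each site to its nearest centroid, then move
    each centroid to the mean of its assigned scores."""
    rng = random.Random(seed)
    centroids = rng.sample(scores, k)  # initialize from the data
    labels = [0] * len(scores)
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: abs(s - centroids[j]))
                  for s in scores]
        for j in range(k):
            members = [s for s, lab in zip(scores, labels) if lab == j]
            if members:  # keep the old centroid if a cluster empties
                centroids[j] = sum(members) / len(members)
    return centroids, labels

# Hypothetical per-site quality scores (fraction of benchmark met):
centroids, labels = kmeans_1d([0.3, 0.3, 0.5, 0.5, 0.7, 0.7, 0.9, 0.9])
```

In the survey, each site's score would be its total over the 107 assessed parameters, and the resulting clusters correspond to the level 1 (best) through level 4 (very poor) groups.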