Abstract:
OBJECTIVES: To develop a method for objective assessment of fine motor timing variability in Parkinson's disease (PD) patients, using digital spiral data gathered by a touch screen device. BACKGROUND: A retrospective analysis was conducted on data from 105 subjects: 65 patients with advanced PD (group A), 15 intermediate patients experiencing motor fluctuations (group I), 15 early-stage patients (group S), and 10 healthy elderly subjects (HE). The subjects were asked to perform repeated upper limb motor tasks by tracing a pre-drawn Archimedes spiral shown on the screen of the device. The spiral tracing test was performed with an ergonomic pen stylus, using the dominant hand. The test was repeated three times per test occasion and the subjects were instructed to complete it within 10 seconds. Digital spiral data, including stylus position (x-y coordinates) and timestamps (milliseconds), were collected and used in subsequent analysis. The total number of observations with the test battery was as follows: Swedish group (n=10079), Italian I group (n=822), Italian S group (n=811), and HE (n=299). METHODS: The raw spiral data were processed with three data processing methods. To quantify motor timing variability during spiral drawing tasks, the Approximate Entropy (APEN) method was applied to the digitized spiral data. APEN is designed to capture the amount of irregularity or complexity in a time series. APEN requires the determination of two parameters, namely the window size and the similarity measure. In our work, after experimentation, the window size was set to 4 and the similarity measure to 0.2 (20% of the standard deviation of the time series). The final score obtained by APEN was normalized by total drawing completion time and used in subsequent analysis; the score generated by this method is henceforth denoted APEN. In addition, two more methods were applied to the digital spiral data and their scores were used in subsequent analysis. The first method was based on the Discrete Wavelet Transform and Principal Component Analysis and generated a score representing spiral drawing impairment, henceforth denoted WAV. The second method was based on the standard deviation of frequency-filtered drawing velocity, henceforth denoted SDDV. Linear mixed-effects (LME) models were used to evaluate mean differences of the spiral scores of the three methods across the four subject groups. Test-retest reliability of the three scores was assessed by taking the mean of the three possible correlations (Spearman's rank coefficients) between the three test trials. Internal consistency of the methods was assessed by calculating correlations between their scores. RESULTS: When comparing mean spiral scores between the four subject groups, the APEN scores of the HE subjects differed from those of the three patient groups to the following extents (P=0.626 for the S group, with a 9.9% mean value difference; P=0.089 for the I group, with 30.2%; and P=0.0019 for the A group, with 44.1%). However, there were no significant differences in the mean scores of the other two methods, except for WAV between the HE and A groups (P<0.001). WAV and SDDV were highly and significantly correlated with each other, with a coefficient of 0.69. However, APEN was correlated with neither WAV nor SDDV, with coefficients of 0.11 and 0.12, respectively. Test-retest reliability coefficients of the three scores were as follows: APEN (0.9), WAV (0.83), and SDDV (0.55).
CONCLUSIONS: The results show that the digital spiral analysis-based objective APEN measure is able to significantly differentiate healthy subjects from patients at an advanced stage. In contrast to the other two methods (WAV and SDDV), which are designed to quantify dyskinesias (over-medication), this method can be useful for characterizing Off symptoms in PD. APEN was correlated with neither of the other two methods, indicating that it measures a different construct of upper limb motor function in PD patients than WAV and SDDV. APEN also had better test-retest reliability, indicating that it is more stable and consistent over time than WAV and SDDV.
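As a point of reference for the APEN computation described in METHODS, the following is a minimal Python sketch of Approximate Entropy with the reported settings (window size m = 4, similarity tolerance r = 20% of the series standard deviation). The input series and the normalization shown in the usage comment are hypothetical, since the abstract does not specify the exact construction of the timing series.

```python
# Minimal Approximate Entropy (Pincus-style) sketch; settings follow the
# abstract: window size m = 4, tolerance r = 0.2 * SD of the series.
import numpy as np

def approximate_entropy(u, m=4, r_factor=0.2):
    u = np.asarray(u, dtype=float)
    n = len(u)
    r = r_factor * np.std(u)

    def phi(m):
        # All overlapping windows of length m, stacked as rows.
        x = np.array([u[i:i + m] for i in range(n - m + 1)])
        # Chebyshev (max-abs) distance between every pair of windows.
        d = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        # Fraction of windows within tolerance of each window
        # (self-matches included, as in the classic definition).
        c = np.mean(d <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Hypothetical usage: a timing series from one spiral trace, with the final
# score normalized by total drawing completion time as in the abstract.
# score = approximate_entropy(np.diff(timestamps_ms)) / completion_time_s
```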
Abstract:
The aim of this paper is to develop a flexible model for the analysis of quantitative trait loci (QTL) in outbred line crosses, which includes both additive and dominance effects. Our flexible intercross analysis (FIA) model accounts for QTL that are not fixed within founder lines and is based on the variance component framework. Genome scans with FIA are performed using a score statistic, which does not require variance component estimation. RESULTS: Simulations of a pedigree with 800 F2 individuals showed that the power of FIA including both additive and dominance effects was almost 50% for a QTL with equal allele frequencies in both lines, complete dominance, and a moderate effect, whereas the power of a traditional regression model was equal to the chosen significance level of 5%. The power of FIA without dominance effects included in the model was close to that obtained for FIA with dominance for all simulated cases except QTL with overdominant effects. A genome-wide linkage analysis of experimental data from an F2 intercross between Red Jungle Fowl and White Leghorn was performed with both additive and dominance effects included in FIA. The score values for chicken body weight at 200 days of age were similar to those obtained in the FIA analysis without dominance. CONCLUSION: We have extended FIA to include QTL dominance effects. The power of FIA was superior, or similar, to standard regression methods for QTL effects with dominance. The difference in power for FIA with or without dominance is expected to be small as long as the QTL effects are not overdominant. We suggest that FIA with only additive effects should be the standard model, especially since it is more computationally efficient.
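To make the modelling framework concrete, a generic variance-component formulation consistent with the description reads

\[
y = X\beta + u_a + u_d + e, \qquad u_a \sim N(0,\sigma_a^2 \Pi_a), \quad u_d \sim N(0,\sigma_d^2 \Pi_d), \quad e \sim N(0,\sigma_e^2 I),
\]

where \(\Pi_a\) and \(\Pi_d\) are locus-specific additive and dominance relationship matrices (this parameterization is our illustration; the exact FIA matrices are defined in the paper). A locus is tested via the score statistic for \(H_0: \sigma_a^2 = \sigma_d^2 = 0\); for instance, for the additive component,

\[
U_a = \tfrac{1}{2}\,\hat{e}_0^{\top}\hat{V}_0^{-1}\Pi_a\hat{V}_0^{-1}\hat{e}_0 - \tfrac{1}{2}\,\operatorname{tr}\!\left(\hat{V}_0^{-1}\Pi_a\right),
\]

which depends only on the fitted null (no-QTL) model, explaining why genome scans with FIA avoid variance component estimation at each position.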
Abstract:
A major problem in e-service development is the prioritization of the requirements of different stakeholders. The main stakeholders are governments and their citizens, all of whom have different and sometimes conflicting requirements. In this paper, the prioritization problem is addressed by combining a value-based approach with an illustration technique. The paper examines the following research question: how can multiple stakeholder requirements be illustrated from a value-based perspective in order to be prioritizable? We used an e-service development case taken from a Swedish municipality to elaborate on our approach. Our contributions are: 1) a model of the domains relevant to requirement prioritization: government, citizens, technology, finances, and laws and regulations; and 2) a requirement fulfillment analysis tool (RFA) that consists of a requirement-goal-value matrix (RGV) and a calculation and illustration module (CIM). The model reduces cognitive load, helps developers focus on value fulfillment in e-service development, and supports them in the formulation of requirements. It also offers an input to public policy makers, should they aim to target values in the design of e-services.
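The abstract does not give the internal workings of the RFA tool; purely as an illustration of the idea, here is a hypothetical sketch of how a requirement-goal-value matrix and a calculation module might aggregate value fulfillment per stakeholder value. All requirements, values, and weights are invented.

```python
# Hypothetical requirement-goal-value (RGV) matrix and a CIM-like
# calculation: each requirement contributes, with a weight, to one or
# more stakeholder values; fulfillment per value is a weighted sum.
rgv = {
    # requirement: {stakeholder value: contribution weight in [0, 1]}
    "single sign-on":        {"citizen convenience": 0.8, "legal compliance": 0.2},
    "audit logging":         {"legal compliance": 0.9, "government efficiency": 0.3},
    "mobile-friendly forms": {"citizen convenience": 0.7, "government efficiency": 0.4},
}

# Degree to which each requirement is (planned to be) fulfilled, in [0, 1].
fulfillment = {"single sign-on": 1.0, "audit logging": 0.5, "mobile-friendly forms": 0.0}

def value_fulfillment(rgv, fulfillment):
    """Aggregate fulfillment per stakeholder value, for prioritization."""
    scores = {}
    for req, contributions in rgv.items():
        for value, weight in contributions.items():
            scores[value] = scores.get(value, 0.0) + weight * fulfillment[req]
    return scores

# Values with the lowest fulfillment are candidates for prioritization.
print(sorted(value_fulfillment(rgv, fulfillment).items(), key=lambda kv: kv[1]))
```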
Abstract:
In the field of Information and Communication Technologies for Development (ICT4D), ICT use in education is well studied. Education is often seen as a prerequisite for development, and ICTs are believed to aid education, e.g. by making it more accessible and increasing its quality. In this paper we study the access to and use of ICT in a study circle (SC) education program on the south coast of Kenya. The study is qualitative, reporting results based on interviews and observations with SC participants, government officers, and SC coordinators and teachers. The study builds on the capability approach perspective of development, which focuses on individuals' opportunities and ability to live a life that they value. The aim of the study is to investigate the capability outcomes enabled through the capability inputs of access to and use of ICT in education, as well as the factors that enabled and/or restricted those outcomes. Findings show that many opportunities have been enabled, such as an increased ability to generate an income, learning benefits, community development, and basic human development (e.g. literacy and self-confidence). However, conversion factors such as a poorly developed infrastructure and poor IT literacy prevent many individuals from taking full advantage of ICT and the opportunities it enables.
Abstract:
Even though assessing social marketing endeavors proves challenging, evaluators can learn from previous campaigns and identify which facets of social marketing events, programs, and campaigns need to be improved. Additionally, by analyzing social movements and evaluating how they connect to social marketing, we can gain a clearer view of ways to improve the field of social marketing. As social marketing becomes increasingly sophisticated and similar to commercial marketing, there is hope that social marketing can yield higher rates of success in the future. Friend and Levy (2002) claimed that it was nearly impossible to compare social marketing endeavors using quantitative criteria and advocated the use of qualitative methods. However, if social marketing scholars developed a more systematic paradigm to assess events, programs, and campaigns, employing a combination of both quantitative and qualitative methods, it would be easier to establish which social marketing efforts generated more success than others. When there are too many confounding variables, conclusions cannot always be drawn and evaluations may not be viewed as legitimate. As a result, critics become skeptical of social marketing's value, and both the importance and credibility of social marketing decline. With the establishment of proper criteria and evaluation methods, social marketing can progress and initiate more social change.
Abstract:
Recent investigations of various quantum-gravity theories have revealed a variety of possible mechanisms that lead to Lorentz violation. One of the more elegant of these mechanisms is known as Spontaneous Lorentz Symmetry Breaking (SLSB), where a vector or tensor field acquires a nonzero vacuum expectation value. As a consequence of this symmetry breaking, massless Nambu-Goldstone modes appear with properties similar to the photon in Electromagnetism. This thesis considers the most general class of vector field theories that exhibit spontaneous Lorentz violation, known as bumblebee models, and examines their candidacy as potential alternative explanations of E&M, offering the possibility that Einstein-Maxwell theory could emerge as a result of SLSB rather than of local U(1) gauge invariance. With this aim we employ Dirac's Hamiltonian Constraint Analysis procedure to examine the constraint structures and degrees of freedom inherent in three candidate bumblebee models, each with a different potential function, and compare these results to those of Electromagnetism. We find that none of these models shares a constraint structure similar to that of E&M, and that the number of degrees of freedom for each model exceeds that of Electromagnetism by at least two, pointing to the potential existence of massive modes or propagating ghost modes in the bumblebee theories.
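For context, the generic bumblebee Lagrangian considered in this literature takes the form

\[
\mathcal{L}_B = -\tfrac{1}{4}B_{\mu\nu}B^{\mu\nu} - V\!\left(B_\mu B^\mu \mp b^2\right), \qquad B_{\mu\nu} \equiv \partial_\mu B_\nu - \partial_\nu B_\mu,
\]

where the potential V, e.g. the smooth choice \(V(X) = \tfrac{1}{2}\kappa X^2\) (the specific potentials analyzed in the thesis are not named in the abstract), drives \(B_\mu\) to a vacuum value \(b_\mu\) with \(b_\mu b^\mu = \pm b^2\), spontaneously breaking Lorentz symmetry; the excitations around \(b_\mu\) include the photon-like Nambu-Goldstone modes discussed above.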
Abstract:
Cape Wind has proposed a wind farm of 130 turbines on Horseshoe Shoal in the center of Nantucket Sound. A prominent concern about the project is the impact the visibility of the turbines will have on the region's tourism industry and property values. It is feared that their presence will diminish the value of the pristine coastline that has attracted vacationers to Cape Cod for generations. In this project, we assess the extent to which Cape Cod, Martha's Vineyard, and Nantucket will be visually affected by the wind farm. The assessment was completed using viewshed analysis in the GIS program ArcMap, with viewsheds computed at the surface, mean, and maximum heights of the towers. These viewsheds were combined to give a comprehensive perspective of which areas are able to see the highest percentage of the wind farm. Finally, a weighted land-use value was applied to the viewshed to account for the impact of land use on the ability to see the project. The objective of this analysis is to provide a visual representation of how great an influence the wind farm will in fact have on Cape Cod.
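As an illustration of how such a workflow can be scripted, here is a hypothetical sketch using arcpy's Spatial Analyst extension. Dataset paths, tower heights, and land-use weights are all invented, and the original analysis may well have been performed interactively in ArcMap rather than in code.

```python
# Hypothetical arcpy sketch of the viewshed workflow described above.
# Turbine heights are supplied through the standard OFFSETA observer-offset
# field on the turbine point features; all paths and numbers are invented.
import arcpy
from arcpy.sa import Viewshed, Raster

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\capewind.gdb"  # hypothetical workspace

dem = "elevation_dem"        # terrain surface raster
turbines = "turbine_points"  # 130 proposed turbine locations

# One viewshed per tower-height scenario (meters above ground; invented).
heights = {"surface": 0, "mean_height": 80, "max_height": 130}
for name, h in heights.items():
    arcpy.management.CalculateField(turbines, "OFFSETA", str(h), "PYTHON3")
    vs = Viewshed(dem, turbines)  # cell value = number of visible turbines
    vs.save(f"viewshed_{name}")

# Combine the scenarios, then weight by land use to reflect how land cover
# affects the practical ability to see the project.
combined = (Raster("viewshed_surface") + Raster("viewshed_mean_height")
            + Raster("viewshed_max_height"))
weighted = combined * Raster("landuse_weight")  # weights in [0, 1], invented
weighted.save("viewshed_weighted")
```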
Abstract:
This study analyzes how the class of shareholders affects the value of Brazilian companies listed on the stock exchange, from a corporate governance point of view. The work examines the interaction between firm value and five types of ownership concentration commonly present in emerging markets: families, public agents, foreign investors, executives, and domestic financial investors. The empirical analysis demonstrates that the mix and concentration of equity ownership significantly affect firm value. Using a unique compilation of panel data from 2004 to 2008, this research also develops hypotheses on the effect of business group membership on firm value. The investigation finds evidence that, despite their importance for the development of Brazilian companies, family capital, public institutions, and foreign investors are giving way to more specialized and less concentrated monitors, such as executives and domestic financial institutions. These results indicate that corporate governance in Brazil may be reaching higher levels of maturity. Additionally, although there is no indication of a correlation between business group membership and firm value, the results indicate that the presence of a specific type of shareholder in one group company facilitates future investments by that class of shareholder in other companies of the same group, signaling that ownership interests are probably perpetuated within a given network of companies. Finally, the research demonstrates that while family capital prefers to invest in companies with active capital mobility, international investors and public institutions seek equity investments with lower capital mobility, which guarantees them more transparency regarding the use of company resources and funds.
Abstract:
Housing is an important component of wealth for a typical household in many countries. The objective of this paper is to investigate the effect of real-estate price variation on welfare, trying to close a gap between the welfare literature in Brazil and that in the U.S., the U.K., and other developed countries. Our first motivation relates to the fact that real estate is probably more important here than elsewhere as a proportion of wealth, which potentially makes the impact of a price change bigger here. Our second motivation relates to the fact that real-estate prices boomed in Brazil in the last five years. Prime real estate in Rio de Janeiro and São Paulo has tripled in value in that period, and a smaller but generalized increase has been observed throughout the country. Third, we have also seen a recent consumption boom in Brazil in the last five years. Indeed, the recent rise of some of the poor to middle-income status is well documented not only for Brazil but for other emerging countries as well. Regarding consumption and real-estate prices in Brazil, one cannot infer causality from correlation, but one can do causal inference with an appropriate structural model and proper inference, or with proper inference in a reduced-form setup. Our last motivation is the complete absence of studies of this kind in Brazil, which makes ours a pioneering study. We assemble a panel-data set for the determinants of non-durable consumption growth across Brazilian states, merging the techniques and ideas in Campbell and Cocco (2007) and in Case, Quigley and Shiller (2005). With appropriate controls and panel-data methods, we investigate whether house-price variation has a positive effect on non-durable consumption. The results show a non-negligible, significant impact of the change in the price of real estate on welfare (consumption), although smaller than what Campbell and Cocco found. Our findings support the view that the channel through which house prices affect consumption is a financial one.
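As a purely illustrative sketch (not the paper's actual specification, controls, or data), a reduced-form panel regression of state-level non-durable consumption growth on house-price growth with state and time fixed effects could look as follows; all column names and numbers are synthetic, and the linearmodels library is one of several that could be used.

```python
# Synthetic reduced-form panel: consumption growth on house-price growth
# with state and time fixed effects and clustered standard errors.
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

rng = np.random.default_rng(0)
states = [f"state_{i}" for i in range(27)]
idx = pd.MultiIndex.from_product([states, range(2008, 2014)],
                                 names=["state", "year"])
df = pd.DataFrame({
    "house_price_growth": rng.normal(0.08, 0.05, len(idx)),
    "income_growth": rng.normal(0.03, 0.02, len(idx)),
}, index=idx)
# Synthetic outcome with a built-in housing-wealth effect of 0.2.
df["consumption_growth"] = (0.2 * df["house_price_growth"]
                            + 0.5 * df["income_growth"]
                            + rng.normal(0, 0.01, len(idx)))

mod = PanelOLS(df["consumption_growth"],
               df[["house_price_growth", "income_growth"]],
               entity_effects=True, time_effects=True)
print(mod.fit(cov_type="clustered", cluster_entity=True))
```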
Abstract:
The author argues that by applying problem-solving negotiation skills in the design of public policies, public administrators benefit from more effective and wide-ranging outcomes in the realization of their goals. In order to demonstrate this idea, the author analyzes how negotiation skills – such as identifying key actors and their interests, recognizing hard-bargaining tactics and changing the players, knowing your best alternative, creating value and building trust – permeated and contributed to the success of the City of São Paulo's Invoice Program ("Programa Nota Fiscal Paulistana"), a public policy aimed at combating evasion of the service tax in the City of São Paulo.
Abstract:
The value of life methodology has recently been applied to a wide range of contexts as a means to evaluate welfare gains attributable to mortality reductions and health improvements. Yet it suffers from an important methodological drawback: it does not incorporate into the analysis child mortality, individuals' decisions regarding fertility, or their altruism towards their offspring. Two interrelated dimensions of fertility choice are potentially essential in evaluating life expectancy and health-related gains. First, child mortality rates can be very important in determining welfare in a context where individuals choose the number of children they have. Second, if altruism motivates fertility, life expectancy gains at any point in life have a twofold effect: they directly increase utility via increased survival probabilities, and they increase utility via the increased welfare of the offspring. We develop a manageable way to deal with value of life valuations when fertility choices are endogenous and individuals are altruistic towards their offspring. We use the methodology developed in the paper to value the reductions in mortality rates experienced by the US between 1965 and 1995. The calculations show that, with a very conservative set of parameters, altruism and fertility can easily double the value of mortality reductions for a young adult, when compared to results obtained using the traditional value of life methodology.
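As a stylized illustration of that twofold effect, consider a Barro-Becker-type dynastic utility (this functional form is our assumption, not necessarily the paper's):

\[
V_0 = \sum_{t=0}^{T}\beta^{t} S_t\,u(c_t) + \alpha(n)\,n\,V_0^{\mathrm{child}},
\]

where \(S_t\) is the probability of surviving to age t, n the chosen number of children, and \(\alpha(n)\) the altruism weight. A mortality reduction raises the survival probabilities \(S_t\) directly (the traditional value-of-life channel) and also raises \(V_0^{\mathrm{child}}\), since the offspring enjoy the same gains; the altruism term then amplifies the total welfare value, which is the mechanism behind the doubling reported above.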
Abstract:
The aim of this paper is to analyze extremal events using Generalized Pareto Distributions (GPD), considering explicitly the uncertainty about the threshold. Current practice determines this quantity empirically and proceeds by estimating the GPD parameters based on data beyond it, discarding all the information available below the threshold. We introduce a mixture model that combines a parametric form for the center and a GPD for the tail of the distribution, and uses all observations for inference about the unknown parameters from both distributions, the threshold included. Prior distributions for the parameters are obtained indirectly through expert elicitation of quantiles. Posterior inference is available through Markov Chain Monte Carlo (MCMC) methods. Simulations are carried out in order to analyze the performance of our proposed model under a wide range of scenarios. Those scenarios approximate realistic situations found in the literature. We also apply the proposed model to a real dataset, the Nasdaq 100, a financial market index that presents many extreme events. Important issues such as predictive analysis and model selection are considered, along with possible modeling extensions.
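A minimal sketch of the mixture density described above, assuming, for illustration only, a normal parametric center below the threshold (the paper's actual choice of center and its Bayesian/MCMC machinery are not reproduced here); parameter values are invented.

```python
# Center-plus-GPD-tail mixture density: a parametric form below the
# threshold u and a rescaled generalized Pareto tail above it.
# A normal center is assumed purely for illustration.
import numpy as np
from scipy.stats import norm, genpareto

def mixture_pdf(x, u, mu, sd, xi, sigma):
    """Density: N(mu, sd) below u; (1 - F_center(u)) * GPD(xi, sigma) above u."""
    x = np.asarray(x, dtype=float)
    center = norm.pdf(x, loc=mu, scale=sd)
    tail_mass = norm.sf(u, loc=mu, scale=sd)  # P(X > u) under the center
    tail = tail_mass * genpareto.pdf(x - u, xi, scale=sigma)
    return np.where(x <= u, center, tail)

# Invented parameters: threshold u, normal center, heavy GPD tail.
xs = np.linspace(-3, 10, 7)
print(mixture_pdf(xs, u=2.0, mu=0.0, sd=1.0, xi=0.3, sigma=0.8))
```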
Abstract:
In June 2014 Brazil hosted the FIFA World Cup, and in August 2016 Rio de Janeiro will host the Summer Olympics. These two seminal sporting events will draw tens of thousands of air travelers through Brazil's airports, airports that are currently in the midst of a national modernization program to address years of infrastructure neglect and insufficient capacity. Raising Brazil's major airports to the standards air travelers experience at major airports elsewhere in the world is more than just a case of building or remodeling facilities; processes must also be examined and reworked to enhance traveler experience and satisfaction. This research paper examines the key interface between airports and airline passengers, airport check-in procedures, according to how much value and waste is associated with them. In particular, the paper makes use of a value stream mapping construct for services proposed by Martins, Cantanhede, and Jardim (2010). The uniqueness of this construct is that it attributes to each activity a certain percentage and magnitude of value or waste, which can then be ordered and prioritized for improvement. Working against a fairly commonly expressed notion in Brazil that Brazil's airports are inferior to the airports of economically advanced countries, the paper examines Rio's two major airports, Galeão International and Santos Dumont, in comparison to Washington, D.C.'s Washington National and Dulles International airports. The paper seeks to accomplish three goals:
- Determine whether there are differences in airport passenger check-in procedures between U.S. and Brazilian airports in terms of passenger value
- Present options for Brazilian government or private sector authorities to consider adopting or implementing at Brazilian airports to maximize passenger value
- Validate the Martins et al. construct for use in evaluating airport check-in procedures
Observations and analysis proved surprising in that all airports and service providers follow essentially the same check-in processes but execute them differently, yet still achieve similar overall performance in terms of value and waste. Although only a few activities are categorized as completely wasteful (and therefore removed in the revised value stream map of check-in activities), the weighting and categorization of individual activities according to their value (or waste) gives decision-makers a means to prioritize possible corrective actions. Various overall recommendations are presented based on this analysis. Most importantly, this paper demonstrates the viability of using the construct developed by Martins et al. to examine airport operations, as well as its applicability to the study of other service industry processes.
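The abstract does not give the construct's internal formulas; purely to illustrate the prioritization idea, the following hypothetical sketch assigns each check-in activity a value-versus-waste share and ranks activities for improvement. All activities, durations, and shares are invented.

```python
# Hypothetical prioritization of check-in activities by waste: each
# activity gets a value share in [0, 1] and an observed duration;
# "waste minutes" = duration * (1 - value share). All numbers invented.
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    minutes: float      # average observed duration
    value_share: float  # fraction of the activity judged value-adding

    @property
    def waste_minutes(self) -> float:
        return self.minutes * (1.0 - self.value_share)

check_in = [
    Activity("queue for counter",   12.0, 0.0),  # pure waiting: all waste
    Activity("document check",       2.0, 0.9),
    Activity("bag drop",             3.0, 0.8),
    Activity("boarding pass issue",  1.0, 1.0),
]

# The top of the ranking is where a revised value stream map would focus
# corrective action first.
for a in sorted(check_in, key=lambda a: a.waste_minutes, reverse=True):
    print(f"{a.name:22s} waste = {a.waste_minutes:4.1f} min")
```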
Abstract:
In recent years the Brazilian government has adopted a policy of encouraging infrastructure projects, with highway concessions as one of the main mechanisms. There is much discussion about the best way to remunerate the concessionaires while, at the same time, keeping users' costs reasonable and service quality high. This discussion mainly involves an analysis of traffic risk, which today is allocated entirely to the concessionaires. The methodology used in recent auctions imposes a maximum real Internal Rate of Return (IRR), set by the granting authority (ANTT), and a fixed concession term. From the costs and investments estimated for a given concession, ANTT uses the maximum IRR required for the project to define a tariff ceiling to be charged to users by the concessionaire. This IRR is calculated from the weighted average cost of capital (WACC) of companies in the sector with shares traded on the BM&F Bovespa, using only domestic data. This work proposes an alternative model based on the Least Present Value of Revenues (LPVR). In this model, traffic risk is much lower for the concessionaire, because the concession expires only when the level of revenues stated by the concessionaire is reached; that is, it requires a flexible-term model. Under this mechanism, however, with lower traffic risk, both the upside and the downside in terms of returns are smaller than under the current model. Using this model, the granting authority can also select an auction winner (the concessionaire that bids the least present value of revenues) and can use the proposed traffic simulations to set a maximum concession term, should the proposed mechanism be implemented.
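To make the flexible-term mechanism concrete, the concession length under LPVR can be written as a stopping rule (a standard formalization of the LPVR auctions proposed by Engel, Fischer, and Galetovic; the notation is ours, not the thesis's):

\[
T^{*} = \min\left\{\,T : \sum_{t=1}^{T}\frac{R_t}{(1+r)^{t}} \ \ge\ \mathrm{LPVR}^{\mathrm{bid}}\right\},
\]

where \(R_t\) is toll revenue in year t, r the discount rate, and \(\mathrm{LPVR}^{\mathrm{bid}}\) the winning (lowest) bid. Years of low traffic simply extend \(T^{*}\) rather than reducing total discounted revenue, which is why traffic risk to the concessionaire falls, at the cost of a capped upside.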
Abstract:
This article presents a comprehensive and detailed overview of the international trade performance of the manufacturing industry in Brazil over the last decades, emphasizing its participation in Global Value Chains. It uses information from recently available global input-output tables such as WIOD (World Input-Output Database) and TIVA (Trade in Value Added, OECD), as well as complementary information from the GTAP 8 (Global Trade Analysis Project) database. The calculation of a broad set of value-added-type indicators allows a precise contextualization of the ongoing structural changes in Brazilian industry, highlighting the relative isolation of its manufacturing sector from the most relevant international supply chains. The article also proposes a public policy discussion, presenting two case studies: the first related to trade facilitation and the second to preferential trade agreements. The main conclusions are twofold: first, reducing time delays at customs in Brazil may significantly improve the trade performance of its manufacturing industry, especially for the more capital-intensive sectors, which are generally the ones with greater potential for connecting to global value chains; second, extending the concept of a "preferential trade partner" to the context of the global unbundling of production may pave the way for future trade policy in Brazil, particularly in the mapping of those partners whose bilateral trade relations with Brazil should receive greater priority from policy makers.
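For reference, value-added-type trade indicators of the kind used here are typically built from the global Leontief inverse; in a standard formulation (our notation, not necessarily the article's exact indicator set),

\[
\mathrm{VAX} = \hat{v}\,(I - A)^{-1} E,
\]

where A is the inter-country input-output coefficient matrix, \(\hat{v}\) the diagonal matrix of value-added-to-gross-output ratios, and E the matrix of gross exports; the entries of \(\mathrm{VAX}\) decompose each country's exports into domestic and foreign value added, the basis for indicators such as the VAX ratio and GVC participation measures.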