966 results for Nondegenerate Parametric Oscillation
Abstract:
Diagnosis Related Groups (DRG) are frequently used to standardize the comparison of consumption variables, such as length of stay (LOS). In order to be reliable, this comparison must control for the presence of outliers, i.e. values far removed from the pattern set by the majority of the data. Indeed, outliers can distort the usual statistical summaries, such as means and variances. A common practice is to trim LOS values according to various empirical rules, but there is little theoretical support for choosing between alternative procedures. This pilot study explores the possibility of describing LOS distributions with parametric models which provide the necessary framework for the use of robust methods.
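The trimming idea mentioned in this abstract can be sketched as follows. This is a minimal illustration, not the study's procedure: the 10% trimming fraction and the length-of-stay (LOS) values are invented for the example.

```python
# Sketch: comparing a raw mean of length-of-stay (LOS) values with a
# trimmed mean that discards extreme observations. The trimming fraction
# (10% per tail) is an illustrative choice, not a rule from the study.

def trimmed_mean(values, trim_fraction=0.10):
    """Mean after dropping the lowest and highest `trim_fraction` of values."""
    ordered = sorted(values)
    k = int(len(ordered) * trim_fraction)
    kept = ordered[k:len(ordered) - k] if k > 0 else ordered
    return sum(kept) / len(kept)

# Hypothetical LOS sample (days) with two outlying stays.
los = [3, 4, 4, 5, 5, 6, 6, 7, 45, 60]
raw = sum(los) / len(los)        # pulled upward by the outliers
robust = trimmed_mean(los)       # far less sensitive to them
```

The gap between `raw` and `robust` shows why untrimmed means and variances are distorted by outliers; a parametric model for the LOS distribution would give a principled basis for choosing the trimming rule rather than an empirical one.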
Abstract:
The sample dimension, types of variables, format used for measurement, and construction of instruments to collect valid and reliable data must be considered during the research process. In the social and health sciences, and more specifically in nursing, data-collection instruments are usually composed of latent variables or variables that cannot be directly observed. Such facts emphasize the importance of deciding how to measure study variables (using an ordinal scale or a Likert or Likert-type scale). Psychometric scales are examples of instruments that are affected by the type of variables that comprise them, which could cause problems with measurement and statistical analysis (parametric tests versus non-parametric tests). Hence, investigators using these variables must rely on suppositions based on simulation studies or recommendations based on scientific evidence in order to make the best decisions.
Abstract:
China’s economic reforms, which began in 1978, resulted in remarkable income growth, and urban Chinese consumers have responded by dramatically increasing their consumption of meat, other livestock products, and fruits and by decreasing consumption of grain-based foods. Economic prosperity, a growing openness to international markets, and domestic policy reforms have changed the food marketing environment for Chinese consumers and may have contributed to shifts in consumer preferences. The objective of this paper is to uncover evidence of structural change in food consumption among urban residents in China. Both parametric and nonparametric methods are used to test for structural change in aggregate household data from 1981 to 2004. The tests provided a reasonably clear picture of changing food consumption over the study period.
Abstract:
Aim: We asked whether myocardial flow reserve (MFR) by Rb-82 cardiac PET improves the selection of patients eligible for invasive coronary angiography (ICA). Material and Methods: We enrolled 26 consecutive patients with suspected or known coronary artery disease who underwent dynamic Rb-82 PET/CT and ICA within 60 days; 4 patients who underwent revascularization or had any cardiovascular events between PET and ICA were excluded. Myocardial blood flow at rest (rMBF), at stress with adenosine (sMBF) and myocardial flow reserve (MFR=sMBF/rMBF) were estimated using the 1-compartment Lortie model (FlowQuant) for each coronary artery territory. Stenosis severity was assessed using computer-based automated edge detection (QCA). MFR was divided into 3 groups: G1:MFR<1.5, G2:1.5≤MFR<2 and G3:2≤MFR. Stenosis severity was graded as non-significant (<50% or FFR ≥0.8), intermediate (50%≤stenosis<70%) and severe (≥70%). The correlation between MFR and percentage of stenosis was assessed using a non-parametric Spearman test. Results: In G1 (44 vessels), 17 vessels (39%) had a severe stenosis, 11 (25%) an intermediate one, and 16 (36%) no significant stenosis. In G2 (13 vessels), 2 (15%) vessels presented a severe stenosis, 7 (54%) an intermediate one, and 4 (31%) no significant stenosis. In G3 (9 vessels), 0 vessels presented a severe stenosis, 1 (11%) an intermediate one, and 8 (89%) no significant stenosis. Of note, among 11 patients with 3-vessel low MFR<1.5 (G1), 9/11 (82%) had at least one severe stenosis and 2/11 (18%) had at least one intermediate stenosis. There was a significant inverse correlation between stenosis severity and MFR among all 66 territories analyzed (rho= -0.38, p=0.002). Conclusion: Patients with MFR>2 could avoid ICA. Low MFR (G1, G2) on a vessel-based analysis seems to be a poor predictor of severe stenosis. Patients with 3-vessel low MFR would benefit from ICA as they are likely to present a significant stenosis in at least one vessel.
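The Spearman analysis used in this abstract can be sketched with a minimal rank-correlation implementation. The per-vessel MFR and stenosis values below are invented for illustration; the study itself reports rho = -0.38 over 66 territories.

```python
# Sketch: Spearman rank correlation between per-vessel MFR and stenosis.
# Data are hypothetical; only the direction (inverse relation) mirrors
# the abstract's finding.

def ranks(xs):
    """Average 1-based ranks, with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical vessels: MFR falls as percent stenosis rises.
mfr      = [1.2, 1.4, 1.6, 1.9, 2.1, 2.5, 2.8]
stenosis = [ 80,  70,  60,  50,  40,  20,  10]
rho = spearman_rho(mfr, stenosis)  # perfectly monotone here, so rho = -1
```

In practice one would use a library routine (e.g. `scipy.stats.spearmanr`), which also returns the p-value reported in the abstract.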
Abstract:
Cases of fraud have occurred frequently in the world market, involving professionals from many fields, including accounting. Accounting scandals, especially the most famous ones, such as those at Enron and WorldCom, raised greater concern about the ethical conduct of accounting professionals. As a consequence, there is greater demand for transparency and reliability in the information these professionals provide. This concern aims, above all, to preserve the confidence of companies, investors, suppliers and society at large in the ethical responsibility of the accountant, which has been tarnished by involvement in the detected frauds. This study therefore aimed to examine the ethical conduct of accountants when, in the exercise of their profession, they are confronted with issues related to fraud. To that end, the factors that may influence an individual's ethical decision-making were considered through the decision-making model developed by Alves, and the factors that may motivate an individual to commit fraud through the model developed by Cressey. To answer the guiding research question, descriptive and statistical analyses of the data were carried out: frequency tables for the descriptive analysis, and the non-parametric Spearman test for the statistical analysis. The results showed that most accountants in the sample recognize the moral issue embedded in the scenarios, disagree with the acts of the agents in each scenario, and classify those acts as serious or very serious. The research also revealed that these professionals lean toward the teleological current, since the intention to act is more strongly influenced by factors such as opportunity, rationalization and, above all, pressure. Some individual factors also influenced the ethical position of the accountants interviewed in this research.
Abstract:
One of the disadvantages of old age is that there is more past than future: this, however, may be turned into an advantage if the wealth of experience and, hopefully, wisdom gained in the past can be reflected upon and throw some light on possible future trends. To an extent, then, this talk is necessarily personal, certainly nostalgic, but also self-critical and inquisitive about our understanding of the discipline of statistics. A number of almost philosophical themes will run through the talk: search for appropriate modelling in relation to the real problem envisaged, emphasis on sensible balances between simplicity and complexity, the relative roles of theory and practice, the nature of communication of inferential ideas to the statistical layman, the inter-related roles of teaching, consultation and research. A list of keywords might be: identification of sample space and its mathematical structure, choices between transform and stay, the role of parametric modelling, the role of a sample space metric, the underused hypothesis lattice, the nature of compositional change, particularly in relation to the modelling of processes. While the main theme will be relevance to compositional data analysis we shall point to substantial implications for general multivariate analysis arising from experience of the development of compositional data analysis…
Abstract:
This paper studies how the strength of intellectual property rights (IPRs) affects investments in biological innovations when the value of an innovation is stochastically reduced to zero because of the evolution of pest resistance. We frame the problem as a research and development (R&D) investment game in a duopoly model of sequential innovation. We characterize the incentives to invest in R&D under two competing IPR regimes, which differ in their treatment of the follow-on innovations that become necessary because of pest adaptation. Depending on the magnitude of the R&D cost, ex ante firms might prefer an intellectual property regime with or without a “research exemption” provision. The study of the welfare function that also accounts for benefit spillovers to consumers—which is possible analytically under some parametric conditions, and numerically otherwise—shows that the ranking of the two IPR regimes depends critically on the extent of the R&D cost.
Abstract:
How much would output increase if underdeveloped economies were to increase their levels of schooling? We contribute to the development accounting literature by describing a non-parametric upper bound on the increase in output that can be generated by more schooling. The advantage of our approach is that the upper bound is valid for any number of schooling levels with arbitrary patterns of substitution/complementarity. Another advantage is that the upper bound is robust to certain forms of endogenous technology response to changes in schooling. We also quantify the upper bound for all economies with the necessary data, compare our results with the standard development accounting approach, and provide an update on the results using the standard approach for a large sample of countries.
Abstract:
A general formalism on stochastic choice is presented. The Rationalizability and Recoverability (Identification) problems are discussed. For the identification issue, parametric examples are analyzed by means of techniques of mathematical tomography (Radon transforms).
Abstract:
In the mid-1980s, many European countries introduced fixed-term contracts. Since then their labor markets have become more dynamic. This paper studies the implications of such reforms for the duration distribution of unemployment, with particular emphasis on the changes in the duration dependence. I estimate a parametric duration model using cross-sectional data drawn from the Spanish Labor Force Survey from 1980 to 1994 to analyze the chances of leaving unemployment before and after the introduction of fixed-term contracts. I find that duration dependence has increased since such reform. Semi-parametric estimation of the model also shows that for long spells, the probability of leaving unemployment has decreased since such reform.
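The "duration dependence" notion in this abstract can be illustrated with a standard parametric duration model. In a Weibull model with scale `lam` and shape `p`, the exit hazard rises over the spell when p > 1 and falls when p < 1; the parameter values below are illustrative only, not estimates from the paper.

```python
# Sketch: Weibull hazard h(t) = (p / lam) * (t / lam) ** (p - 1).
# p > 1 gives positive duration dependence (exit chances rise with
# spell length); p < 1 gives negative duration dependence.

def weibull_hazard(t, lam, p):
    """Hazard of exiting unemployment at duration t (illustrative units)."""
    return (p / lam) * (t / lam) ** (p - 1)

# Hazards at 1, 5 and 10 periods for two hypothetical parameterizations.
rising  = [weibull_hazard(t, lam=10.0, p=1.5) for t in (1, 5, 10)]
falling = [weibull_hazard(t, lam=10.0, p=0.5) for t in (1, 5, 10)]
```

Estimating `lam` and `p` from spell data (e.g. by maximum likelihood) before and after the reform is one simple way to detect the change in duration dependence the paper reports.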
Abstract:
There are two fundamental puzzles about trade credit: why does it appear to be so expensive, and why do input suppliers engage in the business of lending money? This paper addresses and answers both questions analysing the interaction between the financial and the industrial aspects of the supplier-customer relationship. It examines how, in a context of limited enforceability of contracts, suppliers may have a comparative advantage over banks in lending to their customers because they hold the extra threat of stopping the supply of intermediate goods. Suppliers may also act as lenders of last resort, providing insurance against liquidity shocks that may endanger the survival of their customers. The relatively high implicit interest rates of trade credit result from the existence of default and insurance premia. The implications of the model are examined empirically using parametric and nonparametric techniques on a panel of UK firms.
Abstract:
Recent studies have indicated that gamma band oscillations participate in the temporal binding needed for the synchronization of cortical networks involved in short-term memory and attentional processes. To date, no study has explored the temporal dynamics of the gamma band in the early stages of dementia. At baseline, gamma band analysis was performed in 29 cases with mild cognitive impairment (MCI) during the n-back task. Based on phase diagrams, multiple linear regression models were built to explore the relationship between the cognitive status and gamma oscillation changes over time. Individual measures of phase diagram complexity were made using fractal dimension values. After 1 year, all cases were assessed neuropsychologically using the same battery. A total of 16 MCI patients showed progressive cognitive decline (PMCI) and 13 remained stable (SMCI). When adjusted for gamma values at lags -2 and -3 ms, PMCI cases displayed significantly lower average changes in gamma values than SMCI cases in both the detection and 2-back tasks. PMCI cases also displayed significantly higher gamma fractal dimension values than SMCI cases. This variable explained 11.8% of the cognitive variability in this series. Our data indicate that the progression of cognitive decline in MCI is associated with early deficits in temporal binding that occur during the activation of selective attention processes.
Abstract:
This paper presents a comparative analysis of linear and mixed models for short term forecasting of a real data series with a high percentage of missing data. Data are the series of significant wave heights registered at regular periods of three hours by a buoy placed in the Bay of Biscay. The series is interpolated with a linear predictor which minimizes the forecast mean square error. The linear models are seasonal ARIMA models and the mixed models have a linear component and a non linear seasonal component. The non linear component is estimated by a non parametric regression of data versus time. Short term forecasts, no more than two days ahead, are of interest because they can be used by the port authorities to notify the fleet. Several models are fitted and compared by their forecasting behavior.
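The pipeline this abstract describes can be sketched in simplified form: fill the gaps in a 3-hourly series, then forecast a short horizon. Plain linear interpolation and a seasonal-naive forecaster stand in here for the paper's optimal linear predictor and seasonal ARIMA / mixed models; the wave heights are invented.

```python
# Simplified stand-in for the paper's approach: linear gap-filling of a
# 3-hourly series, then a seasonal-naive short-horizon forecast.

def interpolate(series):
    """Fill None gaps by linear interpolation between known neighbours."""
    filled = list(series)
    known = [i for i, v in enumerate(filled) if v is not None]
    for a, b in zip(known, known[1:]):
        for i in range(a + 1, b):
            t = (i - a) / (b - a)
            filled[i] = filled[a] + t * (filled[b] - filled[a])
    return filled

def seasonal_naive(series, period, horizon):
    """Forecast each future step as the value one season earlier."""
    out, hist = [], list(series)
    for _ in range(horizon):
        out.append(hist[-period])
        hist.append(out[-1])
    return out

# Hypothetical significant wave heights (m) every 3 h; two missing
# readings; daily seasonality means period = 8 observations.
waves = [1.0, 1.2, 1.5, 1.8, 1.6, 1.3, 1.1, 1.0,
         1.1, None, 1.6, 1.9, None, 1.4, 1.2, 1.1]
filled = interpolate(waves)
forecast = seasonal_naive(filled, period=8, horizon=4)  # next 12 hours
```

A real seasonal ARIMA fit (e.g. `statsmodels` SARIMAX) would replace `seasonal_naive`; the sketch only shows where interpolation and forecasting sit in the workflow.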
Abstract:
In this paper we analyse the observed systematic differences in costs for teaching hospitals (TH henceforth) in Spain. Concern has been voiced regarding the existence of a bias in the financing of THs once prospective budgets are in the arena for hospital finance, and claims for adjusting to take into account the legitimate extra costs of teaching on hospital expenditure are well grounded. We focus on the estimation of the impact of teaching status on average cost. We used a version of a multiproduct hospital cost function taking into account some relevant factors from which to derive the observed differences. We assume that the relationship between the explanatory and the dependent variables follows a flexible form for each of the explanatory variables. We also model the underlying covariance structure of the data. We assumed two qualitatively different sources of variation: random effects and serial correlation. Random variation refers to both general level variation (through the random intercept) and the variation specifically related to teaching status. We postulate that the impact of the random effects is predominant over the impact of the serial correlation effects. The model is estimated by restricted maximum likelihood. Our results show that costs are 9% higher (15% in the case of median costs) in teaching than in non-teaching hospitals. That is, teaching status legitimately explains no more than half of the observed difference in actual costs. The impact on costs of the teaching factor depends on the number of residents, with an increase of 51.11% per resident for hospitals with fewer than 204 residents (third quartile of the number of residents) and 41.84% for hospitals with more than 204 residents. In addition, the estimated dispersion is higher among teaching hospitals. As a result, due to the considerable observed heterogeneity, results should be interpreted with caution. From a policy-making point of view, we conclude that since a higher relative burden for medical training is under public hospital command, an explicit adjustment to the extra costs that the teaching factor imposes on hospital finance is needed, before hospital competition for inpatient services takes place.
Abstract:
Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
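The overall shape of the procedure in this abstract, a nested sequence of model classes, a data split, and a final choice minimizing empirical risk plus a complexity term, can be shown with a toy sketch. Everything here is illustrative: the classes are piecewise-constant rules with k bins, and the penalty is a simple placeholder, not the paper's empirical-cover-based complexity.

```python
# Toy sketch of complexity-penalized model selection. Classes F_k are
# piecewise-constant predictors on [0,1) with k bins; complexity grows
# with k. The penalty 0.01 * k / n is an arbitrary illustrative choice.
import math
import random

def fit_bins(xs, ys, k):
    """Fit the best rule in F_k: the per-bin mean of the responses."""
    sums, counts = [0.0] * k, [0] * k
    for x, y in zip(xs, ys):
        b = min(int(x * k), k - 1)
        sums[b] += y
        counts[b] += 1
    means = [sums[b] / counts[b] if counts[b] else 0.0 for b in range(k)]
    return lambda x: means[min(int(x * k), k - 1)]

def risk(rule, xs, ys):
    """Empirical squared-error risk."""
    return sum((rule(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Synthetic regression data: noisy sine on [0,1).
random.seed(0)
xs = [random.random() for _ in range(400)]
ys = [math.sin(2 * math.pi * x) + random.gauss(0, 0.3) for x in xs]

# Split the data: first half fits each class, second half scores it.
half = len(xs) // 2
train, hold = (xs[:half], ys[:half]), (xs[half:], ys[half:])

best_k, best_score = None, float("inf")
for k in (1, 2, 4, 8, 16, 32, 64):
    rule = fit_bins(*train, k)
    score = risk(rule, *hold) + 0.01 * k / half  # risk + complexity
    if score < best_score:
        best_k, best_score = k, score
```

The paper's method differs in the key step: class complexity is read off from the size of an empirical cover built on the first half of the data, rather than being a fixed function of k as in this sketch.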