999 results for thematic features


Relevance:

20.00%

Publisher:

Abstract:

Efforts to combat childhood obesity in Australia are hampered by the lack of quality epidemiological data to routinely monitor the prevalence and distribution of the condition. This paper summarises the literature on issues relevant to childhood obesity monitoring and makes recommendations for implementing a school-based childhood obesity monitoring program in Australia. The primary purpose of such a program would be to collect population-level health data to inform both policy and the development and evaluation of community-based obesity prevention interventions. Recommendations are made for the types of data to be collected, data collection procedures and program management and evaluation. Data from an obesity monitoring program are crucial for directing and informing policies, practices and services, identifying subgroups at greatest risk of obesity and evaluating progress towards meeting obesity-related targets. Such data would also increase the community awareness necessary to foster change.

Relevance:

20.00%

Publisher:

Abstract:

Although the association between childhood maltreatment and the subsequent development of offending behavior is well documented, the association does not necessarily reflect a causal relationship. This paper provides a systematic review of prospective and longitudinal studies using official records of maltreatment to gain insights into the extent to which methodological variations are likely to influence the conclusions drawn about the likely relationship between maltreatment and offending. Sixty-two original studies met the inclusion criteria. These studies were assessed according to a set of seven methodological criteria: (1) inclusion of comparison groups, (2) the use of statistical controls, (3) valid outcome measures, (4) operationalization of maltreatment, (5) proper temporal order of associations, (6) data relating to unsubstantiated maltreatment, and (7) consideration of mediating and moderating factors. The strength of evidence in support of the maltreatment-offending association was influenced by a number of methodological factors. Despite the increasing sophistication of studies, there is a need to be mindful of how these factors are taken into account in future research in order to gain a deeper understanding of the adverse consequences of maltreatment and how this might influence outcomes and inform interventions.

Relevance:

20.00%

Publisher:

Abstract:

Despite the exponential growth of non-appointment-based web counselling, there is limited information on what happens in a single-session intervention. This exploratory study, involving a thematic analysis of 85 counselling transcripts of people seeking help for problem gambling, aimed to describe the presentation and content of online conversations. Viewed from the perspective of the client, presentations related either to immediate help with a crisis or to non-urgent assistance in developing strategies and skills. Almost all clients spent a great deal of time telling their story (i.e., the pattern, context, progression and impact of the problem, motivation for continuing and previous attempts to change), with less time spent exploring opportunities, readiness or self-efficacy related to change, or relevant options and strategies. These findings provide important information to inform the application of traditional counselling approaches within web-based environments.

Relevance:

20.00%

Publisher:

Abstract:

Dynamically changing background (dynamic background) still presents a great challenge to many motion-based video surveillance systems. In the context of event detection, it is a major source of false alarms. There is a strong need from the security industry either to detect and suppress these false alarms, or to dampen the effects of background changes, so as to increase the sensitivity to meaningful events of interest. In this paper, we restrict our focus to two of the most common causes of dynamic background change: swaying tree branches and their shadows under windy conditions. Considering the ultimate goal in a video analytics pipeline, we formulate a new dynamic background detection problem as a signal processing alternative to the previously described but unreliable computer-vision-based approaches. Within this new framework, we directly reduce the number of false alarms by testing whether the detected events are due to characteristic background motions. In addition, we introduce a new data set suitable for the evaluation of dynamic background detection. It consists of real-world events detected by a commercial surveillance system from two static surveillance cameras. The research question we address is whether dynamic background can be detected reliably and efficiently using simple motion features, in the presence of similar but meaningful events such as loitering. Inspired by tree aerodynamics theory, we propose a novel method named local variation persistence (LVP), which captures the key characteristics of swaying motions. The method is posed as a convex optimization problem whose variable is the local variation. We derive a computationally efficient algorithm for solving the optimization problem, the solution of which is then used to form a powerful detection statistic. On our newly collected data set, we demonstrate that the proposed LVP achieves excellent detection results and outperforms the best alternative adapted from existing art in the dynamic background literature.
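The abstract does not reproduce the LVP formulation itself. As a loose, hypothetical illustration of the kind of statistic involved (not the authors' convex-optimization method), the sketch below measures how persistently the local variation of a motion-magnitude signal stays high, which separates quasi-periodic sway from the smoother motion of, say, a loiterer; the window length and threshold are invented for the example.

```python
"""Illustrative sketch (not the authors' LVP): flag quasi-periodic 'swaying'
motion by measuring how persistently the local total variation of a motion
signal stays high across time windows."""
import numpy as np

def local_variation(x, win=25):
    """Total variation |x_t - x_{t-1}| summed over non-overlapping windows."""
    diffs = np.abs(np.diff(x))
    n_win = diffs.size // win
    return diffs[:n_win * win].reshape(n_win, win).sum(axis=1)

def swaying_score(x, win=25, var_thresh=2.0):
    """Fraction of windows whose local variation exceeds a threshold.
    High, persistent variation suggests background sway rather than a slowly
    moving object such as a loiterer (window and threshold are hypothetical)."""
    lv = local_variation(x, win)
    return float(np.mean(lv > var_thresh))

rng = np.random.default_rng(0)
t = np.arange(500)
sway = 1.0 + 0.8 * np.sin(2 * np.pi * t / 12) + 0.2 * rng.standard_normal(t.size)
loiter = 1.0 + 0.002 * t + 0.05 * rng.standard_normal(t.size)

print("sway score   :", swaying_score(sway))    # close to 1: persistent variation
print("loiter score :", swaying_score(loiter))  # close to 0: smooth motion
```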

Relevance:

20.00%

Publisher:

Abstract:

A brain-computer interface (BCI) plays an important role in communication between humans and machines. This communication is based on human brain signals: users perform tasks with their brain activity instead of limb or body movements. The brain signals are analyzed and translated into commands to control communication devices, robots or computers. The aim of this paper is to enhance the performance of BCI systems through better classification of prosthetic motor imagery tasks. The challenging part is to use only a single channel of electroencephalography (EEG). The user's task is arm-movement imagination: he or she is asked to imagine moving an arm up or down, and our system detects the imagined movement from the input brain signal. Several EEG quality features are extracted from the brain signal, and a decision tree is used to classify the participant's imagination based on the extracted features. Our system is online, meaning it can return a decision as soon as the signal is presented (taking only 20 ms). In addition, only one EEG channel is used for classification, which reduces the complexity of the system and leads to fast performance. One hundred signals were used for testing; on average, 97.4% of the up-down prosthetic motor imagery tasks were detected correctly. Owing to its high speed and accuracy, the method can be used in many applications, such as moving artificial limbs and wheelchairs.
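As a hedged sketch of the general pipeline described here, one could extract a few time-domain statistics from a single simulated EEG channel and train a decision tree; the synthetic trials and the particular features below are illustrative assumptions, not the features used in the paper.

```python
"""Minimal sketch of a single-channel motor-imagery classifier; the synthetic
signals and the simple time-domain features are illustrative assumptions."""
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)

def make_trial(label, n=256):
    """Synthetic 1-channel EEG trial: 'up' and 'down' imagery differ in the
    power of a superimposed oscillation (a stand-in for real recordings)."""
    t = np.arange(n)
    amp = 1.5 if label == 1 else 0.5
    return amp * np.sin(2 * np.pi * t / 32) + rng.standard_normal(n)

def features(x):
    """A few simple time-domain descriptors of the channel."""
    return [x.var(), np.abs(np.diff(x)).mean(), np.abs(x).max(), x.mean()]

labels = rng.integers(0, 2, size=200)
X = np.array([features(make_trial(y)) for y in labels])

X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```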

Relevance:

20.00%

Publisher:

Abstract:

Named Entity Recognition (NER) is a crucial step in text mining. This paper proposes a new graph-based technique for representing unstructured medical text. The new representation is used to extract discriminative features that are able to enhance the NER performance. To evaluate the usefulness of the proposed graph-based technique, the i2b2 medication challenge data set is used. Specifically, the 'treatment' named entities are extracted for evaluation using six different classifiers. The F-measure results of five classifiers are enhanced, with an average improvement of up to 26% in performance.
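The abstract does not detail the graph construction. One hypothetical way to derive graph-based token features is to build a word co-occurrence graph and use node statistics such as degree and PageRank as extra features appended to each token's representation before NER classification; the construction below is an assumption for illustration, not the paper's representation.

```python
"""Hypothetical illustration of graph-derived token features for NER; the
co-occurrence construction and the chosen graph statistics are assumptions."""
import itertools
import networkx as nx

sentences = [
    "patient was started on metformin 500 mg twice daily",
    "continue metformin and start lisinopril for blood pressure",
    "lisinopril was held due to low blood pressure",
]

# Build an undirected co-occurrence graph: words sharing a sentence are linked.
G = nx.Graph()
for sent in sentences:
    tokens = sent.split()
    G.add_nodes_from(tokens)
    G.add_edges_from(itertools.combinations(set(tokens), 2))

pagerank = nx.pagerank(G)

def graph_features(token):
    """Features a downstream NER classifier could append to its token vector."""
    if token not in G:
        return {"degree": 0, "pagerank": 0.0}
    return {"degree": G.degree(token), "pagerank": round(pagerank[token], 4)}

for tok in ["metformin", "lisinopril", "pressure"]:
    print(tok, graph_features(tok))
```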

Relevance:

20.00%

Publisher:

Abstract:

Blood biochemistry attributes form an important class of tests, routinely collected several times per year for many patients with diabetes. The objective of this study is to investigate the role of blood biochemistry in improving the predictive accuracy of the diagnosis of cardiac autonomic neuropathy (CAN) progression. Blood biochemistry contributes to CAN, and so it is a causative factor that can provide additional power for the diagnosis of CAN, especially in the absence of a complete set of Ewing tests. We introduce automated iterative multitier ensembles (AIME) and investigate their performance in comparison to base classifiers and standard ensemble classifiers for blood biochemistry attributes. AIME incorporate diverse ensembles into several tiers simultaneously and combine them into one automatically generated integrated system, so that one ensemble acts as an integral part of another ensemble. We carried out extensive experimental analysis using large datasets from the Diabetes Screening Research Initiative (DiScRi) project. The results of our experiments show that several blood biochemistry attributes can be used to supplement the Ewing battery for the detection of CAN in situations where one or more of the Ewing tests cannot be completed because of the individual difficulties faced by each patient in performing the tests. The results show that AIME provide higher accuracy as a multitier CAN classification paradigm. The best predictive accuracy, 99.57%, was obtained by an AIME combining Decorate on the top tier with bagging on the middle tier, built on random forests. Practitioners can use these findings to increase the accuracy of CAN diagnosis.
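AIME itself is not spelled out in the abstract. As a rough sketch of the nesting idea, where one ensemble acts as a component of another, the snippet below places random forests inside a bagging tier and that tier inside a voting ensemble; scikit-learn has no Decorate implementation, so gradient boosting stands in, and the whole construction is an assumption rather than the authors' AIME.

```python
"""Rough sketch of a multitier ensemble in the spirit of AIME; the nesting
below is an illustrative assumption built from scikit-learn components only."""
from sklearn.datasets import make_classification
from sklearn.ensemble import (BaggingClassifier, GradientBoostingClassifier,
                              RandomForestClassifier, VotingClassifier)
from sklearn.model_selection import cross_val_score

# Stand-in data; the study used DiScRi blood-biochemistry attributes instead.
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           random_state=0)

# Bottom tier: random forests. Middle tier: bagging over those forests.
middle_tier = BaggingClassifier(RandomForestClassifier(n_estimators=50,
                                                       random_state=0),
                                n_estimators=5, random_state=0)

# Top tier: a further ensemble combining the bagged forests with boosting
# (a stand-in for Decorate, which scikit-learn does not provide).
top_tier = VotingClassifier(
    estimators=[("bagged_rf", middle_tier),
                ("boosting", GradientBoostingClassifier(random_state=0))],
    voting="soft")

print("CV accuracy:", cross_val_score(top_tier, X, y, cv=5).mean().round(3))
```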

Relevance:

20.00%

Publisher:

Abstract:

Probabilistic topic models have become a standard in modern machine learning for dealing with a wide range of applications. Representing data by the reduced-dimensional mixture proportions extracted from topic models is not only richer in semantic interpretation, but can also be informative for classification tasks. In this paper, we describe the Topic Model Kernel (TMK), a topic-based kernel for Support Vector Machine classification on data processed by probabilistic topic models. The applicability of the proposed kernel is demonstrated in several classification tasks on real-world datasets. TMK outperforms existing kernels on distributional features and gives comparable results on non-probabilistic data types.
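The TMK formula is not given in the abstract. As one hedged sketch of a kernel over topic proportions, the snippet below exponentiates the negative squared Jensen-Shannon distance between documents' topic mixtures and passes the resulting Gram matrix to an SVM with a precomputed kernel; the divergence choice and the bandwidth gamma are assumptions and may differ from the published kernel.

```python
"""Sketch of an SVM kernel defined on topic proportions; the Jensen-Shannon
construction is an assumption, not necessarily the published Topic Model Kernel."""
import numpy as np
from scipy.spatial.distance import jensenshannon
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in "topic proportions": two classes drawn from different Dirichlets.
X0 = rng.dirichlet([8, 1, 1, 1, 1], size=150)
X1 = rng.dirichlet([1, 1, 1, 1, 8], size=150)
X = np.vstack([X0, X1])
y = np.array([0] * 150 + [1] * 150)

def topic_kernel(A, B, gamma=2.0):
    """K(p, q) = exp(-gamma * JS(p, q)^2) computed between all row pairs."""
    K = np.empty((A.shape[0], B.shape[0]))
    for i, p in enumerate(A):
        for j, q in enumerate(B):
            K[i, j] = np.exp(-gamma * jensenshannon(p, q) ** 2)
    return K

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="precomputed").fit(topic_kernel(X_tr, X_tr), y_tr)
print("accuracy:", clf.score(topic_kernel(X_te, X_tr), y_te))
```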

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the degree of short-run and long-run co-movement in U.S. sectoral output data by estimating sectoral trends and cycles. A theoretical model based on Long and Plosser (1983) is used to derive a reduced form for sectoral output from first principles. Cointegration and common features (cycles) tests are performed; sectoral output data seem to share a relatively high number of common trends and a relatively low number of common cycles. A special trend-cycle decomposition of the data set is performed and the results indicate a very similar cyclical behavior across sectors and a very different behavior for trends. Indeed, sectors' cyclical components appear as one. In a variance decomposition analysis, prominent sectors such as Manufacturing and Wholesale/Retail Trade exhibit relatively important transitory shocks.
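The decomposition itself is not reproduced in the abstract. As a hedged sketch of the common-cycle side, the snippet below computes the canonical correlations between sectoral growth rates and their lags and forms the usual common-feature statistic -T Σ log(1 - λ_i) over the smallest of them; the simulated data, single lag and informal read-out are illustrative assumptions.

```python
"""Hedged sketch of a common-cycle (serial-correlation common feature) check:
small canonical correlations between sectoral growth rates and their lags
suggest linear combinations that are unpredictable, i.e. a shared cycle.
Data, lag choice and the informal read-out are illustrative assumptions."""
import numpy as np

rng = np.random.default_rng(1)
T, n = 400, 3

# Simulate three sectors whose cycles load on ONE common AR(1) cycle.
common = np.zeros(T)
for t in range(1, T):
    common[t] = 0.7 * common[t - 1] + rng.standard_normal()
growth = np.outer(common, [1.0, 0.6, 1.4]) + 0.3 * rng.standard_normal((T, n))

y, w = growth[1:], growth[:-1]          # current growth vs one lag
y, w = y - y.mean(0), w - w.mean(0)

Syy, Sww, Syw = y.T @ y, w.T @ w, y.T @ w
# Squared canonical correlations between current growth and its lag.
lam = np.sort(np.linalg.eigvals(
    np.linalg.solve(Syy, Syw) @ np.linalg.solve(Sww, Syw.T)).real)

stat = -y.shape[0] * np.sum(np.log(1 - lam[:n - 1]))  # n-1 smallest correlations
print("squared canonical correlations:", lam.round(3))
print("common-feature statistic for 2 cofeature vectors:", round(stat, 2))
# A small statistic (relative to its chi-square reference, whose degrees of
# freedom are omitted here) is consistent with a single shared common cycle.
```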

Relevance:

20.00%

Publisher:

Abstract:

Despite the commonly held belief that aggregate data display short-run comovement, there has been little discussion about the econometric consequences of this feature of the data. We use exhaustive Monte-Carlo simulations to investigate the importance of restrictions implied by common-cyclical features for estimates and forecasts based on vector autoregressive models. First, we show that the "best" empirical model developed without common cycle restrictions need not nest the "best" model developed with those restrictions. This is due to possible differences in the lag-lengths chosen by model selection criteria for the two alternative models. Second, we show that the costs of ignoring common cyclical features in vector autoregressive modelling can be high, both in terms of forecast accuracy and efficient estimation of variance decomposition coefficients. Third, we find that the Hannan-Quinn criterion performs best among model selection criteria in simultaneously selecting the lag-length and rank of vector autoregressions.
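As a minimal illustration of the lag-selection step discussed here, statsmodels reports the Akaike, Schwarz and Hannan-Quinn criteria side by side for a simulated bivariate VAR; the data-generating process is an assumption for the example, not the paper's Monte-Carlo design.

```python
"""Minimal sketch of VAR lag-length selection by information criteria on
simulated data; the data-generating process is an illustrative assumption."""
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
T, n = 300, 2

# Simulate a bivariate VAR(1) whose two series share one common cycle.
A = np.array([[0.5, 0.3],
              [0.25, 0.15]])          # rank-one coefficient matrix
y = np.zeros((T, n))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.standard_normal(n)

order = VAR(y).select_order(maxlags=8)
print(order.summary())                 # AIC, BIC, HQIC and FPE per lag
print("lag chosen by Hannan-Quinn:", order.hqic)
print("lag chosen by Akaike     :", order.aic)
print("lag chosen by Schwarz    :", order.bic)
```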

Relevance:

20.00%

Publisher:

Abstract:

Despite the belief, supported by recent applied research, that aggregate data display short-run comovement, there has been little discussion about the econometric consequences of these data "features." We use exhaustive Monte-Carlo simulations to investigate the importance of restrictions implied by common-cyclical features for estimates and forecasts based on vector autoregressive and error-correction models. First, we show that the "best" empirical model developed without common-cycle restrictions need not nest the "best" model developed with those restrictions, due to the use of information criteria for choosing the lag order of the two alternative models. Second, we show that the costs of ignoring common-cyclical features in VAR analysis may be high in terms of forecasting accuracy and efficiency of estimates of variance decomposition coefficients. Although these costs are more pronounced when the lag order of VAR models is known, they are also non-trivial when it is selected using the conventional tools available to applied researchers. Third, we find that if the data have common-cyclical features and the researcher wants to use an information criterion to select the lag length, the Hannan-Quinn criterion is the most appropriate, since the Akaike and the Schwarz criteria have a tendency to over- and under-predict the lag length, respectively, in our simulations.
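For the error-correction models mentioned above, a hedged sketch of fitting a VECM with statsmodels on simulated cointegrated data follows; the cointegrating-rank selection, lag order and deterministic terms are assumptions for illustration only.

```python
"""Hedged sketch of a vector error-correction model on simulated cointegrated
data; the rank, lag order and deterministic terms below are assumptions."""
import numpy as np
from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

rng = np.random.default_rng(7)
T = 400

# Two I(1) series sharing one stochastic trend (hence one cointegrating vector).
trend = np.cumsum(rng.standard_normal(T))
y = np.column_stack([trend + 0.5 * rng.standard_normal(T),
                     0.8 * trend + 0.5 * rng.standard_normal(T)])

rank = select_coint_rank(y, det_order=0, k_ar_diff=1)   # Johansen trace test
print("selected cointegrating rank:", rank.rank)

model = VECM(y, k_ar_diff=1, coint_rank=rank.rank, deterministic="co")
res = model.fit()
print("error-correction loadings (alpha):\n", res.alpha.round(3))
print("four-step-ahead forecast:\n", res.predict(steps=4).round(2))
```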

Relevance:

20.00%

Publisher:

Abstract:

This document is a master's dissertation, submitted in partial fulfilment of the requirements for the degree of Master in Business and Public Management. The study seeks to show that the adoption of this new technology through ERP system implementation projects changes not only administrative processes but also products, services and organisational structures, and that such an implementation constitutes a large project consuming a considerable amount of an organisation's resources and time. It also seeks to show that the impacts these projects bring are felt more or less strongly by the organisation depending on a series of factors, among them resistance to change and how prepared the organisation is to face those changes, fear of job loss brought on by the adoption of a new technology, problems caused by poor communication of the changes, issues related to the prevailing organisational culture, and a lack of involvement by senior management, among others. To manage all these variables, modern organisations adopt techniques intended to guarantee the success of implementing these new technologies. The study proposed here aims to determine to what extent the use of Project Management methodologies and techniques is sufficient for such projects to achieve the success expected by the organisations. The variables that influence the outcome of a project are many, and each plays an important role that must be evaluated. The conclusions of this research show that the success of a project does not always come down to meeting the initially proposed objectives of schedule, scope and cost, as defined by the Project Management methodology. Other aspects considered by this methodology, whether applied well or poorly, also contribute to the success or failure of an ERP implementation project, with that failure reflected, or not, in meeting the schedule, the initially planned scope or the initially estimated cost. Aspects beyond the correct application of the Project Management methodology alone therefore contribute to the results achieved by the project.

Relevance:

20.00%

Publisher:

Abstract:

This study addresses two of the essential aspects of projective tests: the concept of projection and the determinants inherent in the testing situation, namely the stimulus properties of the instrument, the examiner and the situational context. The concept of projection is analysed according to the formulations of several authors, and the adequacy of the term "projective" test is questioned. With respect to the stimulus properties, the usual hypothesis that the "projective" test is ambiguous and lacks objective meaning is discussed. Regarding the examiner and the situational context, it is emphasised that the testing process involves an interaction between the examiner and the subject within the context in which the instrument is administered. This dissertation is restricted to the Rorschach, the Thematic Apperception Test and the Human Figure Drawing, as these are the instruments most widely used in psychological assessment. The concept of projection is examined in its multiple connotations, which lend themselves to distorted interpretations of the psychological mechanisms involved during projective testing. The study also seeks to support the claim that the projective response is multiply determined, and that its interpretation must treat it as the result of the interaction between stimulus, context and subject variables.

Relevance:

20.00%

Publisher:

Abstract:

The objective of this article is to study (understand and forecast) spot metal price levels and changes at monthly, quarterly, and annual horizons. The data to be used consist of metal-commodity prices at a monthly frequency from 1957 to 2012 from the International Financial Statistics of the IMF on individual metal series. We will also employ the (relatively large) list of co-variates used in Welch and Goyal (2008) and in Hong and Yogo (2009), which are available for download. Regarding short- and long-run comovement, we will apply the techniques and the tests proposed in the common-feature literature to build parsimonious VARs, which possibly entail quasi-structural relationships between different commodity prices and/or between a given commodity price and its potential demand determinants. These parsimonious VARs will later be used as forecasting models to be combined to yield optimal forecasts of metal-commodity prices. Regarding out-of-sample forecasts, we will use a variety of models (linear and non-linear, single equation and multivariate) and a variety of co-variates to forecast the returns and prices of metal commodities. With the forecasts of a large number of models (N large) and a large number of time periods (T large), we will apply the techniques put forth by the common-feature literature on forecast combinations. The main contribution of this paper is to understand the short-run dynamics of metal prices. We show theoretically that there must be a positive correlation between metal-price variation and industrial-production variation if metal supply is held fixed in the short run when demand is optimally chosen taking into account optimal production for the industrial sector. This is simply a consequence of the derived-demand model for cost-minimizing firms. Our empirical evidence fully supports this theoretical result, with overwhelming evidence that cycles in metal prices are synchronized with those in industrial production. This evidence is stronger for the global economy but holds as well, to a lesser degree, for the U.S. economy. Regarding forecasting, we show that models incorporating (short-run) common-cycle restrictions perform better than unrestricted models, with an important role for industrial production as a predictor of metal-price variation. Still, in most cases, forecast combination techniques outperform individual models.
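As a hedged, toy-scale sketch of the forecast-combination step (the individual models, the simulated data and the equal weights below are assumptions, not the paper's specification), combining a zero-return benchmark, an AR(1) and a regression on lagged industrial-production growth looks like this:

```python
"""Toy sketch of forecast combination for a commodity-return series; the
individual models and the equal-weight combination are illustrative assumptions."""
import numpy as np

rng = np.random.default_rng(3)
T = 240
ip_growth = 0.6 * np.sin(np.arange(T) / 6) + 0.3 * rng.standard_normal(T)
metal_ret = 0.5 * ip_growth + 0.4 * rng.standard_normal(T)   # linked to ind. prod.

def rolling_forecasts(start=120):
    errs = {"random_walk": [], "ar1": [], "ip_reg": [], "combination": []}
    for t in range(start, T - 1):
        y, x = metal_ret[: t + 1], ip_growth[: t + 1]
        f_rw = 0.0                                  # zero-return (random-walk) benchmark
        b_ar = np.polyfit(y[:-1], y[1:], 1)         # AR(1) by least squares
        f_ar = b_ar[0] * y[-1] + b_ar[1]
        b_ip = np.polyfit(x[:-1], y[1:], 1)         # lagged ind. prod. as predictor
        f_ip = b_ip[0] * x[-1] + b_ip[1]
        f_cmb = np.mean([f_rw, f_ar, f_ip])         # equal-weight combination
        actual = metal_ret[t + 1]
        for name, f in zip(errs, [f_rw, f_ar, f_ip, f_cmb]):
            errs[name].append((actual - f) ** 2)
    return {k: np.sqrt(np.mean(v)).round(4) for k, v in errs.items()}

print("out-of-sample RMSE:", rolling_forecasts())
```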

Relevance:

20.00%

Publisher:

Abstract:

The objective of this article is to study (understand and forecast) spot metal price levels and changes at monthly, quarterly, and annual frequencies. The data consist of metal-commodity prices at monthly and quarterly frequencies from 1957 to 2012, extracted from the IFS, and annual data for 1900-2010 provided by the U.S. Geological Survey (USGS). We also employ the (relatively large) list of co-variates used in Welch and Goyal (2008) and in Hong and Yogo (2009). We investigate short- and long-run comovement by applying the techniques and the tests proposed in the common-feature literature. One of the main contributions of this paper is to understand the short-run dynamics of metal prices. We show theoretically that there must be a positive correlation between metal-price variation and industrial-production variation if metal supply is held fixed in the short run when demand is optimally chosen taking into account optimal production for the industrial sector. This is simply a consequence of the derived-demand model for cost-minimizing firms. Our empirical evidence fully supports this theoretical result, with overwhelming evidence that cycles in metal prices are synchronized with those in industrial production. This evidence is stronger for the global economy but holds as well, to a lesser degree, for the U.S. economy. Regarding out-of-sample forecasts, our main contribution is to show the benefits of forecast-combination techniques, which outperform individual-model forecasts, including the random-walk model. We use a variety of models (linear and non-linear, single equation and multivariate) and a variety of co-variates and functional forms to forecast the returns and prices of metal commodities. Using a large number of models (N large) and a large number of time periods (T large), we apply the techniques put forth by the common-feature literature on forecast combinations. Empirically, we show that models incorporating (short-run) common-cycle restrictions perform better than unrestricted models, with an important role for industrial production as a predictor of metal-price variation.
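As a small hedged illustration of checking cycle synchronization, the snippet below extracts cyclical components from simulated metal-price and industrial-production series with an HP filter and reports their correlation; the simulated data and the HP filter (a generic cycle extractor, not the paper's decomposition) are assumptions.

```python
"""Hedged illustration of cycle synchronization between a metal-price index
and industrial production, on simulated monthly series, with an HP filter
used as a generic cycle extractor."""
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

rng = np.random.default_rng(11)
T = 360                                      # 30 years of monthly data
cycle = np.zeros(T)
for t in range(1, T):
    cycle[t] = 0.95 * cycle[t - 1] + rng.standard_normal()

# Both series share the cycle but have their own trends and noise.
log_ip = 0.002 * np.arange(T) + 0.01 * cycle + 0.005 * rng.standard_normal(T)
log_metal = 0.001 * np.arange(T) + 0.03 * cycle + 0.02 * rng.standard_normal(T)

ip_cycle, _ = hpfilter(log_ip, lamb=129600)          # monthly smoothing parameter
metal_cycle, _ = hpfilter(log_metal, lamb=129600)

corr = np.corrcoef(ip_cycle, metal_cycle)[0, 1]
print("correlation of cyclical components:", round(corr, 3))
```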