943 results for "leave to proceed"
Abstract:
A fundamental principle in practical nonlinear data modeling is the parsimony principle of constructing the minimal model that explains the training data well. Leave-one-out (LOO) cross validation is often used to estimate generalization error when choosing amongst different network architectures (M. Stone, "Cross validatory choice and assessment of statistical predictions", J. R. Statist. Soc., Ser. B, 36, pp. 117-147, 1974). Based upon the minimization of LOO criteria, namely the mean square of the LOO errors for regression and the LOO misclassification rate for classification, we present two backward elimination algorithms as model post-processing procedures. The proposed backward elimination procedures exploit an orthogonalization step that keeps the subspace spanned by the pruned model orthogonal to the deleted regressor. It is then shown that the LOO criteria used in both algorithms can be calculated via analytic recursive formulae, derived in this contribution, without actually splitting the estimation data set, thereby reducing computational expense. Compared to most other model construction methods, the proposed algorithms are advantageous in several respects: (i) there are no tuning parameters to be optimized through an extra validation data set; (ii) the procedure is fully automatic, requiring no additional stopping criterion; and (iii) the model structure selection is based directly on model generalization performance. Illustrative examples on regression and classification demonstrate that the proposed algorithms are viable post-processing methods for pruning a model to gain extra sparsity and improved generalization.
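For a concrete picture of LOO-guided pruning, the following Python sketch illustrates the general idea under simplifying assumptions: a linear-in-the-parameters model fitted by least squares, with the LOO errors obtained from the standard hat-matrix identity e_loo = e / (1 - h_ii) rather than the orthogonalization-based recursion of the paper; the function names and the greedy stopping rule are illustrative, not the authors' exact algorithm.

```python
import numpy as np

def loo_mse(X, y):
    """LOO mean squared error of a linear-in-the-parameters least-squares model,
    computed without refitting via the hat-matrix identity e_loo = e / (1 - h_ii)."""
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ theta
    XtX_inv = np.linalg.pinv(X.T @ X)
    h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)   # leverages h_ii
    return np.mean((resid / (1.0 - h)) ** 2)

def backward_eliminate(X, y):
    """Greedily delete regressor columns while the LOO MSE does not increase."""
    active = list(range(X.shape[1]))
    best = loo_mse(X[:, active], y)
    improved = True
    while improved and len(active) > 1:
        improved = False
        # score every candidate deletion by the LOO MSE of the reduced model
        scores = [(loo_mse(X[:, [j for j in active if j != k]], y), k) for k in active]
        score, k = min(scores)
        if score <= best:           # pruning does not hurt estimated generalization
            best, improved = score, True
            active.remove(k)
    return active, best
```

The greedy loop stops as soon as every candidate deletion would raise the LOO error, which mirrors the fully automatic, criterion-free stopping behaviour described in the abstract.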
Abstract:
Transient neural assemblies mediated by synchrony in particular frequency ranges are thought to underlie cognition. We propose a new approach to their detection, using empirical mode decomposition (EMD), a data-driven technique that removes the need for arbitrary bandpass filter cut-offs. Phase locking is sought between modes. We explore the features of EMD, including a quantitative assessment of its ability to preserve the phase content of signals, and proceed to develop a statistical framework with which to assess episodes of synchrony. Furthermore, we propose a new approach to ensure signal decomposition using EMD. We adapt the Hilbert spectrum to a time-frequency representation of phase locking and are able to locate synchrony successfully in time and frequency between synthetic signals reminiscent of EEG. We compare our approach, which we call EMD phase locking analysis (EMDPL), with existing methods and show it to offer improved time-frequency localisation of synchrony.
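As a rough illustration of the EMD-plus-phase-locking idea (not the full EMDPL statistical framework described above), the sketch below decomposes each signal into intrinsic mode functions and computes a windowed phase-locking value between matched modes. It assumes the third-party PyEMD package for the sifting step; the function names and window sizes are illustrative.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD   # assumed third-party package (pip install EMD-signal)

def imf_phases(signal):
    """Decompose a 1-D signal into IMFs and return the instantaneous phase of each."""
    imfs = EMD().emd(signal)                  # data-driven decomposition, no fixed bands
    return np.angle(hilbert(imfs, axis=-1))   # analytic-signal phase per IMF

def windowed_plv(phase_a, phase_b, win=256, step=64):
    """Phase-locking value between two phase series over sliding windows."""
    dphi = phase_a - phase_b
    starts = range(0, len(dphi) - win + 1, step)
    return np.array([np.abs(np.mean(np.exp(1j * dphi[s:s + win]))) for s in starts])
```

In use, one would compute imf_phases for two channels, pair modes of comparable mean frequency, and compare windowed_plv against a surrogate distribution to flag episodes of significant synchrony.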
Abstract:
The translation of an ensemble of model runs into a probability distribution is a common task in model-based prediction. Common methods for such ensemble interpretations proceed as if verification and ensemble were draws from the same underlying distribution, an assumption that is not viable for most, if any, real-world ensembles. An alternative is to consider an ensemble merely as a source of information rather than as a set of possible scenarios of reality. This approach, which looks for maps between ensembles and probability distributions, is investigated and extended. Common methods are revisited, and an improvement to standard kernel dressing, called 'affine kernel dressing' (AKD), is introduced. AKD assumes an affine mapping between ensemble and verification, typically acting not on individual ensemble members but on the entire ensemble as a whole; the parameters of this mapping are determined in parallel with the other dressing parameters, including a weight assigned to the unconditioned (climatological) distribution. These amendments to standard kernel dressing, albeit simple, can improve performance significantly and are shown to be appropriate for both overdispersive and underdispersive ensembles, unlike standard kernel dressing, which exacerbates overdispersion. Studies are presented using operational numerical weather predictions for two locations and data from the Lorenz63 system, demonstrating both effectiveness given operational constraints and statistical significance given a large sample.
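A minimal sketch of the kernel-dressing idea with an affine map, showing how an ensemble can be turned into a forecast density blended with climatology. The parameterisation below (Gaussian kernels, a Gaussian climatology, and the names a, b, sigma, alpha) is illustrative and simplified relative to the AKD scheme described above.

```python
import numpy as np
from scipy.stats import norm

def akd_density(x, ensemble, a, b, sigma, alpha, clim_mean, clim_std):
    """Forecast density from affinely dressed ensemble members (illustrative).

    Members are mapped affinely, z_i = a * x_i + b, each is dressed with a
    Gaussian kernel of width sigma, and the mixture is blended with the
    climatological distribution using weight alpha.
    """
    z = a * np.asarray(ensemble) + b
    kernels = norm.pdf(x, loc=z[:, None], scale=sigma).mean(axis=0)
    climatology = norm.pdf(x, loc=clim_mean, scale=clim_std)
    return (1 - alpha) * kernels + alpha * climatology
```

Fitting (a, b, sigma, alpha) by minimizing a proper score over past forecast-verification pairs would mirror the idea of determining the affine-map parameters in parallel with the other dressing parameters, as described above.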
Abstract:
At the end of the 20th century, we can look back on a spectacular development of numerical weather prediction, which has gone on practically uninterrupted since the middle of the century. High-resolution predictions for more than a week ahead for any part of the globe are now routinely produced, and anyone with an Internet connection can access many of these forecasts for anywhere in the world. Extended predictions for several seasons ahead are also being made; the latest El Niño event in 1997/1998 is an example of such a successful prediction. This great achievement is due to a number of factors, including progress in computational technology and the establishment of global observing systems, combined with a systematic research program with an overall strategy towards building comprehensive prediction systems for climate and weather. In this article, I will discuss the different evolutionary steps in this development and the way new scientific ideas have contributed to exploiting the available computing power efficiently and to using observations from new types of observing systems. Weather prediction is not an exact science, due to unavoidable errors in the initial data and in the models. Quantifying the reliability of a forecast is therefore essential, and probably more so the longer the forecast is. Ensemble prediction is thus a new and important concept in weather and climate prediction, which I believe will become a routine aspect of weather prediction in the future. The boundary between weather and climate prediction is becoming more and more diffuse, and in the final part of this article I will outline the way I think development may proceed in the future.
Abstract:
Widespread commercial use of the internet has significantly increased the volume and scope of data being collected by organisations. ‘Big data’ has emerged as a term to encapsulate both the technical and commercial aspects of this growing data collection activity. To date, much of the discussion of big data has centred upon its transformational potential for innovation and efficiency, yet there has been less reflection on its wider implications beyond commercial value creation. This paper builds upon normal accident theory (NAT) to analyse the broader ethical implications of big data. It argues that the strategies behind big data require organisational systems that leave them vulnerable to normal accidents, that is to say some form of accident or disaster that is both unanticipated and inevitable. Whilst NAT has previously focused on the consequences of physical accidents, this paper suggests a new form of system accident that we label data accidents. These have distinct, less tangible and more complex characteristics and raise significant questions over the role of individual privacy in a ‘data society’. The paper concludes by considering the ways in which the risks of such data accidents might be managed or mitigated.
Abstract:
The military offers a form of welfare-for-work, but when personnel leave they lose this safety net, a loss exacerbated by the rollback neoliberalism of the contemporary welfare state. Increasingly the third sector has stepped in to address veterans' welfare needs by operating within and across military/civilian and state/market/community spaces and cultures. In this paper we use both veterans' and military charities' experiences to analyse the complex politics that govern the liminal boundary zone of post-military welfare. Through exploring 'crossing' and 'bridging' we conceptualise military charities as 'boundary subjects', active yet dependent on the continuation of the civilian-military binary, and argue that the latter is better understood as a multidirectional, multiscalar and contextual continuum. Post-military welfare emerges as a competitive, confused and confusing assemblage that needs to be made more navigable in order to better support the 'heroic poor'.
Abstract:
During a four-month scholarly leave in the United States of America, researchers designed a culturally appropriate prevention program for eating disorders (ED) for Brazilian adolescent girls. The program "Se Liga na Nutricao" was modeled on other effective programs identified in a review of the research literature and was carried out over eleven interactive sessions. It was positively received by the adolescents, who suggested that it be made part of school curricula. The girls reported that it helped them to develop critical thinking skills with regard to sociocultural norms about body image, food and eating practices. (Eating Weight Disord. 15: e270-e274, 2010). (C) 2010, Editrice Kurtis
Abstract:
The paper analyses gender equality, gender equity and policies for combating inequality in the workplace, taking Sweden as a case study of an egalitarian society. The aim of the paper is to examine gender equality, gender equity and discrimination against women in the workplace, and to describe the policies for combating inequality in the Swedish welfare state. This work highlights gender equality in terms of the institutionalization of gender equality, gender equity, the gender pay gap, parental leave, gender and the pension system, sexual behaviour directed towards women, and the policies for combating inequality in society. For my research I used secondary data: fact sheets, scientific literature, statistics on Sweden from Eurostat, and case studies of Swedish society, together with theoretical explanations of the phenomena. To achieve this aim I combined qualitative and quantitative research methods. I present empirical evidence of these phenomena from Swedish society and a theoretical analysis of gender equality and equity in different walks of life. I reach the interesting conclusion that there are good policies and legislation to combat inequality in society, but there are no policies to change society's perception of male and female roles.
Abstract:
Urban sprawl is a significant issue in the United States, one effect of which is the departure of wealth from cities. This study examined the distribution of wealth in Erie County, New York, centred on Buffalo. This raises the question: why do those with money leave the city, and where do they go? While this study does not attempt to explain all of the reasons, it does examine two significant factors: the quality of public school education, and proximity to main highways with easy access to the city. Using ArcGIS, I was able to overlay the public high schools and their relative rankings on a distribution of per capita income. The results of this analysis show that the wealthiest areas are located within the best school districts. Moreover, the areas where wealth accumulates are directly connected to the city by major highways.
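The analysis described above was done in ArcGIS; as a rough open-source analogue, the following geopandas sketch overlays school locations and rankings on a per-capita-income choropleth and joins each school to the income of its tract. The file and column names are hypothetical.

```python
import geopandas as gpd
import matplotlib.pyplot as plt

# Hypothetical input layers: census tracts with per capita income and
# high-school points carrying a 'ranking' attribute.
tracts = gpd.read_file("erie_tracts_income.shp")
schools = gpd.read_file("erie_high_schools.shp").to_crs(tracts.crs)

# Choropleth of per capita income with schools overlaid, sized by ranking.
ax = tracts.plot(column="per_capita_income", cmap="Greens", legend=True, figsize=(8, 8))
schools.plot(ax=ax, color="red", markersize=schools["ranking"] * 5)
ax.set_title("Per capita income and public high-school rankings, Erie County, NY")
plt.show()

# Attach each school's tract income for a simple ranking-vs-income comparison.
joined = gpd.sjoin(schools, tracts[["per_capita_income", "geometry"]],
                   how="left", predicate="within")
print(joined[["ranking", "per_capita_income"]].corr())
```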
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
We have recently suggested that the elevated T-maze (ETM) is not a useful test for studying different types of anxiety in mice if a procedure similar to that originally validated for rats is employed. The present study investigated whether procedural (five exposures in the enclosed arm instead of the three originally described for rats) and structural (transparent instead of opaque walls) changes to the ETM lead to consistent inhibitory avoidance acquisition (IAA) and low escape latencies in mice. Results showed that five exposures to the ETM provoked consistent IAA, an effect that was independent of the ETM used. However, the ETM with transparent walls (ETMt) seemed to be more suitable for the study of conditioned anxiety (i.e. IAA) and unconditioned fear (escape) in mice, since the IAA (low baseline latency with a gradual increase over subsequent exposures) and escape (low latency) profiles rendered it sensitive to the effects of anxiolytic and anxiogenic drugs. In addition to the evaluation of drug effects on IAA and escape, the number of line crossings in the apparatus was used to control for locomotor changes. Results showed that whereas diazepam (1.0-2.0 mg/kg) and flumazenil (10-30 mg/kg) impaired IAA, FG 7142 (10-30 mg/kg) did not provoke any behavioral change. Significantly, none of these benzodiazepine (BDZ) receptor ligands modified escape latencies. The partial 5-HT1A receptor agonist buspirone (1.0-2.0 mg/kg) and the 5-HT releaser fenfluramine (0.15-0.30 mg/kg) impaired IAA and facilitated escape, while the full 5-HT1A receptor agonist 8-OH-DPAT (0.05-0.1 mg/kg) and the 5-HT2B/2C receptor antagonist SER 082 (0.5-2.0 mg/kg) failed to modify either response. mCPP (0.5-2.0 mg/kg), a 5-HT2B/2C receptor agonist, facilitated IAA but did not alter escape latency. Neither of the antidepressants utilized in the current study, imipramine (1.0-5.0 mg/kg) and moclobemide (3.0-10 mg/kg), affected IAA or escape performance in mice. The well-known anxiogenic drugs yohimbine (2.0-8.0 mg/kg) and caffeine (10-30 mg/kg) did not selectively affect IAA, although caffeine did impair escape latencies. The present results suggest that the ETMt is useful for the study of conditioned anxiety in mice. However, upon proximal threat (e.g. open arm exposure), mice do not exhibit escape behavior as an immediate defensive strategy, suggesting that latency to leave the open arm is not a useful parameter for evaluating unconditioned fear in this species. (C) 2003 Elsevier B.V. All rights reserved.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
This issue of Challenges examines the progress made thus far on childcare leave for parents, mothers and fathers alike, and turns a spotlight on the outstanding debts in this regard. Few legislative or practical measures exist for satisfying the many types of early childhood care needs, and inequalities of origin are still rife. In order to meet those needs, the policy response must be aimed at ensuring universal satisfaction of children's right to care regardless of the formal employment status (or otherwise) of their parents, and the existing models of care from birth must be thoroughly reviewed.
Abstract:
Background: There are no reported cases of factitious or simulated obsessive compulsive disorder (OCD). However, over recent years, our clinic has come across a number of individuals who seem to exaggerate, mislabel or even intentionally produce obsessive and/or compulsive symptoms in order to be diagnosed with OCD. Methods: In this study, experienced clinicians working in a university-based OCD clinic were requested to provide clinical vignettes of patients who, despite having a formal diagnosis of OCD, were felt to display non-genuine forms of this condition. Results: Ten non-consecutive patients with a self-proclaimed diagnosis of OCD were identified and described. Although the patients were diagnosed with OCD according to various structured interviews, they exhibited diverse combinations of the following features: (i) an overly technical and/or doctrinaire description of their symptoms; (ii) mounting irritability as the interviewer attempts to unveil the underlying nature of these descriptions; (iii) marked shifts in symptom patterns and disease course; (iv) an affirmative 'yes' pattern of response to interview questions; (v) multiple Axis I psychiatric disorders; (vi) cluster B features; (vii) an erratic pattern of treatment response; and (viii) excessive or contradictory drug-related side effects. Conclusions: In sum, reliance on overly structured assessments conducted by insufficiently trained or naive personnel may result in invalid OCD diagnoses, particularly those that leave no room for clinical judgment. (C) 2014 Elsevier Inc. All rights reserved.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)