876 results for statistical learning mechanisms
Abstract:
Many studies have accepted the assumption that learning is promoted when teaching styles and learning styles are well matched. In this study, the relationships between learning styles, learning patterns, gender as a selected demographic feature, and learners' performance were quantitatively investigated in a blended learning setting. This environment adopts a traditional 'one-size-fits-all' teaching approach without considering individual users' preferences and attitudes; hence, it can provide evidence about the value of taking such factors into account in Adaptive Educational Hypermedia Systems (AEHSs). Felder and Soloman's Index of Learning Styles (ILS) was used to identify the learning styles of 59 undergraduate students at the University of Babylon. Five hypotheses were investigated in the experiment. Our findings show no statistical significance for some of the assessed factors; however, the processing dimension, the total number of hits on the course website, and gender had statistically significant effects on learners' performance. This finding needs further investigation to identify the factors affecting students' achievement that should be considered in AEHSs.
Abstract:
House builders play a key role in controlling the quality of new homes in the UK. The UK house building sector is, however, currently under pressure to expand supply while conforming to tougher low-carbon planning and Building Regulation requirements, primarily in the area of sustainability. There is growing evidence that this pressure may be eroding build quality and causing an increase in defects. The prevailing defect literature is limited to the causes, pathology and statistical analysis of defects (and failures); it does not extend to how house builders, individually and collectively, collect and learn from defect experience in practice in order to reduce the prevalence of defects in future homes. The theoretical lens for the research is organisational learning. This paper contributes to our understanding of organisational learning in construction through a synthesis of the current literature, and a suitable organisational learning model is adopted. The paper concludes by reporting the research design of an ongoing collaborative action research project with the National House Building Council (NHBC), focused on developing a better understanding of house builders' localised defects analysis procedures and learning processes.
Abstract:
The paper presents research with small and medium enterprise (SME) owners who have participated in a leadership development programme. Its primary focus is on learning transfer and the factors affecting it, arguing that entrepreneurs must engage in 'action' in order to 'learn' and that under certain conditions they may transfer learning to their firm. The paper draws on data from 19 focus groups undertaken from 2010 to 2012, involving 51 participants in the LEAD Wales programme. It reviews the literature on learning transfer and develops a conceptual framework outlining four areas of focus for entrepreneurial learning. Using thematic analysis, it describes and evaluates what (facts and information) and how (techniques and styles of learning) participants transfer, and what actions they take to improve the business and develop their people. The paper illustrates the complex mechanisms involved in this process and concludes that action learning is a method of facilitating entrepreneurial learning that can help address some of the problems of engagement, relevance and value highlighted previously, and that the efficacy of an entrepreneurial learning intervention in SMEs may depend on the effectiveness of learning transfer.
Abstract:
Extinction-resistant fear is considered a central feature of pathological anxiety. Here we sought to determine whether individual differences in Intolerance of Uncertainty (IU), a potential risk factor for anxiety disorders, underlie compromised fear extinction. We tested this hypothesis by recording electrodermal activity in 38 healthy participants during fear acquisition and extinction. We assessed the temporality of fear extinction by examining early and late extinction learning. During early extinction, low IU was associated with larger skin conductance responses to learned threat vs. safety cues, whereas high IU was associated with skin conductance responding to both threat and safety cues but no cue discrimination. During late extinction, low IU showed no difference in skin conductance between learned threat and safety cues, whilst high IU predicted continued fear expression to learned threat, indexed by larger skin conductance to threat vs. safety cues. These findings suggest a critical role for uncertainty-based mechanisms in the maintenance of learned fear.
Abstract:
Sociable robots are embodied agents that are part of a heterogeneous society of robots and humans. They should be able to recognize human beings and each other, and to engage in social interactions. The use of a robotic architecture may strongly reduce the time and effort required to construct a sociable robot. Such an architecture must have structures and mechanisms that allow social interaction, behavior control, and learning from the environment. Learning processes described in the science of Behavior Analysis may lead to promising methods and structures for constructing robots able to behave socially and learn through interactions with the environment by a process of contingency learning. In this paper, we present a robotic architecture inspired by Behavior Analysis. Methods and structures of the proposed architecture, including a hybrid knowledge representation, are presented and discussed. The architecture has been evaluated in the context of a nontrivial real problem: the learning of shared attention, employing an interactive robotic head. The learning capabilities of this architecture were analyzed by observing the robot interacting with the human and the environment. The results show that the robotic architecture is able to produce appropriate behavior and to learn from social interaction. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
We study opinion dynamics in a population of interacting adaptive agents voting on a set of issues represented by vectors. We consider agents who can classify issues into one of two categories and can arrive at their opinions using an adaptive algorithm. Adaptation comes from learning and the information for the learning process comes from interacting with other neighboring agents and trying to change the internal state in order to concur with their opinions. The change in the internal state is driven by the information contained in the issue and in the opinion of the other agent. We present results in a simple yet rich context where each agent uses a Boolean perceptron to state their opinion. If the update occurs with information asynchronously exchanged among pairs of agents, then the typical case, if the number of issues is kept small, is the evolution into a society torn by the emergence of factions with extreme opposite beliefs. This occurs even when seeking consensus with agents with opposite opinions. If the number of issues is large, the dynamics becomes trapped, the society does not evolve into factions and a distribution of moderate opinions is observed. The synchronous case is technically simpler and is studied by formulating the problem in terms of differential equations that describe the evolution of order parameters that measure the consensus between pairs of agents. We show that for a large number of issues and unidirectional information flow, global consensus is a fixed point; however, the approach to this consensus is glassy for large societies.
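The asynchronous dynamics described above can be made concrete with a minimal sketch, assuming a simple perceptron update rule; the agent count, dimension, learning rate, and agreement measure below are illustrative choices, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

N_AGENTS, DIM, N_ISSUES, STEPS = 20, 10, 5, 5000
LR = 0.1  # learning rate (illustrative)

# Each agent's internal state is a weight vector; its opinion on issue x is sign(w . x).
weights = rng.normal(size=(N_AGENTS, DIM))
issues = rng.normal(size=(N_ISSUES, DIM))

def opinion(w, x):
    return 1 if w @ x >= 0 else -1

for _ in range(STEPS):
    # Asynchronous exchange: a random pair of agents discusses a random issue.
    a, b = rng.choice(N_AGENTS, size=2, replace=False)
    x = issues[rng.integers(N_ISSUES)]
    o_b = opinion(weights[b], x)
    if opinion(weights[a], x) != o_b:
        # Perceptron-style update: agent a shifts its state toward b's opinion.
        weights[a] += LR * o_b * x

# Fraction of agent pairs that agree, averaged over all issues.
ops = np.where(issues @ weights.T >= 0, 1, -1)  # shape (N_ISSUES, N_AGENTS)
agreement = float(np.mean(ops[:, :, None] == ops[:, None, :]))
print(f"mean pairwise agreement: {agreement:.2f}")
```

Whether this toy reproduces the faction/moderate-opinion transition depends on the parameters; it is meant only to make the update rule concrete, with `N_ISSUES` as the knob the abstract identifies as decisive.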
Abstract:
The reactions induced by the weakly bound (6)Li projectile interacting with the intermediate-mass target (59)Co were investigated. Light-charged-particle singles and alpha-d coincidence measurements were performed at the near-barrier energies E(lab) = 17.4, 21.5, 25.5 and 29.6 MeV. The main contributions of the different competing mechanisms are discussed. A statistical-model analysis, Continuum-Discretized Coupled-Channels (CDCC) calculations, and two-body kinematics were used as tools to disentangle the main components of these mechanisms. A significant contribution of direct breakup was observed through the difference between the experimental sequential-breakup cross section and the CDCC prediction for the non-capture breakup cross section. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
We review some issues related to the implications of different missing data mechanisms on statistical inference for contingency tables and consider simulation studies to compare the results obtained under such models to those where the units with missing data are disregarded. We confirm that although, in general, analyses under the correct missing at random and missing completely at random models are more efficient even for small sample sizes, there are exceptions where they may not improve the results obtained by ignoring the partially classified data. We show that under the missing not at random (MNAR) model, estimates on the boundary of the parameter space as well as lack of identifiability of the parameters of saturated models may be associated with undesirable asymptotic properties of maximum likelihood estimators and likelihood ratio tests; even in standard cases the bias of the estimators may be low only for very large samples. We also show that the probability of a boundary solution obtained under the correct MNAR model may be large even for large samples and that, consequently, we may not always conclude that a MNAR model is misspecified because the estimate is on the boundary of the parameter space.
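The contrast between missingness mechanisms analyzed in the abstract above can be illustrated with a minimal simulation; all probabilities below are arbitrary values chosen for the example, not figures from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Binary (X, Y) pair: P(X=1) = 0.4, P(Y=1 | X) = 0.7 or 0.3.
x = rng.random(n) < 0.4
y = rng.random(n) < np.where(x, 0.7, 0.3)

# MCAR: Y is missing with a constant probability, independent of everything.
miss_mcar = rng.random(n) < 0.3
# MNAR: Y is missing with a probability that depends on Y itself.
miss_mnar = rng.random(n) < np.where(y, 0.5, 0.1)

p_y_true = y.mean()              # estimate from the full data
p_y_mcar = y[~miss_mcar].mean()  # complete cases stay (nearly) unbiased
p_y_mnar = y[~miss_mnar].mean()  # complete cases are biased downward here
print(f"full: {p_y_true:.3f}  MCAR: {p_y_mcar:.3f}  MNAR: {p_y_mnar:.3f}")
```

Disregarding incomplete units is harmless under MCAR but systematically biased under MNAR, which is the distinction the paper's simulation studies quantify for contingency tables.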
Abstract:
Schizophrenia is a disease whose physical cause is unknown despite the attempts of several research teams to discover a physical basis for it. Some success has been gained in genetic studies, which indicate that schizophrenia is an inherited disability. However, since research tools are at present so sadly inadequate, the value of pursuing a genetic line of reasoning is questionable. To compensate for the lack of biochemical certainties in treating mental illness, psychological theories have been constructed to explain the schizophrenia syndrome. Normal personality is seen as the resultant of environmental and inherited influences. Involved in the formation of personality are the processes of differentiation and integration, the maturation of inherited traits, and the learning processes. As personality develops, consciousness of the self, inferiority feelings, compensatory mechanisms, and the transformation of interests into drives exert a decided influence upon personality growth. Finally, in the mature personality, an integrating philosophy of life, a large variety of interests, and the possibility of self-objectification become evident.
Abstract:
This thesis addresses interorganizational cooperation networks in the Brazilian context. The study examines a government public policy developed in southern Brazil aimed at increasing the competitiveness of small firms and generating economic and social development by encouraging the formation of cooperation networks among firms. The main objective of the thesis is to identify and understand the main factors that affect the management of cooperation networks. Based on a quantitative survey of a sample of 443 firms participating in 120 networks, the results revealed the main elements of network management. The Programa Redes de Cooperação (Cooperation Networks Program), developed by the Government of the State of Rio Grande do Sul, is a public policy that, since 2000, has aimed at strengthening the competitiveness of small firms and fostering regional socioeconomic development. The program rests on three pillars: a) a methodology for forming, consolidating, and expanding networks among firms; b) a regionalized implementation-support structure formed by a network of regional universities; and c) central coordination by the state government, responsible for the instruments of promotion, guidance, and support for entrepreneurs and network managers. Notably, the case studied involves 120 cooperation networks comprising three thousand firms that together employ 35,000 people and have revenues of more than US$ 1 billion. Moreover, the close relationship with the universities has enabled academic interaction at the national level that has generated theoretical and practical advances in strengthening interorganizational cooperation.
Based on the theoretical references and on evidence from exploratory studies carried out ex ante in the research field, five network-management attributes (social mechanisms, contractual aspects, motivation and commitment, integration with flexibility, and strategic organization) and five benefits (gains in scale and market power, provision of solutions, learning and innovation, reduction of costs and risks, and social relations) were identified. To confirm or reject the ten factors identified ex ante and to gauge their degree of importance, a conjoint analysis was performed on a sample of 443 firm owners drawn from a population of 3,087 members of the program's 120 networks. The empirical data were collected by the researcher in 2005 and were aggregated and processed with the statistical package SPSS version 12.0. The results of the conjoint analysis confirmed the importance of the ten identified factors. None of the factors stood out significantly from the others, indicating that all of them have a similar impact on network management. In the field of studies on interorganizational networks, the research conclusions contribute to a better understanding of the factors that influence, to a greater or lesser degree, the management of cooperation networks. They empirically demonstrate, in the Brazilian case, the coherence of theoretical postulates developed by research conducted in other contexts. Regarding public policy, the results show that promoting cooperation in networks yields competitive gains for small firms. At the organizational level, the highlighted factors can guide managers in their strategic decisions toward increasing the competitive gains of networked action.
Abstract:
What can we learn from solar neutrino observations? Is there any solution to the solar neutrino anomaly that is favored by the present experimental panorama? After the SNO results, is it possible to affirm that neutrinos have mass? To answer such questions we analyze the currently available data from the solar neutrino experiments, including the recent SNO result, in view of many acceptable solutions to the solar neutrino problem based on different conversion mechanisms, for the first time using the same statistical procedure for all of them. This allows a direct comparison of the goodness of fit among the different solutions, from which we can assess the current status of each proposed dynamical mechanism. These solutions are based on different assumptions: (a) neutrino mass and mixing, (b) a nonvanishing neutrino magnetic moment, (c) the existence of nonstandard flavor-changing and nonuniversal neutrino interactions, and (d) a tiny violation of the equivalence principle. We investigate the quality of the fit provided by each of these solutions not only to the total rates measured by all the solar neutrino experiments but also to the recoil electron energy spectrum measured at different zenith angles by the Super-Kamiokande Collaboration. We conclude that several nonstandard neutrino flavor conversion mechanisms provide a very good fit to the experimental data, comparable with (or even slightly better than) the most famous solution to the solar neutrino anomaly, based on neutrino oscillations induced by mass.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The code STATFLUX, implementing a new and simple statistical procedure for the calculation of transfer coefficients in radionuclide transport to animals and plants, is proposed. The method is based on the general multiple-compartment model, which uses a system of linear equations involving geometrical volume considerations. Flow parameters were estimated by employing two different least-squares procedures: the Derivative and Gauss-Marquardt methods, with the available experimental data of radionuclide concentrations as the input functions of time. The solution of the inverse problem, which relates a given set of flow parameters to the time evolution of the concentration functions, is achieved via a Monte Carlo simulation procedure.
Program summary
Title of program: STATFLUX
Catalogue identifier: ADYS_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYS_v1_0
Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
Licensing provisions: none
Computer for which the program is designed and others on which it has been tested: micro-computer with Intel Pentium III, 3.0 GHz
Installation: Laboratory of Linear Accelerator, Department of Experimental Physics, University of São Paulo, Brazil
Operating system: Windows 2000 and Windows XP
Programming language used: Fortran-77 as implemented in Microsoft Fortran 4.0. NOTE: Microsoft Fortran includes non-standard features which are used in this program. Standard Fortran compilers such as g77, f77, ifort and NAG95 are not able to compile the code, and therefore it has not been possible for the CPC Program Library to test the program.
Memory required to execute with typical data: 8 Mbytes of RAM and 100 MB of hard disk space
No. of bits in a word: 16
No. of lines in distributed program, including test data, etc.: 6912
No. of bytes in distributed program, including test data, etc.: 229 541
Distribution format: tar.gz
Nature of the physical problem: The investigation of transport mechanisms for radioactive substances through environmental pathways is very important for the radiological protection of populations. One such pathway, associated with the food chain, is the grass-animal-man sequence. The distribution of trace elements in humans and laboratory animals has been intensively studied over the past 60 years [R.C. Pendlenton, C.W. Mays, R.D. Lloyd, A.L. Brooks, Differential accumulation of iodine-131 from local fallout in people and milk, Health Phys. 9 (1963) 1253-1262]. In addition, investigations on the incidence of cancer in humans, and a possible causal relationship to radioactive fallout, have been undertaken [E.S. Weiss, M.L. Rallison, W.T. London, W.T. Carlyle Thompson, Thyroid nodularity in southwestern Utah school children exposed to fallout radiation, Amer. J. Public Health 61 (1971) 241-249; M.L. Rallison, B.M. Dobyns, F.R. Keating, J.E. Rall, F.H. Tyler, Thyroid diseases in children, Amer. J. Med. 56 (1974) 457-463; J.L. Lyon, M.R. Klauber, J.W. Gardner, K.S. Udall, Childhood leukemia associated with fallout from nuclear testing, N. Engl. J. Med. 300 (1979) 397-402]. Of the pathways of entry of radionuclides into the human (or animal) body, ingestion is the most important because it is closely related to life-long alimentary (or dietary) habits. Those radionuclides which are able to enter living cells by metabolic or other processes give rise to localized doses which can be very high. The evaluation of these internally localized doses is of paramount importance for the assessment of radiobiological risks and for radiological protection. The time behavior of trace concentrations in organs is the principal input for the prediction of internal doses after acute or chronic exposure.
The General Multiple-Compartment Model (GMCM) is a powerful and widely accepted method for biokinetic studies, which allows the calculation of the concentration of trace elements in organs as a function of time when the flow parameters of the model are known. However, few biokinetics data exist in the literature, and the determination of flow and transfer parameters by statistical fitting for each system is an open problem.
Restrictions on the complexity of the problem: This version of the code works with the constant-volume approximation, which is valid for many situations where the biological half-life of a trace element is lower than the volume rise time. Another restriction is related to the central-flux model: the model considered in the code assumes that there exists one central compartment (e.g., blood) that connects the flow with all other compartments, and flow between the other compartments is not included.
Typical running time: Depends on the choice of calculations. Using the Derivative Method, the time is very short (a few minutes) for any number of compartments. When the Gauss-Marquardt iterative method is used, the calculation time can be approximately 5-6 hours when around 15 compartments are considered. (C) 2006 Elsevier B.V. All rights reserved.
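To make the inverse problem concrete, here is a minimal sketch (not STATFLUX itself): a single central-to-organ transfer is simulated as a linear two-compartment system, and the transfer coefficient is recovered by a brute-force least-squares scan. The rate constants, time step, and grid are invented for the example, and the scan merely stands in for the Derivative and Gauss-Marquardt procedures the code actually uses:

```python
import numpy as np

# Hypothetical two-compartment system: central (blood) -> organ, with
# transfer coefficient K12 and elimination K10 (both in 1/day, invented values).
K12_TRUE, K10_TRUE = 0.5, 0.2
DT, T_END = 0.01, 20.0
times = np.arange(0.0, T_END, DT)

def simulate(k12, k10):
    """Euler integration of dC1/dt = -(k12 + k10) C1, dC2/dt = k12 C1."""
    c1, c2 = 1.0, 0.0  # unit activity placed in the central compartment
    out = np.empty((len(times), 2))
    for i in range(len(times)):
        out[i] = c1, c2
        c1, c2 = c1 + DT * (-(k12 + k10) * c1), c2 + DT * (k12 * c1)
    return out

data = simulate(K12_TRUE, K10_TRUE)  # stand-in for measured concentration curves

# Least-squares scan over candidate k12 values (grid step 0.01).
grid = np.linspace(0.1, 1.0, 91)
errors = [float(np.sum((simulate(k, K10_TRUE) - data) ** 2)) for k in grid]
k12_fit = float(grid[int(np.argmin(errors))])
print(f"recovered k12 = {k12_fit:.2f} (true {K12_TRUE})")
```

Real biokinetic fitting works from noisy measured concentrations and many coupled compartments, which is why the gradient-based procedures and Monte Carlo machinery described above are needed instead of a one-dimensional scan.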
Abstract:
In the present work, we propose a model for the statistical distribution of people versus the number of steps acquired by them in a learning process, based on competition, learning and natural selection. We consider that learning ability is normally distributed. We find that the number of people versus the number of steps acquired by them in a learning process follows a power law. As competition, learning and selection are also at the core of all economic and social systems, we consider power-law scaling to be a quantitative description of this process in social systems. This offers an alternative way of thinking about holistic properties of complex systems. (C) 2004 Elsevier B.V. All rights reserved.
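The abstract does not give enough detail to reproduce the authors' model, so the following is only a hedged toy of the general idea: draw a normally distributed learning ability per person, let each person advance one step per success until a first failure, and look at the resulting distribution of people over step counts. The ability parameters and the geometric stopping rule are our own assumptions, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Learning ability: normally distributed, clipped into (0, 1) and read as a
# per-step success probability (an illustrative assumption, not the paper's model).
ability = np.clip(rng.normal(0.7, 0.15, N), 0.01, 0.99)

# Each person advances until the first failure; trials to first failure are
# geometric with parameter (1 - ability), so mixing over abilities broadens the tail.
steps = rng.geometric(1.0 - ability)

counts = np.bincount(steps)  # counts[k] = number of people stopping after k trials
print("people at each step count (first 10):", counts[1:11])
```

Whether the tail of such a mixture looks like a clean power law depends strongly on the ability distribution; the sketch is only meant to make the "people versus steps" histogram in the abstract concrete.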
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)