971 results for Meta heuristic algorithm


Relevance:

20.00%

Publisher:

Abstract:

Diffusion tensor magnetic resonance imaging, which measures directional information of water diffusion in the brain, has emerged as a powerful tool for human brain studies. In this paper, we introduce a new Monte Carlo-based fiber tracking approach to estimate brain connectivity. One of the main characteristics of this approach is that all parameters of the algorithm are automatically determined at each point using the entropy of the eigenvalues of the diffusion tensor. Experimental results show the good performance of the proposed approach.
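The entropy-driven parameter selection described above can be sketched as follows; the mapping from entropy to an angular spread, and all names used, are illustrative assumptions rather than the paper's actual scheme:

```python
import math

def eigenvalue_entropy(eigenvalues):
    """Shannon entropy of the normalized diffusion-tensor eigenvalues.

    Isotropic diffusion (equal eigenvalues) gives maximum entropy;
    strongly anisotropic diffusion (one dominant eigenvalue) gives low
    entropy, signalling a well-defined fibre direction.
    """
    total = sum(eigenvalues)
    probs = [ev / total for ev in eigenvalues]
    return -sum(p * math.log(p) for p in probs if p > 0)

def step_spread(eigenvalues, max_spread_deg=60.0):
    """Hypothetical mapping: scale the angular spread of Monte Carlo
    step directions by the local entropy (illustrative only)."""
    h_max = math.log(3)  # three eigenvalues -> maximum entropy ln(3)
    return max_spread_deg * eigenvalue_entropy(eigenvalues) / h_max
```

In such a scheme, isotropic regions would get a wide sampling cone and coherent fibre regions a narrow one, with no hand-tuned thresholds.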

Relevance:

20.00%

Publisher:

Abstract:

The objective of this paper is to introduce a different approach, called the ecological-longitudinal, to carrying out pooled analysis in time-series ecological studies. Because it yields a larger number of data points and hence increases the statistical power of the analysis, this approach, unlike conventional ones, allows the accommodation of random-effect models, of lags, and of interactions between pollutants and between pollutants and meteorological variables, which are hard to implement in conventional approaches. Design: The approach is illustrated by providing quantitative estimates of the short-term effects of air pollution on mortality in three Spanish cities, Barcelona, Valencia and Vigo, for the period 1992-1994. Because the dependent variable was a count, a Poisson generalised linear model was first specified. Several modelling issues are worth mentioning. Firstly, because the relations between mortality and the explanatory variables were nonlinear, cubic splines were used for covariate control, leading to a generalised additive model (GAM). Secondly, the effects of the predictors on the response were allowed to occur with some lag. Thirdly, the residual autocorrelation, due to imperfect control, was controlled for by means of an autoregressive Poisson GAM. Finally, the longitudinal design demanded consideration of individual heterogeneity, requiring mixed models. Main results: The estimates of the relative risks obtained from the individual analyses varied across cities, particularly those associated with sulphur dioxide. The highest relative risks corresponded to black smoke in Valencia. These estimates were higher than those obtained from the ecological-longitudinal analysis. Relative risks estimated from this latter analysis were practically identical across cities: 1.00638 (95% confidence interval 1.0002, 1.0011) for a black smoke increase of 10 μg/m3 and 1.00415 (95% CI 1.0001, 1.0007) for an increase of 10 μg/m3 of sulphur dioxide. Because the statistical power is higher than in the individual analyses, more interactions were statistically significant, especially those among air pollutants and meteorological variables. Conclusions: Air pollutant levels were related to mortality in the three cities of the study, Barcelona, Valencia and Vigo. These results are consistent with similar studies in other cities and with other multicentric studies, and coherent with both the previous individual analyses for each city and the multicentric studies for all three cities.
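The relative risks quoted above come from a log-linear Poisson model, in which a coefficient per µg/m3 translates into a relative risk for a 10 µg/m3 increase via exponentiation. A minimal sketch (function names are ours, not the paper's):

```python
import math

def relative_risk(beta_per_unit, increment=10.0):
    """Relative risk for a pollutant increase of `increment` units,
    given a Poisson log-linear coefficient per unit (e.g. per ug/m3):
    RR = exp(beta * increment)."""
    return math.exp(beta_per_unit * increment)

def relative_risk_ci(beta, se, increment=10.0, z=1.96):
    """Approximate 95% CI for the relative risk, from the coefficient
    and its standard error on the log scale (Wald-type interval)."""
    lo = math.exp((beta - z * se) * increment)
    hi = math.exp((beta + z * se) * increment)
    return lo, hi
```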

Relevance:

20.00%

Publisher:

Abstract:

The project presented here aims to investigate processes for improving the argumentative-text writing competence of secondary-school students. The convergent contributions of theories and techniques from cognitive psychology and of social perspectives on language were considered. Accordingly, the revision model of Hayes and Flower (1983; 1980) and of Hayes, Flower, Schriver, Stratman and Carey (1987) was combined with the discourse-analysis model of Bronckart (2004; 1996) and the linguistic proposal of Adam (2006; 1992). The hypothesis underlying the investigation is that (meta)linguistic awareness can facilitate text revision and the improvement of argumentative-writing competence in the context of a Writing Workshop (Oficina de Escrita). The investigation combined two phases. In the intensive case-study phase, 11th-grade Portuguese students at a school in Porto made progress through sustained work in the Writing Workshop. After the experience, even students with difficulties showed command of metalanguage, greater (meta)linguistic awareness in revision, and self-regulation of their learning. In the extensive phase, a questionnaire survey was administered to a probability sample of secondary-school teachers, also from Porto. The teachers valued the aspects mentioned above, as well as the reading of argumentative texts and training in explicit knowledge of the language. However, in triangulation, the responses point to teacher-centred instruction that is little open to writing projects, to new technologies, and to sharing texts with the school and the wider community. The results indicate that argumentative-writing competence can be improved in a Writing Workshop through internalisation of the text genre and deepening of (meta)linguistic competence. In the context of the study, the influence of writing training embedded in school projects is confirmed, with the student as the subject of his or her own learning, in interaction with the environment.

Relevance:

20.00%

Publisher:

Abstract:

This paper discusses auditory brainstem response (ABR) testing for infants.

Relevance:

20.00%

Publisher:

Abstract:

This paper describes the results of an investigation into the efficacy of a feedback equalization algorithm incorporated into the Central Institute for the Deaf Wearable Digital Hearing Aid. The study examined whether feedback equalization would allow greater usable gain when subjects listened to soft speech signals and, if so, whether this would improve speech intelligibility.

Relevance:

20.00%

Publisher:

Abstract:

This paper is organized in the following way. First I deal with Hardt and Negri's Empire; the second section focuses on Beck's World Risk Society; the third tackles the functional differentiation argument posed by Buzan and Albert. By way of conclusion, the final section briefly discusses alternatives to grand narratives and master concepts.

Relevance:

20.00%

Publisher:

Abstract:

We run a standard income convergence analysis for the last decade and confirm an already established finding in the growth economics literature: EU countries are converging, and regions in Europe are also converging, but within countries regional disparities are on the rise. At the same time, there is probably no reason for EU Cohesion Policy to be concerned with what happens inside countries. Ultimately, our data show that national governments redistribute well across regions, whether they are fiscally centralised or decentralised. It is difficult to establish whether Structural and Cohesion Funds play any role in recent growth convergence patterns in Europe. Generally, macroeconomic simulations produce better results than empirical tests. It is thus possible that Structural Funds do not fully realise their potential, either because they are not efficiently allocated, are badly managed, or are used for the wrong investments, or a combination of all three. The approach to assessing the effectiveness of EU funds should be consistent with the rationale behind the post-1988 EU Cohesion Policy. Standard income convergence analysis is certainly not sufficient and should be accompanied by an assessment of changes in the efficiency of the capital stock in the recipient countries or regions, as well as by a more qualitative assessment. EU funds for competitiveness and employment should be allocated by looking at each region's capital efficiency, to maximise growth-generating effects, or on a purely competitive basis.
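A standard income (beta-)convergence test of the kind mentioned above regresses growth on log initial income; a negative slope indicates that poorer economies grow faster. A minimal stdlib sketch, not the authors' actual specification:

```python
import math

def beta_convergence_slope(initial_income, growth_rate):
    """OLS slope of growth on log initial income.

    A negative slope is the classic signature of (unconditional)
    beta-convergence: initially poorer economies grow faster.
    """
    x = [math.log(v) for v in initial_income]
    y = list(growth_rate)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var
```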

Relevance:

20.00%

Publisher:

Abstract:

An improved algorithm for the generation of gridded window brightness temperatures is presented. The primary data source is the International Satellite Cloud Climatology Project, level B3 data, covering the period from July 1983 to the present. The algorithm takes window brightness temperatures from multiple satellites, both geostationary and polar orbiting, which have already been navigated and normalized radiometrically to the National Oceanic and Atmospheric Administration's Advanced Very High Resolution Radiometer, and generates 3-hourly global images on a 0.5 degrees by 0.5 degrees latitude-longitude grid. The gridding uses a hierarchical scheme based on spherical kernel estimators. As part of the gridding procedure, the geostationary data are corrected for limb effects using a simple empirical correction to the radiances, from which the corrected temperatures are computed. This is in addition to the application of satellite zenith angle weighting to downweight limb pixels in preference to nearer-nadir pixels. The polar orbiter data are windowed on the target time with temporal weighting to account for the noncontemporaneous nature of the data. Large regions of missing data are interpolated from adjacent processed images using a form of motion-compensated interpolation based on the estimation of motion vectors using a hierarchical block matching scheme. Examples are shown of the various stages in the process. Also shown are examples of the usefulness of this type of data in GCM validation.
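The satellite zenith-angle weighting described above downweights limb pixels relative to near-nadir pixels. A minimal sketch using a cosine weight, which is an illustrative choice rather than the published weighting function:

```python
import math

def zenith_weighted_mean(temps_k, zenith_angles_deg):
    """Weighted mean brightness temperature (K) for one grid cell.

    Pixels viewed at a large satellite zenith angle (near the limb)
    receive a small weight; near-nadir pixels dominate the average.
    The cos(zenith) weight is illustrative, not the paper's scheme.
    """
    weights = [math.cos(math.radians(z)) for z in zenith_angles_deg]
    return sum(w * t for w, t in zip(weights, temps_k)) / sum(weights)
```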

Relevance:

20.00%

Publisher:

Abstract:

Modern methods of spawning new technological motifs are not appropriate when it is desired to realize artificial life as an actual real-world entity unto itself (Pattee 1995; Brooks 2006; Chalmers 1995). Many fundamental aspects of such a machine are absent from common methods, which generally lack methodologies of construction. In this paper we mix classical and modern studies in an attempt to realize an artificial life form from first principles. A model of an algorithm is introduced, its methodology of construction is presented, and the fundamental source from which it sprang is discussed.

Relevance:

20.00%

Publisher:

Abstract:

An algorithm is presented for the generation of molecular models of defective graphene fragments, containing a majority of 6-membered rings with a small number of 5- and 7-membered rings as defects. The structures are generated from an initial random array of points in 2D space, which are then subjected to Delaunay triangulation. The dual of the triangulation forms a Voronoi tessellation of polygons with a range of ring sizes. An iterative cycle of refinement, involving deletion and addition of points followed by further triangulation, is performed until the user-defined criteria for the number of defects are met. The array of points and connectivities are then converted to a molecular structure and subjected to geometry optimization using a standard molecular modeling package to generate final atomic coordinates. On the basis of molecular mechanics with minimization, this automated method can generate structures that conform to user-supplied criteria and avoid the potential bias associated with the manual building of structures. One application of the algorithm is the generation of structures for the evaluation of the reactivity of different defect sites. Ab initio electronic structure calculations on a representative structure indicate preferential fluorination close to 5-ring defects.
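The user-defined defect criterion driving the refinement loop above can be expressed as a simple check on the ring-size distribution of the current tessellation; the fractional threshold used here is an illustrative assumption (the triangulation itself would typically come from a computational-geometry library such as scipy.spatial):

```python
def defect_fraction(ring_sizes):
    """Fraction of rings that are not 6-membered, i.e. the 5- and
    7-ring defects in an otherwise hexagonal graphene fragment."""
    return sum(1 for r in ring_sizes if r != 6) / len(ring_sizes)

def criteria_met(ring_sizes, max_fraction=0.1):
    """Illustrative stopping test for the refinement cycle: accept the
    tessellation once the defect fraction is at or below a user-chosen
    threshold (the 10% default is our assumption, not the paper's)."""
    return defect_fraction(ring_sizes) <= max_fraction
```

In the refinement cycle, points would be added or deleted and the region re-triangulated until `criteria_met` returns true.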

Relevance:

20.00%

Publisher:

Abstract:

This paper describes a novel numerical algorithm for simulating the evolution of fine-scale conservative fields in layer-wise two-dimensional flows, the most important examples of which are the earth's atmosphere and oceans. The algorithm combines two radically different algorithms, one Lagrangian and the other Eulerian, to achieve an unexpected gain in computational efficiency. The algorithm is demonstrated for multi-layer quasi-geostrophic flow, and results are presented for a simulation of a tilted stratospheric polar vortex and of nearly-inviscid quasi-geostrophic turbulence. The turbulence results contradict previous arguments and simulation results that have suggested an ultimately two-dimensional, vertically-coherent character of the flow. Ongoing extensions of the algorithm to the generally ageostrophic flows characteristic of planetary fluid dynamics are outlined.

Relevance:

20.00%

Publisher:

Abstract:

Active queue management (AQM) policies are those policies of router queue management that allow for the detection of network congestion, the notification of such occurrences to the hosts on the network borders, and the adoption of a suitable control policy. This paper proposes the adoption of a fuzzy proportional integral (FPI) controller as an active queue manager for Internet routers. The analytical design of the proposed FPI controller is carried out in analogy with a proportional integral (PI) controller, which has recently been proposed for AQM. A genetic algorithm is proposed for tuning the FPI controller parameters with respect to optimal disturbance rejection. The paper describes the FPI controller design methodology and presents the results of a comparison with random early detection (RED), tail drop, and the PI controller.
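The PI controller that the FPI design is benchmarked against can be sketched as a discrete update of the packet-drop probability from the queue-length error; the gains below are illustrative, not the tuned values from the paper:

```python
def pi_aqm_step(p, queue_len, q_ref, prev_queue_len, a=1.8e-4, b=1.7e-4):
    """One sampling-interval update of a discrete PI AQM controller.

    The drop probability p moves with the current queue-length error
    and, through the second term, with its change since the previous
    sample; gains a and b are illustrative placeholders.
    """
    p += a * (queue_len - q_ref) - b * (prev_queue_len - q_ref)
    return min(max(p, 0.0), 1.0)  # drop probability stays in [0, 1]
```

A queue persistently above its reference length drives the drop probability up, signalling congestion to TCP senders; the fuzzy variant would replace the fixed gains with a fuzzy inference over the same error signals.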

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we present a distributed computing framework for problems characterized by a highly irregular search tree, whereby no reliable workload prediction is available. The framework is based on a peer-to-peer computing environment and dynamic load balancing. The system allows for dynamic resource aggregation, does not depend on any specific meta-computing middleware, and is suitable for large-scale, multi-domain, heterogeneous environments, such as computational Grids. Dynamic load balancing policies based on global statistics are known to provide optimal load balancing performance, while randomized techniques provide high scalability. The proposed method combines both advantages and adopts distributed job-pools and a randomized polling technique. The framework has been successfully adopted in a parallel search algorithm for subgraph mining and evaluated on a molecular compounds dataset. The parallel application has shown good scalability and close-to-linear speedup in a distributed network of workstations.
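The randomized polling technique described above can be sketched as an idle worker polling one randomly chosen peer and taking half of its job pool; the data layout and function name are our illustrative assumptions, not the framework's API:

```python
import random

def random_poll_steal(pools, idle_id, rng=random):
    """One randomized-polling step: the idle worker `idle_id` picks a
    random peer with a non-empty job pool and transfers half of that
    peer's jobs to its own pool. Returns True if any work moved."""
    peers = [i for i in pools if i != idle_id and pools[i]]
    if not peers:
        return False          # no peer has work to share
    victim = rng.choice(peers)
    half = len(pools[victim]) // 2
    if half == 0:
        return False          # victim has too little work to split
    pools[idle_id].extend(pools[victim][-half:])
    del pools[victim][-half:]
    return True
```

Because each poll targets a uniformly random peer, no global queue or coordinator is needed, which is what gives randomized techniques their scalability relative to policies based on global statistics.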