845 results for Heterogeneous class
Abstract:
Attention-deficit/hyperactivity disorder (ADHD) is increasingly recognized in adults. Psychostimulants are the first-line treatment, but they are sometimes not tolerated, and may be contraindicated or ineffective. Non-stimulant medications are an alternative but have been insufficiently explored. This thesis presents a randomized controlled trial of 30 subjects with ADHD who received either duloxetine 60 mg per day or placebo for a period of 6 weeks. The Conners' Adult ADHD Rating Scale (CAARS) and the Clinical Global Impression scale (CGI) were used to measure symptom severity and clinical improvement. The Hamilton Anxiety Rating Scale (HARS) and the Hamilton Depression Rating Scale (HDRS) were chosen to assess the impact on anxious and depressive symptoms. The results show that subjects who received duloxetine had a lower CGI-Severity (CGI-S) score than the control group after 6 weeks of treatment and a greater improvement on the CGI-Improvement (CGI-I). This group also showed larger decreases in scores on several CAARS subscales. No effect was observed on the HARS or the HDRS. The withdrawal rate in the duloxetine arm, however, calls into question the starting dose chosen in this protocol. Duloxetine therefore appears to be a promising option for the treatment of ADHD in adults, and replication of the clinical data would be the next step to confirm these results.
Abstract:
This work is an action research project carried out in a primary school in the Lisbon district, specifically in a 4th-grade class, prompted by two pupils diagnosed with intellectual and developmental difficulties who attended a specialized support unit for pupils with multiple disabilities and congenital deafblindness. In parallel, the teachers of the school cluster's special education team were also involved. The characterization of these three contexts (the class, the unit and the team) was carried out using the following data collection instruments: document analysis, naturalistic observation, semi-structured interviews and sociometry. Their analysis led to the following findings: a very heterogeneous class, with different learning levels, pupils whose first language is not Portuguese, of different ethnicities and socioeconomic levels, relationship problems among them and some resistance to accepting classmates regarded as having intellectual and developmental difficulties; passivity and little responsibility in the act of learning; difficulty complying with rules; scarce participation of families in the teaching/learning process; a predominantly expository style of teaching; essentially individualized work on the part of the pupils; difficulty including those whose individual differences are most significant, with decontextualized pedagogical differentiation. In the specialized support unit, the need for more cooperative and convergent work among all those involved was identified.
In the special education team, it was found that the support practices for the sixty pupils regarded as having permanent special educational needs were essentially centered on direct support to the pupil, outside the classroom. Taking as its conceptual frame of reference the diversity paradigm, inclusion, the inclusive school and inclusive education, inclusive pedagogical differentiation, cooperative learning and, as already mentioned, an approach based on the premises of action research as a cyclical process of reflecting in order to act and reflecting on the action, actions were developed in the aforementioned intervention contexts. In the class: inclusive pedagogical differentiation and the use of cooperative teaching/learning methodologies in the area of Portuguese language. In the specialized support unit: individualized intervention with the "case" pupils to develop the competences set out in their individual specific curricula, based on coordinated and cooperative work among teachers, families, operational assistants and the speech therapist. In the special education team: an awareness-raising session on the importance of pedagogical partnership with mainstream teachers for the implementation of inclusive differentiated pedagogy and cooperative learning as inclusion strategies. Considering the complexity of the intervention contexts and the professional competence so necessary for the success of all pupils and for school improvement, this action research project, through the professional enrichment it provided, showed that this type of approach (the teacher as a critical and reflective researcher) is indeed a very important avenue of continuing professional development.
In this case, the development of this project made it possible to find some answers that contributed to the professional improvement of those involved and of the contexts in question, mitigating the problems identified and opening up the prospect of other responses. However, this approach was not an easy process, since the capacity to make critical judgements about one's own work requires time to learn and, above all, the will to always do more and better.
Abstract:
The interaction of reducing sugars, such as aldoses, with proteins, and the subsequent molecular rearrangements, produce irreversible advanced glycation end-products (AGEs), a heterogeneous class of non-enzymatically glycated proteins or lipids. AGEs form cross-links, trap macromolecules and release reactive oxygen intermediates. AGEs are linked to aging, and their levels increase in several age-related diseases. The aim of this study was to assess, in a murine macrophage cell line, J774A.1, the effects of 48 h of exposure to glycated serum containing a known amount of pentosidine, a well-known AGE found in the plasma and tissues of diabetic and uremic subjects. Fetal bovine serum was incubated with ribose (50 mM) for 7 days at 37 °C to obtain about 10 nmol/ml of pentosidine. The cytotoxic parameters studied were cell morphology and viability, measured by neutral red uptake, lactate dehydrogenase release and the tetrazolium salt test. In the medium and in the intracellular compartment, bound and free pentosidine were evaluated by HPLC, as sensitive and specific glycative markers, and thiobarbituric acid reactive substances (TBARs) were evaluated as an index of the extent of lipid peroxidation. Our results confirm that macrophages are able to take up pentosidine. It is conceivable that bound pentosidine is degraded and free pentosidine is released inside the cell and then into the medium. The AGE increase in the medium was combined with an increase in TBARs, indicating that oxidative stress occurred; marked cytotoxic effects were observed, followed by the release of free pentosidine and TBARs into the culture medium.
Abstract:
Review of: Ingvelde Scholz: Das heterogene Klassenzimmer. Differenziert unterrichten. Göttingen: Vandenhoeck & Ruprecht, 2012 (144 pp.; ISBN 978-3-525-70133-1)
Abstract:
In this paper we consider ATM networks in which the virtual path (VP) concept is implemented. The question of how to multiplex two or more diverse traffic classes while providing different quality of service (QoS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In the integration approach, all the traffic from different connections is multiplexed onto one VP. This implies that the most restrictive QoS requirements must be applied to all services; link utilization therefore decreases, because unnecessarily stringent QoS is provided to all connections. With the segregation approach, the problem can be much simplified if the different types of traffic are separated by assigning each a VP with dedicated resources (buffers and links). Resources may then not be efficiently utilized, because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated as the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we carry out bandwidth allocation directly in terms of the PC. We first focus on the influence of some parameters (CLP, bit rate and burstiness) on the capacity required by a VP supporting a single traffic class, using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe under which conditions each approach is preferred.
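A minimal sketch of the kind of convolution computation the abstract refers to, assuming homogeneous independent on/off sources; the peak rate, activity probability and link capacity below are illustrative values, not taken from the paper:

```python
import numpy as np

def congestion_probability(n_sources, peak_rate, p_active, capacity):
    """Probability that the aggregate demand of n independent on/off
    sources exceeds the link (or VP) capacity.

    The aggregate bandwidth distribution is built by repeatedly
    convolving the two-point (0 / peak_rate) distribution of one source.
    """
    # Distribution of a single source over bandwidth levels 0..peak_rate
    single = np.zeros(peak_rate + 1)
    single[0] = 1.0 - p_active
    single[peak_rate] = p_active

    # Convolve n times to obtain the aggregate demand distribution
    agg = np.array([1.0])
    for _ in range(n_sources):
        agg = np.convolve(agg, single)

    # Probability mass on bandwidth levels strictly above the capacity
    return agg[capacity + 1:].sum()

# Example: 20 sources, peak 2 Mbit/s, active 30% of the time, 16 Mbit/s link
pc = congestion_probability(20, 2, 0.3, 16)
```

For identical two-level sources this reduces to a binomial tail, but the convolution formulation extends directly to heterogeneous source distributions, which is the point of the approach.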
Abstract:
This paper analyzes a continuous-time stochastic model in which the decision-maker discounts instantaneous utilities and the final function at constant but different rates of time preference. In this setting one can model problems in which, as time approaches the final moment, the valuation of the final function increases relative to the instantaneous utilities. This type of asymmetry can be described neither with standard nor with variable discounting. In order to obtain time-consistent solutions, the stochastic dynamic programming equation is derived, whose solutions are Markovian equilibria. For this type of time preferences, the classical consumption and portfolio model (Merton, 1971) is studied for utility functions of the CRRA and CARA type, comparing the Markovian equilibria with the time-inconsistent solutions. Finally, the introduction of a random terminal time is discussed.
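The discounting structure described here can be written, in hypothetical notation (two constant rates $\rho$ and $\delta$, instantaneous utility $u$, final function $F$ over terminal wealth $W_T$), as:

```latex
\max_{c}\; \mathbb{E}\!\left[\int_{t}^{T} e^{-\rho\,(s-t)}\, u(c_s)\,\mathrm{d}s
  \;+\; e^{-\delta\,(T-t)}\, F(W_T)\right], \qquad \rho \neq \delta .
```

When $\delta < \rho$, the final function loses value more slowly than the instantaneous utilities, so its relative weight grows as $t$ approaches $T$: exactly the asymmetry that a single discount rate, constant or variable, cannot capture.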
Abstract:
This paper analyzes the effect of introducing time-inconsistent preferences on the optimal decisions about consumption, investment and life insurance purchase. Specifically, it aims to capture the growing importance that an individual attaches to the bequest he or she leaves and to the wealth available for retirement over the course of his or her working life. To this end, we start from a continuous-time stochastic model with random terminal time and introduce heterogeneous discounting, considering an agent with a known residual lifetime distribution. In order to obtain time-consistent solutions, a non-standard dynamic programming equation is solved. Explicit solutions are found for utility functions of the CRRA and CARA type. Finally, the results obtained are illustrated numerically.
Abstract:
Presently, conditions ensuring the validity of bootstrap methods for the sample mean of (possibly heterogeneous) near epoch dependent (NED) functions of mixing processes are unknown. Here we establish the validity of the bootstrap in this context, extending the applicability of bootstrap methods to a class of processes broadly relevant for applications in economics and finance. Our results apply to two block bootstrap methods: the moving blocks bootstrap of Künsch (1989) and Liu and Singh (1992), and the stationary bootstrap of Politis and Romano (1994). In particular, the consistency of the bootstrap variance estimator for the sample mean is shown to be robust against heteroskedasticity and dependence of unknown form. The first order asymptotic validity of the bootstrap approximation to the actual distribution of the sample mean is also established in this heterogeneous NED context.
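As an illustration of the first of these schemes, here is a minimal sketch of the moving blocks bootstrap for the variance of the sample mean; the AR(1) test series and all tuning parameters (block length, replication count) are illustrative choices, not taken from the paper:

```python
import numpy as np

def moving_blocks_bootstrap_var(x, block_len, n_boot=2000, seed=0):
    """Moving-blocks-bootstrap estimate of Var(sqrt(n) * sample mean).

    Overlapping blocks of length `block_len` are drawn with replacement
    and concatenated until a series of length n is rebuilt; resampling
    whole blocks preserves the short-range dependence within each block.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = len(x)
    n_blocks = n - block_len + 1                  # number of overlapping blocks
    blocks_per_series = int(np.ceil(n / block_len))
    means = np.empty(n_boot)
    for b in range(n_boot):
        starts = rng.integers(0, n_blocks, size=blocks_per_series)
        series = np.concatenate([x[s:s + block_len] for s in starts])[:n]
        means[b] = series.mean()
    return n * means.var()

# Example on a dependent AR(1)-type series (illustrative parameters)
rng = np.random.default_rng(1)
e = rng.standard_normal(500)
x = np.empty(500)
x[0] = e[0]
for t in range(1, 500):
    x[t] = 0.5 * x[t - 1] + e[t]
v = moving_blocks_bootstrap_var(x, block_len=10)
```

Unlike an i.i.d. bootstrap, the block scheme targets the long-run variance of the mean, which for dependent data differs from the marginal variance divided by n.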
Abstract:
This thesis explores the capabilities of heterogeneous multi-core systems based on multiple Graphics Processing Units (GPUs) in a standard desktop framework. Multi-GPU accelerated deskside computers are an appealing alternative to other high performance computing (HPC) systems: being composed of commodity hardware components fabricated in large quantities, their price-performance ratio is unparalleled in the world of high performance computing. Essentially bringing "supercomputing to the masses", this opens up new possibilities for application fields where investing in HPC resources had been considered unfeasible before. One of these is the field of bioelectrical imaging, a class of medical imaging technologies that occupy a low-cost niche next to million-dollar systems like functional Magnetic Resonance Imaging (fMRI). In the scope of this work, several computational challenges encountered in bioelectrical imaging are tackled with this new kind of computing resource, striving to help these methods approach their true potential. Specifically, the following main contributions were made. Firstly, a novel dual-GPU implementation of parallel triangular matrix inversion (TMI) is presented, addressing a crucial kernel in the computation of multi-mesh head models for electroencephalographic (EEG) source localization. This includes not only a highly efficient implementation of the routine itself, achieving excellent speedups over an optimized CPU implementation, but also a novel GPU-friendly compressed storage scheme for triangular matrices. Secondly, a scalable multi-GPU solver for non-Hermitian linear systems was implemented. It is integrated into a simulation environment for electrical impedance tomography (EIT) that requires frequent solution of complex systems with millions of unknowns, a task that this solution can perform within seconds. In terms of computational throughput, it outperforms not only a highly optimized multi-CPU reference but related GPU-based work as well.
Finally, a GPU-accelerated graphical EEG real-time source localization software package was implemented. Thanks to acceleration, it can meet real-time requirements in unprecedented anatomical detail while running more complex localization algorithms. Additionally, a novel implementation to extract anatomical priors from static Magnetic Resonance (MR) scans has been included.
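The recursive block structure that makes triangular matrix inversion amenable to a dual-GPU split can be sketched as follows; this is a plain NumPy illustration of the standard blocked recursion, not the thesis's implementation:

```python
import numpy as np

def lower_tri_inv(L):
    """Invert a lower-triangular matrix by blocked recursion:

        [A 0]^-1   [ A^-1             0    ]
        [B C]    = [ -C^-1 B A^-1   C^-1   ]

    The two diagonal sub-inversions are independent of each other,
    which is what makes the kernel attractive to split across devices.
    """
    n = L.shape[0]
    if n == 1:
        return np.array([[1.0 / L[0, 0]]])
    k = n // 2
    A, B, C = L[:k, :k], L[k:, :k], L[k:, k:]
    Ainv = lower_tri_inv(A)   # independent subproblem 1
    Cinv = lower_tri_inv(C)   # independent subproblem 2
    out = np.zeros_like(L, dtype=float)
    out[:k, :k] = Ainv
    out[k:, k:] = Cinv
    out[k:, :k] = -Cinv @ B @ Ainv   # coupling term combines both halves
    return out

# Example: random well-conditioned lower-triangular matrix
L = np.tril(np.random.default_rng(0).uniform(1.0, 2.0, (8, 8)))
Linv = lower_tri_inv(L)
```

The coupling term `-C^-1 B A^-1` is the only step needing both halves' results, so the two recursive inversions can proceed concurrently before a final pair of matrix multiplications.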
Abstract:
We propose a novel class of models for functional data exhibiting skewness or other shape characteristics that vary with spatial or temporal location. We use copulas so that the marginal distributions and the dependence structure can be modeled independently. Dependence is modeled with a Gaussian or t-copula, so that there is an underlying latent Gaussian process. We model the marginal distributions using the skew t family. The mean, variance, and shape parameters are modeled nonparametrically as functions of location. A computationally tractable inferential framework for estimating heterogeneous asymmetric or heavy-tailed marginal distributions is introduced. This framework provides a new set of tools for increasingly complex data collected in medical and public health studies. Our methods were motivated by and are illustrated with a state-of-the-art study of neuronal tracts in multiple sclerosis patients and healthy controls. Using the tools we have developed, we were able to find those locations along the tract most affected by the disease. However, our methods are general and highly relevant to many functional data sets. In addition to the application to one-dimensional tract profiles illustrated here, higher-dimensional extensions of the methodology could have direct applications to other biological data including functional and structural MRI.
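A minimal sketch of the copula construction described here, with the skew t marginals replaced by symmetric Student-t marginals (SciPy ships no skew t distribution) and an illustrative two-dimensional correlation matrix:

```python
import numpy as np
from scipy import stats

def sample_gaussian_copula_t_margins(n, corr, df, seed=0):
    """Draw n samples whose dependence comes from a Gaussian copula
    with correlation matrix `corr`, and whose marginal distributions
    are Student-t with `df` degrees of freedom.

    The marginals and the dependence structure are specified
    independently, which is the key property the paper exploits.
    """
    rng = np.random.default_rng(seed)
    d = corr.shape[0]
    z = rng.multivariate_normal(np.zeros(d), corr, size=n)  # latent Gaussian
    u = stats.norm.cdf(z)        # probability-integral transform: uniforms
    return stats.t.ppf(u, df)    # heavy-tailed marginals via inverse CDF

corr = np.array([[1.0, 0.7],
                 [0.7, 1.0]])
x = sample_gaussian_copula_t_margins(5000, corr, df=5)
```

Swapping `stats.t` for any other continuous family changes only the marginals, leaving the latent Gaussian dependence untouched: the modularity that lets the paper model location-varying skewness and tail weight separately from spatial dependence.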
Abstract:
Airway disease in childhood comprises a heterogeneous group of disorders. Attempts to distinguish different phenotypes have generally considered few disease dimensions. The present study examines phenotypes of childhood wheeze and chronic cough, by fitting a statistical model to data representing multiple disease dimensions. From a population-based, longitudinal cohort study of 1,650 preschool children, 319 with parent-reported wheeze or chronic cough were included. Phenotypes were identified by latent class analysis using data on symptoms, skin-prick tests, lung function and airway responsiveness from two preschool surveys. These phenotypes were then compared with respect to outcome at school age. The model distinguished three phenotypes of wheeze and two phenotypes of chronic cough. Subsequent wheeze, chronic cough and inhaler use at school age differed clearly between the five phenotypes. The wheeze phenotypes shared features with previously described entities and partly reconciled discrepancies between existing sets of phenotype labels. This novel, multidimensional approach has the potential to identify clinically relevant phenotypes, not only in paediatric disorders but also in adult obstructive airway diseases, where phenotype definition is an equally important issue.
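Latent class analysis of binary indicators, as used here, can be sketched with a minimal EM algorithm; the two-class model, the simulated "symptom" data and all parameters below are illustrative, not the study's actual model:

```python
import numpy as np

def latent_class_em(X, n_classes=2, n_iter=200, seed=0):
    """Minimal EM for a latent class model on binary indicators.

    X: (n, d) 0/1 matrix (e.g. symptom present / absent).
    Items are assumed conditionally independent given the class.
    Returns class weights, per-class item probabilities, and
    posterior class memberships.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)           # class weights
    theta = rng.uniform(0.25, 0.75, (n_classes, d))    # item probabilities
    for _ in range(n_iter):
        # E-step: posterior P(class | observation) in log space
        log_p = (X @ np.log(theta).T
                 + (1 - X) @ np.log(1 - theta).T
                 + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights and item probabilities
        nk = resp.sum(axis=0)
        pi = nk / n
        theta = np.clip((resp.T @ X) / nk[:, None], 1e-6, 1 - 1e-6)
    return pi, theta, resp

# Simulated cohort: two hypothetical phenotypes with distinct symptom profiles
rng = np.random.default_rng(1)
z = rng.integers(0, 2, 400)
probs = np.array([[0.9, 0.8, 0.1],     # phenotype A: wheeze-dominant
                  [0.2, 0.1, 0.7]])    # phenotype B: cough-dominant
X = (rng.uniform(size=(400, 3)) < probs[z]).astype(float)
pi, theta, resp = latent_class_em(X)
```

Real LCA software additionally handles model selection (number of classes) via information criteria, which is how studies like this one settle on a phenotype count.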
Abstract:
Heterogeneous materials are ubiquitous in nature and as synthetic materials. These materials provide a unique combination of desirable mechanical properties emerging from their heterogeneities at different length scales. Future structural and technological applications will require the development of advanced lightweight materials with superior strength and toughness. Cost-effective design of advanced high-performance synthetic materials by tailoring their microstructure is the challenge facing the materials design community. Prior knowledge of structure-property relationships for these materials is imperative for optimal design, so understanding such relationships for heterogeneous materials is of primary interest. Furthermore, computational burden is becoming a critical concern in several areas of heterogeneous materials design, making computationally efficient and accurate predictive tools highly essential. In the present study, we mainly focus on the mechanical behavior of soft cellular materials and of a tough biological material, the mussel byssus thread. Cellular materials exhibit microstructural heterogeneity through an interconnected network of a single material phase, whereas the mussel byssus thread comprises two distinct material phases. A robust numerical framework is developed to investigate the micromechanisms behind the macroscopic response of both of these materials. Using this framework, the effect of microstructural parameters on the stress state of cellular specimens during the split Hopkinson pressure bar test has been addressed. A Voronoi tessellation based algorithm has been developed to simulate the cellular microstructure. The micromechanisms (microinertia, microbuckling and microbending) governing the macroscopic behavior of cellular solids are investigated thoroughly with respect to various microstructural and loading parameters.
To understand the origin of the high toughness of the mussel byssus thread, a Genetic Algorithm (GA) based optimization framework has been developed. It is found that the two different material phases (collagens) of the mussel byssus thread are optimally distributed along the thread. These applications demonstrate that the presence of heterogeneity in a system demands high computational resources for simulation and modeling. Thus, a Higher Dimensional Model Representation (HDMR) based surrogate modeling concept has been proposed to reduce computational complexity. The applicability of this methodology has been demonstrated in failure envelope construction and in multiscale finite element techniques. It is observed that the surrogate-based model can capture the behavior of complex material systems with sufficient accuracy. The computational algorithms presented in this thesis will further pave the way for accurate prediction of the macroscopic deformation behavior of various classes of advanced materials from their measurable microstructural features at a reasonable computational cost.
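Voronoi-tessellation-based generation of a cellular microstructure, of the kind the thesis describes, can be sketched as follows (2-D, with an illustrative cell count; the thesis's own algorithm may differ in its boundary handling):

```python
import numpy as np
from scipy.spatial import Voronoi

def voronoi_microstructure(n_cells, seed=0):
    """Generate a 2-D Voronoi tessellation from random nuclei in the
    unit square, a common idealization of a cellular (foam-like)
    microstructure. Returns the tessellation plus the finite wall
    segments (cell edges) that would become the struts of a
    finite element model.
    """
    rng = np.random.default_rng(seed)
    points = rng.uniform(0.0, 1.0, size=(n_cells, 2))  # cell nuclei
    vor = Voronoi(points)
    # Keep only finite ridges, i.e. edges between two known vertices;
    # index -1 marks a ridge extending to infinity at the domain boundary
    walls = [vor.vertices[r] for r in vor.ridge_vertices if -1 not in r]
    return vor, walls

vor, walls = voronoi_microstructure(50)
```

In practice the infinite boundary ridges would be clipped to the specimen window, and cell-size statistics of the nuclei process can be tuned to match the measured microstructure.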