Abstract:
The concept of algorithm is fundamental in computer science, so it is crucial that students gain a deep understanding of it from the very start of their training. Having a tool that guides students through their learning can therefore be a great help. Most authors agree that, in determining the effectiveness of an algorithm visualization tool, how it is used is essential: students who engage actively with the visualization clearly outperform those who watch it passively. We therefore believe that one of the best exercises for a student is to simulate the execution of the algorithm to be learned using a visualization tool, i.e. to perform a visual simulation of that algorithm. The first part of this thesis presents the results of an in-depth investigation into the features that a tool supporting the learning of algorithms and mathematical concepts must have in order to maximize its effectiveness: the eMathTeacher set of requirements, together with a learning environment that integrates tools complying with them: GRAPHs. We have studied which qualities are essential to enhance the effectiveness of an e-learning system of this kind. This led us to the definition of the eMathTeacher concept, which has materialized in the eMathTeacher set of requirements. An e-learning tool is eMathTeacher compliant if it acts as a virtual mathematics teacher, i.e. if it is a self-assessment tool that helps students to actively and autonomously learn mathematical concepts or algorithms, correcting their mistakes and providing hints to find the right answer without giving it away explicitly. In these tools, the algorithm simulation does not continue until the user enters the correct answer.
To gather into a single environment a collection of tools that comply with the eMathTeacher requirements, we created GRAPHs, an extendible, visual simulation-based environment designed for the active and independent learning of graph algorithms and built so that simulators of different algorithms can be integrated into it. Besides options for creating and editing the graph and visualizing the changes made to it during simulation, the environment includes step-by-step correction, algorithm pseudo-code animation, pop-up questions, handling of the algorithm's data structures and the creation of an XML interaction log. Another problem we address in this work, because of its importance in the learning process, is formative assessment. The use of certain e-learning environments generates large amounts of data that must be interpreted to reach an assessment that goes beyond merely counting mistakes. This includes establishing relationships between the available data and generating linguistic descriptions that inform students about the evolution of their learning. Until now, only a human expert was able to carry out this kind of assessment. Our goal has been to create a computational model that simulates the instructor's reasoning and generates a report on learning evolution that specifies the level of attainment of each of the goals defined by the instructor. As a result of this work, the second part of this thesis presents the granular linguistic model of learning assessment, capable of modelling the assessment and automatically generating formative assessment reports. This model is a particularization of the granular linguistic model of a phenomenon (GLMP), in whose development and formalization we collaborated, based on fuzzy logic and the computational theory of perceptions.
This technique, which uses inference systems based on linguistic rules and can implement complex assessment criteria, has been applied to two cases: the criterion-based assessment of interaction logs generated by GRAPHs and of Moodle quizzes. As a consequence, expert systems assessing both types of exercises have been implemented, tested and used in the classroom. Besides the numeric grade, the systems generate natural language assessment reports on the proficiency levels achieved, using only objective data on correct and incorrect answers. In addition, two applications have been developed that can be configured to implement these expert systems. One processes the files produced by GRAPHs; the other, which can be integrated into Moodle, performs the assessment based on quiz results.
ABSTRACT
The concept of algorithm is one of the core subjects in computer science. It is extremely important, then, for students to get a good grasp of this concept from the very start of their training. In this respect, having a tool that helps and shepherds students through the process of learning this concept can make a huge difference to their instruction. Much has been written about how helpful algorithm visualization tools can be. Most authors agree that the most important part of the learning process is how students use the visualization tool. Learners who are actively involved in visualization consistently outperform other learners who view the algorithms passively. Therefore we think that one of the best exercises to learn an algorithm is for the user to simulate the algorithm execution while using a visualization tool, thus performing a visual algorithm simulation. The first part of this thesis presents the eMathTeacher set of requirements together with an eMathTeacher-compliant tool called GRAPHs.
For some years, we have been developing a theory about what the key features of an effective e-learning system for teaching mathematical concepts and algorithms are. This led to the definition of the eMathTeacher concept, which has materialized in the eMathTeacher set of requirements. An e-learning tool is eMathTeacher compliant if it works as a virtual math trainer. In other words, it has to be an on-line self-assessment tool that helps students to actively and autonomously learn math concepts or algorithms, correcting their mistakes and providing them with clues to find the right answer. In an eMathTeacher-compliant tool, algorithm simulation does not continue until the user enters the correct answer. GRAPHs is an extendible environment designed for active and independent visual simulation-based learning of graph algorithms, set up to integrate tools to help the user simulate the execution of different algorithms. Apart from the options of creating and editing the graph, and visualizing the changes made to the graph during simulation, the environment also includes step-by-step correction, algorithm pseudo-code animation, pop-up questions, data structure handling and XML-based interaction log creation features. On the other hand, assessment is a key part of any learning process. Through the use of e-learning environments huge amounts of data can be output about this process. Nevertheless, this information has to be interpreted and represented in a practical way to arrive at a sound assessment that is not confined to merely counting mistakes. This includes establishing relationships between the available data and also providing instructive linguistic descriptions about learning evolution. Additionally, formative assessment should specify the level of attainment of the learning goals defined by the instructor. Until now, only human experts were capable of making such assessments.
While facing this problem, our goal has been to create a computational model that simulates the instructor's reasoning and generates an enlightening learning evolution report in natural language. The second part of this thesis presents the granular linguistic model of learning assessment to model the assessment of the learning process and implement the automated generation of a formative assessment report. The model is a particularization of the granular linguistic model of a phenomenon (GLMP) paradigm, based on fuzzy logic and the computational theory of perceptions, to the assessment phenomenon. This technique, useful for implementing complex assessment criteria using inference systems based on linguistic rules, has been applied to two particular cases: the assessment of the interaction logs generated by GRAPHs and the criterion-based assessment of Moodle quizzes. As a consequence, several expert systems to assess different algorithm simulations and Moodle quizzes have been implemented, tested and used in the classroom. Apart from the grade, the designed expert systems also generate natural language progress reports on the achieved proficiency level, based exclusively on the objective data gathered from correct and incorrect responses. In addition, two applications, capable of being configured to implement the expert systems, have been developed. One is geared up to process the files output by GRAPHs and the other one is a Moodle plug-in set up to perform the assessment based on the quiz results.
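A GLMP-style system maps objective data to linguistic labels through fuzzy membership functions and rule-based aggregation. The following is a minimal sketch of that idea, assuming invented triangular partitions and report wording; the thesis' actual granules, rules and templates are not reproduced here.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def assess(correct, total):
    """Fuzzify a correct/total ratio into linguistic terms and report the
    strongest one, loosely in the spirit of a GLMP perception mapping."""
    ratio = correct / total
    degrees = {
        "low": tri(ratio, -0.5, 0.0, 0.5),     # assumed partitions of [0, 1]
        "medium": tri(ratio, 0.0, 0.5, 1.0),
        "high": tri(ratio, 0.5, 1.0, 1.5),
    }
    label = max(degrees, key=degrees.get)
    return (f"The level of attainment of this goal is {label} "
            f"({correct}/{total} correct steps).")

print(assess(9, 10))
```

A full GLMP would stack such perception mappings hierarchically and combine them through linguistic rules; this sketch shows only one leaf node of such a hierarchy.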
Abstract:
The use of FRP strengthening in reinforced concrete beams is increasingly common owing to its many advantages over more traditional methods. In recent years the FRP-NSM technique, which consists of embedding FRP bars in the concrete cover of a beam, has become one of the best methods for strengthening and rehabilitating reinforced concrete structures, both for its ease of installation and maintenance and for its effectiveness in increasing the load-bearing capacity of such structures. While flexural strengthening has been widely developed and studied to date, the same is not true of shear strengthening, mainly because of its great complexity. Yet more research should be devoted to this type of strengthening if the design criteria for reinforced concrete structures, which are based on avoiding shear failure because of its catastrophic consequences, are to be preserved. This lack of information and of design codes is what motivates this doctoral thesis. In this project two alternative methodologies are developed for estimating the load-bearing capacity of reinforced concrete beams shear-strengthened with the FRP-NSM technique. The first method consists of implementing an artificial neural network, trained on previous experiments, capable of adequately predicting the shear strength of beams strengthened with this method. Using the network, several studies have also been carried out to better understand the real influence of certain beam and strengthening parameters on shear strength, with the aim of achieving safer designs for this type of strengthening. An optimal network configuration requires properly discriminating among the many (geometric and material) parameters that may influence the beam's load-bearing behaviour, for which various studies and tests have been carried out.
With the second method, a design equation is developed that allows the capacity of beams shear-strengthened with FRP-NSM to be estimated in a simple way, and which could be proposed for the main design guidelines. To reach this objective, a multi-objective optimization problem is posed from the results of experimental tests carried out on reinforced concrete beams with and without FRP strengthening. The multi-objective problem is solved by means of genetic algorithms, specifically the NSGA-II algorithm, as it is better suited than classical optimization methods to problems with several objective functions. By comparing the predictions of both methods with experimental test results, the advantages and drawbacks of applying each of the two methodologies can be established. A parametric analysis is also carried out with both approaches in order to determine which parameters this type of strengthening is most sensitive to. Finally, a statistical reliability analysis of the design equations derived from the multi-objective optimization is performed. With this analysis, the load-bearing capacity of a beam shear-strengthened with FRP-NSM can be estimated within a safety margin specified a priori.
ABSTRACT
The use of externally bonded (EB) fibre-reinforced polymer (FRP) composites has gained acceptance during the last two decades in the construction engineering community, particularly in the rehabilitation of reinforced concrete (RC) structures. Currently, to increase the shear resistance of RC beams, FRP sheets are externally bonded (EB-FRP) and applied on the external side surface of the beams to be strengthened with different configurations. Of more recent application, the near-surface mounted FRP bar (NSM-FRP) method is another technique successfully used to increase the shear resistance of RC beams.
In the NSM method, FRP rods are embedded into grooves intentionally prepared in the concrete cover of the side faces of RC beams. While flexural strengthening has been widely developed and studied so far, the same does not hold for shear strengthening, mainly due to its great complexity. Nevertheless, if design criteria, which are based on avoiding shear failure and its catastrophic consequences, are to be preserved, more research should be devoted to this sort of strengthening. Accurately calculating the shear capacity of FRP shear-strengthened RC beams remains a complex challenge that has not yet been fully resolved, due to the numerous variables involved in the procedure. The objective of this Thesis is to develop methodologies to evaluate the capacity of FRP shear-strengthened RC beams, approaching the problem from a different point of view to the numerical modeling approach by using artificial intelligence techniques. With this purpose two different approaches have been developed: one concerned with the use of artificial neural networks (ANNs) and the other based on the implementation of an optimization approach developed jointly with ANNs and solved with genetic algorithms (GAs). With these approaches some of the difficulties of numerical modeling can be overcome. As an alternative to conventional numerical techniques, neural networks do not provide closed-form solutions for modeling problems but do, however, offer a complex and accurate solution based on a representative set of historical examples of the relationship. Furthermore, they can adapt solutions over time to include new data. On the other hand, as a second proposal, an optimization approach has also been developed to implement simple yet accurate shear design equations for this kind of strengthening.
This approach is developed in a multi-objective framework by considering experimental results of RC beams with and without NSM-FRP. Furthermore, the results obtained with the previous scheme based on ANNs are also used as a filter to choose the parameters to include in the design equations. Genetic algorithms are used to solve the optimization problem since they are especially suitable for solving multi-objective problems when compared to standard optimization methods. The key features of the two proposed procedures are outlined and their performance in predicting the capacity of NSM-FRP shear strengthened RC beams is evaluated by comparison with results from experimental tests and with predictions obtained using a simplified numerical model. A sensitivity study of the predictions of both models for the input parameters is also carried out.
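The multi-objective selection that genetic algorithms such as NSGA-II rely on hinges on Pareto dominance and non-dominated sorting. The following is a minimal sketch of that test, with invented (error, complexity) scores standing in for a candidate design equation's two competing objectives; it is an illustration of the principle, not the thesis' implementation.

```python
def dominates(a, b):
    """True if solution a is no worse than b in every objective and strictly
    better in at least one (minimisation convention)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Return the non-dominated solutions, i.e. the first NSGA-II front."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# (prediction error, equation complexity) for five hypothetical candidates.
candidates = [(0.10, 5), (0.08, 9), (0.20, 2), (0.08, 5), (0.15, 4)]
print(pareto_front(candidates))
```

Under NSGA-II the population is peeled into successive fronts like this one and then ranked within each front by crowding distance; the sketch keeps only the first front, which is the set of trade-off solutions a designer would choose an equation from.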
Abstract:
This project is structured in three distinct blocks whose objective is to analyse the past, present and future of project management, with a clear focus on supporting companies that plan to shift their work organization structure towards agile philosophies. Project management has accompanied humankind since the beginning of time; however, it was not until our recent history that awareness emerged of the need to establish and apply general methods when tackling engineering projects. Pioneers in this respect were figures such as Taylor, Fayol and Gantt, whose initial contributions made possible the birth of project management in the mid-twentieth century. In the succeeding decades up to the present day a considerable number of methodologies have appeared, refining previously assimilated elements and adapting to the technological and social reality of the moment. This historical journey is covered in the first block of this project. The points reviewed in the historical introduction provide the background needed to understand the main characteristics of the major families of methodologies currently implemented in the business world. It is precisely this point, the presentation and analysis of the most widespread methodologies contemporary with this project, that is addressed in the second block. There, two methodologies from the family of agile philosophies, chosen from among the most widespread in industry, are developed in detail: Scrum and eXtreme Programming (XP), together with a hybrid of both. To understand the scale of the philosophical change in project management that these methodologies represent, it is necessary to review the most widespread non-agile methodologies in current use.
To this end, methodologies such as the so-called traditional (waterfall) approach and, above all, PRINCE2 are introduced, together with a more superficial review of other notable approaches. The last block, which forms the project's conclusion, attempts to answer the questions surrounding the future adoption of agile methodologies. To that end, the most contentious points in this process are reviewed, and theoretical and practical solutions are offered to help companies in their migration towards agile organization philosophies. Since every company today should have technological support for its business development, a large part of this project has been devoted to a comparative study of the current Open Source tools. Twenty-five tools were installed and tested, and three were then selected for in-depth analysis. The pros and cons of these tools are also listed, together with improvement ideas and a sketch of how they should evolve to offer a real alternative to the commercial tools aimed at this task. Companies can use this index of software tools to decide whether they can base their migration process on an Open Source tool or whether they should turn to commercial or custom-built tools.
ABSTRACT
This project has been divided into three blocks that aim to analyse the past, present and future of project management, with a clear focus on providing support to those companies that are planning to shift their work organization structure towards agile philosophies. Project management has walked together with humanity since the beginning of time, but it was only in our recent history that awareness arose of the need to establish general methods to apply to engineering projects.
There were pioneers like Taylor, Fayol or Gantt, whose contributions made possible the birth of project management in the mid-twentieth century. In the following decades a considerable number of new methodologies appeared, improving concepts and adapting them to the technological and social reality of the moment. This historical journey is addressed in the first block of this project. The facts reviewed in the first block provide the knowledge needed to understand the key features of the most important families of methodologies that are implemented in the business world nowadays. It is precisely this point, the presentation and analysis of the most widespread methodologies, that is addressed in the second block. Two of the most widespread agile methodologies are detailed: Scrum and eXtreme Programming (XP), together with a hybrid of both. In order to understand the philosophical shift in project management performed by these methodologies, it is necessary to review the most widespread non-agile methodologies currently in use. For this purpose, methodologies like Waterfall and PRINCE2 are explained. Other non-agile methodologies are covered as well, but in less depth. The last section of this project is the conclusion; it tries to answer future questions related to the adoption of agile methodologies. To that end, the most important milestones are reviewed to provide theoretical and practical solutions and answers that help companies in this migration process toward agile methodologies. Since every company should have solid technical support for its business development, a considerable part of the time has been devoted to a comparative study of the existing Open Source tools. Twenty-five tools have been installed and tested; three were then selected for in-depth analysis.
The pros and cons of these tools have also been set out, in order to suggest a roadmap for offering a real alternative to the existing commercial tools in this business area. Companies involved in a migration towards an agile methodology can use this study of the available Open Source tools to decide whether they can base the migration process on one of them or whether they should use a commercial or tailor-made tool.
Abstract:
In 1979, Lewontin and I borrowed the architectural term “spandrel” (using the pendentives of San Marco in Venice as an example) to designate the class of forms and spaces that arise as necessary byproducts of another decision in design, and not as adaptations for direct utility in themselves. This proposal has generated a large literature featuring two critiques: (i) the terminological claim that the spandrels of San Marco are not true spandrels at all and (ii) the conceptual claim that they are adaptations and not byproducts. The features of the San Marco pendentives that we explicitly defined as spandrel-properties—their necessary number (four) and shape (roughly triangular)—are inevitable architectural byproducts, whatever the structural attributes of the pendentives themselves. The term spandrel may be extended from its particular architectural use for two-dimensional byproducts to the generality of “spaces left over,” a definition that properly includes the San Marco pendentives. Evolutionary biology needs such an explicit term for features arising as byproducts, rather than adaptations, whatever their subsequent exaptive utility. The concept of biological spandrels—including the examples here given of masculinized genitalia in female hyenas, exaptive use of an umbilicus as a brooding chamber by snails, the shoulder hump of the giant Irish deer, and several key features of human mentality—anchors the critique of overreliance upon adaptive scenarios in evolutionary explanation. Causes of historical origin must always be separated from current utilities; their conflation has seriously hampered the evolutionary analysis of form in the history of life.
Abstract:
We describe and test a Markov chain model of microsatellite evolution that can explain the different distributions of microsatellite lengths across different organisms and repeat motifs. Two key features of this model are the dependence of mutation rates on microsatellite length and a mutation process that includes both strand slippage and point mutation events. We compute the stationary distribution of allele lengths under this model and use it to fit DNA data for di-, tri-, and tetranucleotide repeats in humans, mice, fruit flies, and yeast. The best fit results lead to slippage rate estimates that are highest in mice, followed by humans, then yeast, and then fruit flies. Within each organism, the estimates are highest in di-, then tri-, and then tetranucleotide repeats. Our estimates are consistent with experimentally determined mutation rates from other studies. The results suggest that the different length distributions among organisms and repeat motifs can be explained by a simple difference in slippage rates and that selective constraints on length need not be imposed.
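The qualitative behaviour of such a model can be sketched with a toy birth-death chain in which both slippage rates grow linearly with repeat length and contraction is slightly favoured; detailed balance then gives the stationary length distribution directly. The rates and bias below are invented for illustration and are not the paper's fitted values.

```python
def stationary(n_max, bias=0.9):
    """Stationary length distribution of a birth-death chain on lengths
    1..n_max. Expansion and contraction rates are both proportional to
    length, with contraction favoured by 1/bias; detailed balance gives
    pi[n+1]/pi[n] = expand(n) / contract(n+1)."""
    pi = [1.0]
    for n in range(1, n_max):
        expand = n                 # expansion rate at length n (up to scale)
        contract = (n + 1) / bias  # contraction rate at length n + 1
        pi.append(pi[-1] * expand / contract)
    total = sum(pi)
    return [p / total for p in pi]

dist = stationary(50)
# Longer repeats are progressively rarer under the contraction bias.
print(round(dist[0], 3), round(dist[9], 4))
```

Raising the bias toward 1 fattens the tail of the distribution, which is the kind of single-parameter difference the paper invokes to explain length distributions across organisms without appealing to selection.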
Abstract:
Tissue remodeling often reflects alterations in local mechanical conditions and manifests as an integrated response among the different cell types that share, and thus cooperatively manage, an extracellular matrix. Here we examine how two different cell types, one that undergoes the stress and the other that primarily remodels the matrix, might communicate a mechanical stress by using airway cells as a representative in vitro system. Normal stress is imposed on bronchial epithelial cells in the presence of unstimulated lung fibroblasts. We show that (i) mechanical stress can be communicated from stressed to unstressed cells to elicit a remodeling response, and (ii) the integrated response of two cell types to mechanical stress mimics key features of airway remodeling seen in asthma: namely, an increase in production of fibronectin, collagen types III and V, and matrix metalloproteinase type 9 (MMP-9) (relative to tissue inhibitor of metalloproteinase-1, TIMP-1). These observations provide a paradigm to use in understanding the management of mechanical forces on the tissue level.
Abstract:
Long-lasting forms of activity-dependent synaptic plasticity involve molecular modifications that require gene expression. Here, we describe a cellular mechanism that mediates the targeting of newly synthesized gene transcripts to individual synapses, where they are locally translated. The features of this mechanism have been revealed through studies of the intracellular transport and synaptic targeting of the mRNA for a recently identified immediate early gene called activity-regulated cytoskeleton-associated protein (Arc). Arc is strongly induced by patterns of synaptic activity that also induce long-term potentiation, and Arc mRNA is then rapidly delivered into dendrites after episodes of neuronal activation. The newly synthesized Arc mRNA localizes selectively at synapses that have recently been activated, and the encoded protein is assembled into the synaptic junctional complex. The dynamics of Arc mRNA trafficking reveal key features of the mechanism through which synaptic activity can both induce gene expression and target particular mRNA transcripts to the active synapses.
Abstract:
A capillary electrophoresis method has been developed to study DNA-protein complexes by mobility-shift assay. This method is at least 100 times more sensitive than conventional gel mobility-shift procedures. Key features of the technique include the use of a neutral coated capillary, a small amount of linear polymer in the separation medium, and use of covalently dye-labeled DNA probes that can be detected with a commercially available laser-induced fluorescence monitor. The capillary method provides quantitative data in runs requiring < 20 min, from which dissociation constants are readily determined. As a test case we studied interactions of a developmentally important sea urchin embryo transcription factor, SpP3A2. As little as 2-10 × 10^6 molecules of specific SpP3A2-oligonucleotide complex were reproducibly detected, using recombinant SpP3A2, crude nuclear extract, egg lysates, and even a single sea urchin egg lysed within the capillary column.
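As a hedged illustration of how a dissociation constant follows from quantitative binding data (not the authors' actual analysis pipeline): with protein in excess, the bound fraction follows theta = [P] / (Kd + [P]), so Kd can be read off at half-saturation. The titration values below are invented.

```python
def bound_fraction(p, kd):
    """Langmuir binding isotherm: fraction of probe bound at protein
    concentration p, for dissociation constant kd (same units)."""
    return p / (kd + p)

def estimate_kd(concs, fractions):
    """Estimate Kd by linear interpolation of the titration curve at
    half-saturation (theta = 0.5); concs must be increasing."""
    points = list(zip(concs, fractions))
    for (p0, f0), (p1, f1) in zip(points, points[1:]):
        if f0 <= 0.5 <= f1:
            return p0 + (0.5 - f0) * (p1 - p0) / (f1 - f0)
    raise ValueError("half-saturation not bracketed by the titration")

concs = [1, 2, 5, 10, 20, 50]                     # nM, hypothetical
fracs = [bound_fraction(p, 8.0) for p in concs]   # simulated Kd = 8 nM
print(round(estimate_kd(concs, fracs), 2))
```

In practice one would fit the full isotherm by least squares rather than interpolate two points, but the half-saturation reading shows why quantitative band intensities translate directly into a Kd.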
Abstract:
Doctoral thesis, Climate Change and Sustainable Development Policies (Sociology), Universidade de Lisboa, Instituto de Ciências Sociais, 2016
Abstract:
Labour immigration schemes that effectively attract qualified immigrant workers are a policy priority for many governments. But what are ‘attractive’ labour immigration schemes and policies? To whom are (or should) such policies (be) attractive? In Europe, the US is often portrayed as one of the most ‘attractive’ countries of immigration – if not the most ‘attractive’. This paper aims to analyse and provide a better understanding of the elements of the US immigration system that are supposedly attractive to foreign workers, by examining key features of the current and prospective US labour immigration rules. The paper finds that ‘attractiveness’ in this policy context is a highly malleable and flexible concept: What might be ‘attractive’ to one key stakeholder might not be to another.
Abstract:
This paper focuses on the key features of EU social policy and the way it has been interpreted and seeks to identify new directions for study. EU social policy is considered along two main dimensions: its content and hallmark features and the main approaches to conceptualizing and theorizing it. Rather than the classic negative depiction of EU social policy, this piece suggests that it is more significant than usually allowed, not least because the empirical and theoretical lenses which have been applied to it were developed for other purposes. The implication is that developments in EU social policy are often overlooked, not least in how the EU has carved out a role for itself by constantly framing and reframing discourses relevant to social policy and social problems in an attempt to both influence how social actors at all levels of governance approach policy and secure their acceptance of its role in social policy. Therefore analyzing EU social policy outside of the traditional frames reveals interesting and significant developments especially around innovation in social policy and the attempt to legitimate the EU as a social policy actor.
Abstract:
Eco-innovation has been identified as one of the key drivers of change that need to be harnessed for a sustainable future. Given the complexity of eco-innovation as a concept, there are various challenges to measuring its progress. This paper briefly explores the evolution of the concept of eco-innovation and emphasises its role in the EU 2020 strategy. It then provides an overview of the different measurement approaches and challenges associated with identifying and using indicators for measuring progress in eco-innovation. Within this context, the paper describes the added value and key features of the www.measuring-progress.eu web tool, which aims to improve the way in which policy-makers and others involved in the policy process can access, understand and use indicators for green economy and eco-innovation. The web tool was developed on the basis of a systematic overview by the NETGREEN research team of the large and fragmented body of work in the field of green economy indicators. The paper concludes with a number of messages for policy-makers in the field of the green economy.
Abstract:
In summer 2006, integrated geological, geochemical, hydrological and hydrochemical studies were carried out in the relict anoxic Mogil'noe Lake (down to 16 m depth) on Kil'din Island in the Barents Sea. The chemical and grain-size compositions of bottom sediments from the lake (a permanently anoxic basin) and from the Baltic Sea deeps (periodically anoxic basins) were compared. The vertical position of the hydrogen sulfide layer boundary in the lake (9-11 m depth) has remained practically unchanged since 1974. Concentrations of suspended matter in the lake in June and July 2006 were close to summer concentrations in seawater of the open Baltic Sea. Muds from Mogil'noe Lake, compared to those of the Baltic Sea deeps, are characterized by a fluid, flaky consistency and by pronounced admixtures of sandy and silty fractions (probably of aeolian origin). The lacustrine mud contains abundant plant remains; iron sulfides and vivianite were also found. Concentrations of the 22 elements determined in the lacustrine bottom sediments were at the same levels as those found here 33 years ago, and were also close to those in the corresponding grain-size types of bottom sediments in the Baltic Sea. Low C_org/N values (average 5.0) in the muds of Mogil'noe Lake, compared with those of the Baltic Sea deeps (average 10), indicate a considerable planktogenic component in the organic matter of the lacustrine muds. No indications of anthropogenic contamination of the lacustrine bottom sediments with toxic metals were revealed.
Abstract:
This paper explores young people's (9 to 15 years old) early socialisation into sport. We draw on data from an 18-month-long ethnography of the junior section of an athletics club in England, using field notes, interviews and a psychometric questionnaire. We begin by noting a trend towards increasing numbers of younger children participating in adult-organised, community-based sport. Within this context, we investigate the extent to which Siedentop's [(1995) Junior Sport and the evolution of sport cultures, Keynote presentation to the Junior Sport Forum, Auckland, New Zealand] three main goals for young people's participation in sport, i.e. the educative, public health and elite development, are met in specific, local junior sport settings such as Forest Athletics Club (FAC). We report that most of the young people participating in the Introductory Groups at FAC begin their socialisation into sport by 'sampling' a range of sports and other activities that are available to them. We note the key features of the sampling phase for these young people, including their involvement in sports and other activities in addition to athletics, their reasons for participation, the place of competition and the importance of friendship. We report that FAC created a climate for the Samplers, intentionally or not, conducive to the development of Siedentop's educative goal, and to a lesser extent the public health and elite development goals. In concluding, we note the implications of the study for community-based programmes run by clubs.