921 results for Bottom-up processes
Abstract:
An event-related potential (ERP) component known as the N2pc is associated with the deployment of visuospatial attention. We examined the modulation of the N2pc as a function of the presence or absence of a target, the physical separation of two salient items, and their similarity. The stimuli were lines varying in orientation and colour; salient items were blue and non-salient items grey. The results show an increase in N2pc amplitude with the distance separating two salient items, as well as an increase in N2pc amplitude when the salient items had more similar orientations. No interaction between these two factors was observed. A significant interaction was, however, observed between target presence/absence and the similarity of the distractor to the searched-for target. These results reveal a dissociation between the activity related to the distance between salient items and the activity related to distractor-target similarity, since they cannot be explained by a single mechanism. The results therefore suggest that a combination of bottom-up and top-down processing modulates the N2pc component.
Abstract:
Climatic as well as non-climatic factors must be taken into account in the process of adapting agriculture to climate change and variability (CCV). This paradigm shift places the human agent at the centre of the adaptation process, which can lead to maladaptation. Following the climate change debates that drew scientific and public attention in the 1980s and 1990s, Canadian agriculture became a focal point of several pioneering studies on CCV, a phenomenon chiefly of anthropogenic origin. To cope with CCV, not only mitigation but also adaptation matters; where adaptation is concerned, it is climate variability, rather than mean temperature increases alone, that is of interest. The overall objective of this master's thesis is to improve our understanding of the processes of adaptation and of building adaptive capacity at the farm and farming-community levels through a bottom-up process, that is, using a co-construction approach (which can itself be considered an adaptation strategy), in order to develop management and planning tools appropriate to stakeholders and thereby increase the adaptive capacity of the farming community. To this end, a grounded theory approach is used. The results consist of five interdependent categories of broadened codes, conceptually distinct and at a higher level of abstraction. The Haut-Richelieu RCM was chosen as the case study because of several of its agricultural characteristics, in addition to its favourable biophysical conditions. Fifteen interviews were conducted with farmers.
The results show that, while some farmers recognized both the positive and negative sides of CCV, others are very optimistic about it, as if they see only the positive side; hence the need to consider both sides of CCV. Some uncertainty also remains around CCV, stemming from farmers' misinformation and desensitization, chiefly with regard to the causes of CCV and the nature of climatic events. Furthermore, given that adaptation has several characteristics and types, there are many kinds of adaptation involving both private actors and government. Finally, adaptation strategies must be developed jointly by farmers in concert with other actors, starting with agronomists, since agronomists serve as an important relay between farmers and other stakeholders such as public institutions and private companies.
Abstract:
While interest in the integration processes of immigrants and ethnic minorities is growing rapidly among European researchers, the factors that explain the different forms of civic and political participation deserve deeper examination. Building on the immigration literature, this study addresses the following research question: how can we explain variation in the forms of civic and political participation of immigrant activists at the local level? To answer this question, the study identifies the forms of participation of immigrant activists in four Italian cities and examines the discourses and practices of the many actors involved in the field of immigration within a national context of growing hostility. This thesis argues that, to understand different forms of participation, it is important to consider not only the state and institutional actors but also non-institutional actors, and to examine how the latter shape both opportunities for and restrictions on participation. The research further examines conventional and non-conventional channels in the four Italian cities and studies immigrant activists as relevant political actors in their own right, capable of mobilizing and of influencing participation through their interactions and alliances with actors of the host society. The research yields three findings. The first shows that the integration approaches adopted by these actors matter.
The study identified three integration approaches: 1) "welfarist", based on the idea that immigrants are in need and should therefore receive services; 2) intercultural, based on the idea that immigrants are future citizens and that integration is reciprocal; and 3) promotion of political rights, based on the idea that immigrants have fundamental political rights, which encourages the opening of channels of political participation, especially for immigrants deprived of the local vote. The empirical analysis shows that, while the welfarist approach does not encourage participation because it casts immigrants as passive actors, the other two approaches affect the forms of civic and political participation respectively. The second finding underlines the role of left-wing actors. In particular, the study shows that the actors who open channels for participation are not only those of the moderate left, such as local authorities, political parties, and trade unions, but also radical and non-institutional left-wing groups. Each left-wing actor understands and acts differently with respect to immigration and participation, and this shapes how immigrant activists mobilize. The third finding highlights the role of immigrant activists' perception of opportunities and the way they appropriate the discourses and practices of left-wing actors. This work shows that the opening of channels is made possible by the engagement of people of immigrant origin who act on the opportunities offered to them, build alliances with the left, and challenge the discourses and practices of local actors.
Abstract:
Urban ecology is a young research field that seeks to understand the structure and patterns of communities and ecosystems located in urban landscapes. Small waterbodies are known to be aquatic ecosystems that can hold considerable biodiversity across several taxonomic groups (birds, amphibians, macroinvertebrates), which makes them interesting ecosystems for conservation studies. However, the biodiversity of zooplankton, a central element of aquatic food webs, is not fully known for urban waterbodies and should be better described and understood. This study assessed the biodiversity patterns of zooplankton communities in urban waterbodies on the Island of Montreal and their sources of variation. Suggestions for biodiversity assessment and conservation are also discussed. Zooplankton biodiversity in urban waterbodies turned out to be quite high, with cladocerans and rotifers showing the largest contributions to gamma and beta diversity. Across waterbodies, there was a negative correlation between the beta-diversity contributions of cladocerans and rotifers. Within each waterbody, the macrophyte-colonized littoral zone proved to be an important habitat for zooplankton biodiversity, contributing substantially to taxon richness, often with a different species composition. Zooplankton communities responded to bottom-up and top-down factors, but also to maintenance practices, as draining the waterbodies in winter affects the composition of zooplankton communities. The cladoceran communities of these waterbodies held varying amounts of phylogenetic diversity, making it possible to rank sites in order to prioritize them for conservation with respect to phylogenetic diversity.
The choice of sites to preserve in order to maximize phylogenetic diversity should be made carefully, so as to avoid sub-optimal choices. However, for taxa such as cladocerans, whose phylogenetic relationships remain difficult to establish, placing absolute confidence in a single tree is a risky procedure. Incorporating phylogenetic uncertainty showed that, once it is taken into account, several potential differences in phylogenetic diversity are no longer supported. Community composition patterns differed among waterbodies, months, and sampling zones; given that the interactions among these factors are significant, all of them should be considered. Urbanization did not appear to select for a single type of feeding-group composition, since communities could shift among assemblages of different feeding types. Environmental variables, especially macrophyte cover of the waterbody, were important factors for zooplankton biodiversity, affecting the species richness of various taxonomic and feeding groups. These variables also affected community composition, but to a lesser extent, being modest explanatory variables, which indicates the need to consider other processes.
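The beta-diversity contribution analyses described in the abstract above are commonly computed as local and species contributions to beta diversity (LCBD/SCBD) from a site-by-species community matrix, following the variance-partitioning approach of Legendre & De Cáceres. A minimal sketch under that assumption, with a made-up community matrix, not the study's data:

```python
import numpy as np

def beta_diversity_contributions(Y):
    """Total beta diversity as the variance of a Hellinger-transformed
    site-by-species matrix Y, partitioned into local contributions
    (LCBD, one per site) and species contributions (SCBD, one per species)."""
    Y = np.asarray(Y, dtype=float)
    # Hellinger transformation: square root of relative abundances per site
    H = np.sqrt(Y / Y.sum(axis=1, keepdims=True))
    # Squared deviations from each species' mean across sites
    S = (H - H.mean(axis=0)) ** 2
    SS_total = S.sum()                   # total beta diversity (sum of squares)
    lcbd = S.sum(axis=1) / SS_total      # row sums: per-site shares, sum to 1
    scbd = S.sum(axis=0) / SS_total      # column sums: per-species shares, sum to 1
    return lcbd, scbd

# Hypothetical community matrix: 4 ponds x 3 taxa (abundance counts)
Y = [[10, 0, 5],
     [8, 2, 4],
     [0, 12, 1],
     [3, 3, 3]]
lcbd, scbd = beta_diversity_contributions(Y)
```

Sites with unusual composition (here the third pond) receive large LCBD values, which is what makes the measure usable for ranking sites for conservation priority.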
Abstract:
In today's semiconductor and MEMS technologies, photolithography is the workhorse for the fabrication of functional devices. The conventional route of microstructuring (the so-called top-down approach) starts with photolithography, followed by patterning of the structures by etching, especially dry etching. The demand for smaller and hence faster devices has driven feature sizes down to the range of a few nanometers. Producing devices at this scale, however, requires photolithography equipment that must overcome the diffraction limit. New photolithography techniques have therefore been developed recently, but they are rather expensive and restricted to planar surfaces. More recently an alternative route has been presented, the so-called bottom-up approach, in which functional devices are built up from single atoms or molecules. This has opened a new field, nanotechnology, which deals with structures of dimensions of 1 - 100 nm and which, through its integral part, self-assembly, has the potential to replace conventional photolithography. However, this technique requires additional, specialized equipment and is therefore not yet widely applicable. This work presents a general scheme for the fabrication of silicon and silicon dioxide structures with lateral dimensions of less than 100 nm that avoids high-resolution photolithography processes. For the self-aligned formation of extremely small openings in silicon dioxide layers at depth-sharpened surface structures, the angle-dependent etching rate distribution of silicon dioxide under plasma etching with a fluorocarbon gas (CHF3) was exploited. Subsequent anisotropic plasma etching of the silicon substrate material through the perforated silicon dioxide masking layer results in high-aspect-ratio trenches of approximately the same lateral dimensions.
The trench width can be reduced and precisely adjusted between 0 and 200 nm by thermal oxidation of the silicon structures, owing to the volume expansion of silicon during oxidation. On this basis, a technology for the fabrication of SNOM calibration standards is presented. Additionally, the trenches so formed were used as templates for CVD deposition of diamond, resulting in high-aspect-ratio diamond knives. A lithography-free method for producing periodic and non-periodic surface structures, based on the angular dependence of the etching rate, is also presented. It combines the self-assembly of masking particles with conventional plasma etching techniques known from microelectromechanical systems technology. The method is generally applicable to bulk as well as layered materials. In this work, layers of glass spheres of different diameters were assembled on the sample surface to form a mask against plasma etching. Silicon surface structures with a periodicity of 500 nm and feature dimensions of 20 nm were produced in this way. Thermal oxidation of the structured silicon substrate makes it possible to vary the fill factor of the periodic structure, owing to the volume expansion during oxidation, and also to define silicon dioxide surface structures by selective plasma etching. Similar structures can be obtained simply by structuring silicon dioxide layers on silicon. The method offers a simple route for bridging nano- and microtechnology and, moreover, an uncomplicated way to fabricate photonic crystals.
Abstract:
The ongoing growth of the World Wide Web, catalyzed by the increasing possibility of ubiquitous access from a variety of devices, continues to strengthen its role as our prevalent information and communication medium. However, although tools like search engines facilitate retrieval, the task of actually making sense of Web content is still largely left to human interpretation. The vision of supporting both humans and machines in such knowledge-based activities led to the development of different systems that allow Web resources to be structured by metadata annotations. Interestingly, two major approaches that have gained considerable attention address the problem from nearly opposite directions: on the one hand, the Semantic Web proposes to formalize the knowledge within a particular domain by the "top-down" approach of defining ontologies; on the other hand, Social Annotation Systems, as part of the so-called Web 2.0 movement, implement a "bottom-up" style of categorization using arbitrary keywords. Experience as well as research into the characteristics of both kinds of system has shown that their strengths and weaknesses tend to be inverse: while Social Annotation suffers from problems such as ambiguity or lack of precision, ontologies were designed precisely to eliminate those; ontologies, in turn, suffer from a knowledge acquisition bottleneck, which the large user populations of Social Annotation Systems successfully overcome. Rather than being regarded as competing paradigms, the obvious potential synergies of a combination motivated approaches to "bridge the gap" between the two. These were fostered by evidence of emergent semantics, i.e., the self-organized evolution of implicit conceptual structures, within Social Annotation data.
While several techniques to exploit the emergent patterns have been proposed, a systematic analysis, especially with regard to paradigms from the field of ontology learning, is still largely missing. This includes a deeper understanding of the circumstances that affect the evolution processes. This work aims to address this gap by providing an in-depth study of methods and influencing factors for capturing emergent semantics from Social Annotation Systems. We focus on the acquisition of lexical semantics from the underlying networks of keywords, users, and resources. Structured along different ontology learning tasks, we use a methodology of semantic grounding to characterize and evaluate the semantic relations captured by different methods. In all cases, our studies are based on datasets from several Social Annotation Systems. Specifically, we first analyze semantic relatedness among keywords and identify measures which detect different notions of relatedness. These constitute the input to concept learning algorithms, which then focus on the discovery of synonymous and ambiguous keywords; here, we assess the usefulness of various clustering techniques. As a prerequisite for inducing hierarchical relationships, our next step is to study measures which quantify the level of generality of a particular keyword. We find that comparatively simple measures can approximate the generality information encoded in reference taxonomies. These insights inform the final task, namely the creation of concept hierarchies, for which generality-based algorithms exhibit advantages over clustering approaches. To complement the identification of suitable methods for capturing semantic structures, we next analyze several factors which influence their emergence. Empirical evidence is provided that the amount of available data plays a crucial role in determining keyword meanings.
From a different perspective, we examine pragmatic aspects by considering different annotation patterns among users. Based on a broad distinction between "categorizers" and "describers", we find that the latter produce more accurate results. This suggests a causal link between pragmatic and semantic aspects of keyword annotation. As a special kind of usage pattern, we then look at system abuse and spam. While observing a mixed picture, we suggest that spammers be judged case by case rather than disregarded as a matter of principle. Finally, we discuss a set of applications which operationalize the results of our studies to enhance both Social Annotation and semantic systems. These comprise, on the one hand, tools which foster the emergence of semantics and, on the other hand, applications which exploit the socially induced relations to improve, e.g., searching, browsing, or user profiling facilities. In summary, the contributions of this work highlight viable methods and crucial aspects for designing enhanced knowledge-based services for a Social Semantic Web.
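The two kinds of measures discussed in this abstract, keyword relatedness derived from annotation co-occurrence and simple generality proxies, can be illustrated with a small sketch. The data and the particular measures (cosine over tag-resource vectors, resource coverage as generality) are illustrative assumptions, not the thesis's exact formulations:

```python
from collections import Counter
from math import sqrt

# Hypothetical annotations: (user, resource, tag) triples
posts = [
    ("u1", "r1", "python"), ("u1", "r1", "programming"),
    ("u2", "r1", "python"), ("u2", "r2", "programming"),
    ("u2", "r2", "java"),   ("u3", "r3", "music"),
    ("u3", "r3", "jazz"),   ("u1", "r2", "programming"),
]

# Tag-resource co-occurrence vectors: tag -> Counter of resources it annotates
vectors = {}
for _, resource, tag in posts:
    vectors.setdefault(tag, Counter())[resource] += 1

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Relatedness: tags attached to the same resources score high
rel = cosine(vectors["python"], vectors["programming"])

# A simple generality proxy: how many distinct resources a tag covers
generality = {tag: len(v) for tag, v in vectors.items()}
```

Broad tags spread over many resources score high on the generality proxy, which is the kind of "comparatively simple measure" the abstract says can approximate taxonomic generality; relatedness scores like `rel` would feed the clustering-based concept learning step.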
Abstract:
Multisensory integration involves bottom-up as well as top-down processes. We investigated the influence of top-down control on the neural responses to multisensory stimulation using EEG recording and time-frequency analyses. Participants were stimulated at the index finger or thumb of the left hand, using tactile vibrators mounted on a foam cube. Simultaneously they received a visual distractor from a light-emitting diode adjacent to the active vibrator (spatially congruent trial) or adjacent to the inactive vibrator (spatially incongruent trial). The task was to respond to the elevation of the tactile stimulus (upper or lower), while ignoring the simultaneous visual distractor. To manipulate top-down control over this multisensory stimulation, the proportion of spatially congruent (vs. incongruent) trials was varied across blocks. Our results reveal that the behavioral cost of responding to incongruent compared with congruent trials (i.e., the crossmodal congruency effect) was modulated by the proportion of congruent trials. Most importantly, the EEG gamma band response and the gamma-theta coupling were also affected by this modulation of top-down control, whereas the late theta band response related to the congruency effect was not. These findings suggest that the gamma band response is more than a marker of multisensory binding, being also sensitive to the correspondence between expected and actual multisensory stimulation. By contrast, the theta band response was affected by congruency but appears to be largely immune to stimulation expectancy.
Abstract:
Between 1972 and 2001, the English late-modernist poet Roy Fisher provided the text for nine separate artist's books produced by Ron King at the Circle Press. Taken together, as Andrew Lambirth has written, the Fisher-King collaborations represent a sustained investigation of the various ways in which text and image can be integrated, breaking the mould of the codex or folio edition, and turning the book into a sculptural object. From the three-dimensional pop-up designs of Bluebeard's Castle (1973), each representing a part of the edifice (the portcullis, the armoury and so on), to ‘alphabet books’ such as The Half-Year Letters (1983), held in an ingenious french-folded concertina which can be stretched to over a metre long or compacted to a pocketbook, the project of these art books is to complicate their own bibliographic codes, and rethink what a book can be. Their folds and reduplications give a material form to the processes by which meanings are produced: from the discovery, in Top Down, Bottom Up (1990), of how to draw on both sides of the page at the same time, to the developments of The Left-Handed Punch (1987) and Anansi Company (1992), where the book becomes first a four-dimensional theatre space, in which a new version of Punch and Judy is played out by twelve articulated puppets, and then a location for characters that are self-contained and removable, in the form of thirteen hand-made wire and card rod-puppets. Finally, in Tabernacle (2001), a seven-drawer black wooden cabinet that stands foursquare like a sculpture (and sells to galleries and collectors for over three thousand pounds), the conception of the book and the material history of print are fully undone and reconstituted. 
This paper analyses how the King-Fisher art books work out their radically material poetics of the book; how their emphasis on collaboration, between artist and poet, image and text, and also book and reader – the construction of meaning becoming a co-implicated process – continuously challenges hierarchies and fixities in our conception of authorship; and how they re-think the status of poetic text and the construction of the book as material object.
Abstract:
Market liberalization in emerging-market economies and the entry of multinational firms spur significant changes to the industry/institutional environment faced by domestic firms. Prior studies have described how such changes tend to be disruptive to the relatively backward domestic firms, and negatively affect their performance and survival prospects. In this paper, we study how domestic supplier firms may adapt and continue to perform, as market liberalization progresses, through catch-up strategies aimed at integrating with the industry's global value chain. Drawing on internalization theory and the literatures on upgrading and catch-up processes, learning and relational networks, we hypothesize that, for continued performance, domestic supplier firms need to adapt their strategies from catching up initially through technology licensing/collaborations and joint ventures with multinational enterprises (MNEs) to also developing strong customer relationships with downstream firms (especially MNEs). Further, we propose that successful catch-up through these two strategies lays the foundation for a strategy of knowledge creation during the integration of domestic industry with the global value chain. Our analysis of data from the auto components industry in India during the period 1992–2002, that is, the decade since liberalization began in 1991, offers support for our hypotheses.
Abstract:
Black carbon aerosol plays a unique and important role in Earth’s climate system. Black carbon is a type of carbonaceous material with a unique combination of physical properties. This assessment provides an evaluation of black-carbon climate forcing that is comprehensive in its inclusion of all known and relevant processes and that is quantitative in providing best estimates and uncertainties of the main forcing terms: direct solar absorption; influence on liquid, mixed-phase, and ice clouds; and deposition on snow and ice. These effects are calculated with climate models, but when possible, they are evaluated with both microphysical measurements and field observations. Predominant sources are combustion related, namely, fossil fuels for transportation, solid fuels for industrial and residential uses, and open burning of biomass. Total global emissions of black carbon using bottom-up inventory methods are 7500 Gg yr⁻¹ in the year 2000 with an uncertainty range of 2000 to 29000 Gg yr⁻¹. However, global atmospheric absorption attributable to black carbon is too low in many models and should be increased by a factor of almost 3. After this scaling, the best estimate for the industrial-era (1750 to 2005) direct radiative forcing of atmospheric black carbon is +0.71 W m⁻² with 90% uncertainty bounds of (+0.08, +1.27) W m⁻². Total direct forcing by all black carbon sources, without subtracting the preindustrial background, is estimated as +0.88 (+0.17, +1.48) W m⁻². Direct radiative forcing alone does not capture important rapid adjustment mechanisms. A framework is described and used for quantifying climate forcings, including rapid adjustments. The best estimate of industrial-era climate forcing of black carbon through all forcing mechanisms, including clouds and cryosphere forcing, is +1.1 W m⁻² with 90% uncertainty bounds of +0.17 to +2.1 W m⁻².
Thus, there is a very high probability that black carbon emissions, independent of co-emitted species, have a positive forcing and warm the climate. We estimate that black carbon, with a total climate forcing of +1.1 W m⁻², is the second most important human emission in terms of its climate forcing in the present-day atmosphere; only carbon dioxide is estimated to have a greater forcing. Sources that emit black carbon also emit other short-lived species that may either cool or warm climate. Climate forcings from co-emitted species are estimated and used in the framework described herein. When the principal effects of short-lived co-emissions, including cooling agents such as sulfur dioxide, are included in net forcing, energy-related sources (fossil fuel and biofuel) have an industrial-era climate forcing of +0.22 (−0.50 to +1.08) W m⁻² during the first year after emission. For a few of these sources, such as diesel engines and possibly residential biofuels, warming is strong enough that eliminating all short-lived emissions from these sources would reduce net climate forcing (i.e., produce cooling). When open burning emissions, which emit high levels of organic matter, are included in the total, the best estimate of net industrial-era climate forcing by all short-lived species from black-carbon-rich sources becomes slightly negative (−0.06 W m⁻² with 90% uncertainty bounds of −1.45 to +1.29 W m⁻²). The uncertainties in net climate forcing from black-carbon-rich sources are substantial, largely due to lack of knowledge about cloud interactions with both black carbon and co-emitted organic carbon. In prioritizing potential black-carbon mitigation actions, non-science factors, such as technical feasibility, costs, policy design, and implementation feasibility, play important roles. The major sources of black carbon are presently at different stages with regard to the feasibility of near-term mitigation.
This assessment, by evaluating the large number and complexity of the associated physical and radiative processes in black-carbon climate forcing, sets a baseline from which to improve future climate forcing estimates.
Abstract:
During the last 30 years, significant debate has taken place regarding multilevel research. However, the extent to which multilevel research is overtly practiced remains to be examined. This article analyzes 10 years of organizational research within a multilevel framework (from 2001 to 2011). The goals of this article are (a) to understand what has been done, during this decade, in the field of organizational multilevel research and (b) to suggest new arenas of research for the next decade. A total of 132 articles were selected for analysis through ISI Web of Knowledge. Through a broad-based literature review, results suggest that there is an equilibrium between the amount of empirical and conceptual papers on multilevel research, with most studies addressing the cross-level dynamics between teams and individuals. In addition, this study found that time still has little presence in organizational multilevel research. Implications, limitations, and future directions are addressed at the end. Organizations are made of interacting layers. That is, between layers (such as divisions, departments, teams, and individuals) there is often some degree of interdependence that leads to bottom-up and top-down influence mechanisms. Teams and organizations are contexts for the development of individual cognitions, attitudes, and behaviors (top-down effects; Kozlowski & Klein, 2000). Conversely, individual cognitions, attitudes, and behaviors can also influence the functioning and outcomes of teams and organizations (bottom-up effects; Arrow, McGrath, & Berdahl, 2000). For example, an organization's reward system may influence employees' intention to quit and the presence or absence of extra-role behaviors. At the same time, many studies have shown the importance of bottom-up emergent processes that yield higher level phenomena (Bashshur, Hernández, & González-Romá, 2011; Katz-Navon & Erez, 2005; Marques-Quinteiro, Curral, Passos, & Lewis, in press).
For example, the affectivity of individual employees may influence their team's interactions and outcomes (Costa, Passos, & Bakker, 2012). Several authors agree that organizations must be understood as multilevel systems, meaning that adopting a multilevel perspective is fundamental to understanding real-world phenomena (Kozlowski & Klein, 2000). However, whether this agreement is reflected in the actual practice of multilevel research is less clear. In fact, how much is known about the quantity and quality of multilevel research conducted in the last decade? The aim of this study is to compare what has been proposed theoretically, concerning the importance of multilevel research, with what has actually been empirically studied and published. First, this article outlines a review of multilevel theory, followed by what has been theoretically "put forward" by researchers. Second, this article presents what has actually been "practiced", based on the results of a review of multilevel studies published from 2001 to 2011 in business and management journals. Finally, some barriers and challenges to true multilevel research are suggested. This study contributes to multilevel research by describing the last 10 years of research: it quantitatively depicts the type of articles being written and where the majority of empirical and conceptual publications related to multilevel thinking can be found.
Abstract:
This thesis seeks to contextualize the theoretical debate between the implementation of the Federal Government's public education policy for distance learning and the legal foundations for its enforcement, in order to raise questions and comments on the topic. Its importance lies in providing scientific input that can give the academy, particularly UFRN, and society elements with which to question and rethink the complex relationship between socio-economic and geographic conditions and access to higher education. It consists of a descriptive study of the institutionalization of distance education at UFRN as a mechanism for expanding access to higher education. To that end, the research seeks to understand whether the distance undergraduate courses offered through the UAB system and implemented at UFRN promote expanded access to higher education, since it is during implementation that rules, routines, and social processes are converted from intentions into action. The discussion is framed by two opposing implementation models: top-down and bottom-up. It is worth noting that the PNE and PDE documents and the UAB programs reflect positively on improving the educational level of the country's population. This is a qualitative study, employing bibliographic research, document analysis, and a field study, in which four interviews were conducted in 2010 with the SEDIS/UAB management team at UFRN. The data were analyzed using document analysis and content analysis techniques. The results show that the implementation of distance education at UFRN is in progress.
According to our results, the research objective was achieved, but there is a need to rethink the infrastructure of the support centres (polos), the structure of the academic calendar, and the management of SEDIS at UFRN, which must be redesigned with regard to expanding existing places, offering new courses, and the Secretariat’s capacity to sustain the undergraduate programmes that the Federal Government assigns to the institution. Dropout levels also remain a challenge for this teaching model. In this context, we conclude that the greatest contribution of UAB, and consequently of UFRN, through its distance undergraduate courses (teaching degrees in Mathematics, Physics, Chemistry, Geography and Biological Sciences, as well as bachelor’s degrees in Business Administration and Public Administration) is the increase in the number of places and in the accessibility of higher education to a population previously deprived of it.
Resumo:
Bottom-up methods for obtaining nanocrystals usually yield metastable phases, even in processes carried out at room temperature or under soft annealing conditions; stable phases, often associated with anisotropic shapes, are obtained in only a few special cases. In this paper we report the synthesis of two well-studied oxides, titanium oxide and zirconium oxide, in the nanometric range by a novel route based on the decomposition of peroxide complexes of the two metals under soft hydrothermal conditions, obtaining both metastable and stable phases in each case through phase transformation. High-resolution transmission electron microscopy analysis reveals defects typical of growth by the oriented-attachment mechanism in the stable crystals. The results suggest that this mechanism is associated with the phase transformation of these structures.
Resumo:
Background and Aims: The protocarnivorous plant Paepalanthus bromelioides (Eriocaulaceae) resembles bromeliads in having a rosette-like structure that allows rainwater to accumulate in the leaf axils (i.e. phytotelmata). Although the rosettes of P. bromelioides are commonly inhabited by predators (e.g. spiders), the plant’s roots are wrapped by a cylindrical termite mound that grows beneath the rosette. In this study it is predicted that these plants derive nutrients both from recycling processes carried out by termites and from predation events that take place inside the rosette, and that bacteria living in the phytotelmata can accelerate the cycling of predator-derived nutrients.
Methods: The predictions were tested by surveying plants and animals and by performing field experiments in rocky fields at Serra do Cipó, Brazil, using natural-abundance and enriched 15N isotopes. Laboratory bioassays were also conducted to test the proteolytic activity of bacteria from P. bromelioides rosettes.
Key Results: Analyses of 15N at natural abundance showed that the isotopic signature of P. bromelioides is similar to that of carnivorous plants and higher than that of non-carnivorous plants in the study area. Linear mixing models showed that predatory activities on the rosettes (i.e. spider faeces and prey carcasses) resulted in an overall nitrogen contribution of 26.5 % (a top-down flux). Although no nitrogen flux was detected from termites to plants via decomposition of labelled cardboard, the natural-abundance 15N data indicated that 67 % of the nitrogen in P. bromelioides is derived from termites (a bottom-up flux). Bacteria did not affect nutrient cycling or nitrogen uptake from prey carcasses and spider faeces.
Conclusions: The results suggest that P. bromelioides derives nitrogen from both associated predators and termites, despite differences in nitrogen-cycling velocity, which appears to be higher for nitrogen derived from predators (via leaves) than for nitrogen derived from termites (via roots). This is the first study to demonstrate partitioning effects from multiple partners in a digestion-based mutualism. Although most of its nitrogen is absorbed through the roots (via termites), P. bromelioides has all the attributes necessary to be considered a carnivorous plant in the context of digestive mutualism. © 2012 The Author. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved.
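The source-partitioning logic behind the linear mixing models mentioned above can be sketched in a few lines. The following is a minimal illustration of a generic two-source isotope mixing model, not the authors’ actual analysis; the function name and all numeric values are hypothetical.

```python
# Illustrative sketch of a two-source linear mixing model, of the kind
# used to partition nitrogen between two end-member sources (e.g.
# predator-derived vs. termite-derived nitrogen) from d15N signatures.
# All names and numbers here are hypothetical, not from the study.

def two_source_mixing(delta_mix: float, delta_a: float, delta_b: float) -> float:
    """Return the fraction of the mixture derived from source A,
    given the isotopic signature of the mixture and of the two
    end-member sources A and B."""
    if delta_a == delta_b:
        raise ValueError("end-member signatures must differ")
    f_a = (delta_mix - delta_b) / (delta_a - delta_b)
    # Clamp to [0, 1]: a signature outside the end-member range
    # cannot be explained by a two-source mix.
    return min(max(f_a, 0.0), 1.0)

# Hypothetical example: tissue at 4.0 per mil, source A at 8.0 per mil,
# source B at 2.0 per mil -> one third of the nitrogen comes from A.
fraction_a = two_source_mixing(4.0, 8.0, 2.0)
print(round(fraction_a, 3))  # -> 0.333
```

Real analyses of this kind typically also propagate uncertainty in the end-member signatures and apply trophic fractionation corrections, which this sketch omits.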
Resumo:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)