873 results for Ex-convicts, Employment of
Abstract:
The choice of ethanol (C2H5OH) as the carbon source in the Chemical Vapor Deposition (CVD) of graphene on copper foils is an attractive alternative to the commonly used hydrocarbons, such as methane (CH4) [1]. Ethanol, a safe, low-cost and easy-to-handle liquid precursor, offers fast and efficient growth kinetics, with fully-formed graphene films synthesised in just a few seconds [2]. In previous studies of graphene growth from ethanol, various research groups explored temperature ranges lower than the 1000 °C usually reported for methane-assisted CVD; in particular, the 650–850 °C range and 900 °C were investigated, for 5 and 30 min growth times respectively [3, 4]. Recently, our group reported the growth of highly crystalline, few-layer graphene by ethanol CVD in hydrogen flow (1–100 sccm) at high temperatures (1000–1070 °C) using growth times typical of CH4-assisted synthesis (10–30 min) [5]. Synthesis times between 20 and 60 s under the same conditions were also explored. In this fast-growth regime we demonstrated that fully-formed graphene films can be grown by exposing copper foils to a low partial pressure of ethanol (up to 2 Pa) in just 20 s [6], and we proposed that the rapid growth is related to an increase of the Cu catalyst efficiency due to the weak oxidizing nature of ethanol. Thus, the employment of such a liquid precursor in small concentrations, together with a reduced growth time and very low pressure, leads to highly efficient graphene synthesis. In this way, complete coverage of the copper catalyst surface with high spatial uniformity can be obtained in considerably less time than when using methane.
Abstract:
One-dimensional (1D) TiO2 nanostructures are highly desirable because they offer attractive properties and features, such as high electron mobility, quantum confinement effects, and high specific surface area. Herein, 1D mesoporous TiO2 nanofibres were prepared by electrospinning to assess their potential as the photoelectrode of dye-sensitized solar cells (DSSCs). The 1D mesoporous nanofibres, 300 nm in diameter and 10-20 μm in length, were aggregated from anatase nanoparticles 20-30 nm in size. The employment of these novel 1D mesoporous nanofibres significantly improved the dye loading and light scattering of the DSSC photoanode, resulting in a cell conversion efficiency of 8.14%, corresponding to an ∼35% enhancement over the Degussa P25 reference photoanode.
Abstract:
"Trust and Collectives" is a compilation of articles: (I) "On Rational Trust" (in Meggle, G. (ed.) Social Facts & Collective Intentionality, Dr. Hänsel-Hohenhausen AG (currently Ontos), 2002), (II) "Simulating Rational Social Normative Trust, Predictive Trust, and Predictive Reliance Between Agents" (M.Tuomela and S. Hofmann, Ethics and Information Technology 5, 2003), (III) "A Collective's Trust in a Collective's action" (Protosociology, 18-19, 2003), and (IV) "Cooperation and Trust in Group Contexts" (R. Tuomela and M.Tuomela, Mind and Society 4/1, 2005 ). The articles are tied together by an introduction that dwells deeply on the topic of trust. (I) presents a somewhat general version of (RSNTR) and some basic arguments. (II) offers an application of (RSNTR) for a computer simulation of trust.(III) applies (RSNTR) to Raimo Tuomela's "we-mode"collectives (i.e. The Philosophy of Social Practices, Cambridge University Press, 2002). (IV) analyzes cooperation and trust in the context of acting as a member of a collective. Thus, (IV) elaborates on the topic of collective agency in (III) and puts the trust account (RSNTR) to work in a framework of cooperation. The central aim of this work is to construct a well-argued conceptual and theoretical account of rational trust, viz. a person's subjectively rational trust in another person vis-à-vis his performance of an action, seen from a first-person point of view. The main method is conceptual and theoretical analysis understood along the lines of reflective equilibrium. The account of rational social normative trust (RSNTR), which is argued and defended against other views, is the result of the quest. The introduction stands on its own legs as an argued presentation of an analysis of the concept of rational trust and an analysis of trust itself (RSNTR). It is claimed that (RSNTR) is "genuine" trust and embedded in a relationship of mutual respect for the rights of the other party. This relationship is the growing site for trust, a causal and conceptual ground, but it is not taken as a reason for trusting (viz. predictive "trust"). Relevant themes such as risk, decision, rationality, control, and cooperation are discussed and the topics of the articles are briefly presented. In this work it is argued that genuine trust is to be kept apart from predictive "trust." When we trust a person vis-à-vis his future action that concerns ourselves on the basis of his personal traits and/or features of the specific situation we have a prediction-like attitude. Genuine trust develops in a relationship of mutual respect for the mutual rights of the other party. Such a relationship is formed through interaction where the parties gradually find harmony concerning "the rules of the game." The trust account stands as a contribution to philosophical research on central social notions and it could be used as a theoretical model in social psychology, economical and political science where interaction between persons and groups are in focus. The analysis could also serve as a model for a trust component in computer simulation of human action. In the context of everyday life the account clarifies the difference between predictive "trust" and genuine trust. There are no fast shortcuts to trust. Experiences of mutual respect for mutual rights cannot be had unless there is respect.
Abstract:
Although empirical evidence suggests the contrary, many asset pricing models assume stock returns to be symmetrically distributed. In this paper it is argued that the occurrence of negative jumps in a firm's future earnings, and consequently in its stock price, is positively related to the level of network externalities in the firm's product market. If the ex post frequency of these negative jumps in a sample does not equal the ex ante assessed probability of occurrence, the sample is subject to a peso problem. The hypothesis is tested by regressing the skewness coefficient of a firm's realised stock return distribution on the firm's R&D intensity, i.e. the ratio of the firm's research and development expenditure to its net sales. The empirical results support the technology-related peso problem hypothesis. In samples subject to such a peso problem, returns are biased upward and the variance is biased downward.
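As a rough illustration of the test described in this abstract (not the authors' actual specification, data, or control variables), the minimal Python sketch below regresses per-firm realised return skewness on a hypothetical R&D-intensity variable; all column names and inputs are placeholders.

```python
# Minimal sketch of the cross-sectional test sketched above: regress the
# skewness of each firm's realised stock returns on its R&D intensity
# (R&D expenditure / net sales). Data layout and column names are hypothetical.
import pandas as pd
from scipy.stats import skew
import statsmodels.api as sm

def peso_problem_regression(returns: pd.DataFrame, firms: pd.DataFrame):
    """returns: one column of periodic returns per firm;
    firms: per-firm 'rd_expense' and 'net_sales' columns (hypothetical)."""
    y = returns.apply(lambda r: skew(r.dropna()))              # realised skewness per firm
    x = (firms["rd_expense"] / firms["net_sales"]).rename("rd_intensity")
    X = sm.add_constant(x)
    model = sm.OLS(y.loc[X.index], X).fit()
    # A negative, significant slope would be consistent with the hypothesis that
    # higher network externalities (proxied by R&D intensity) go with more
    # negatively skewed realised returns.
    return model
```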
Abstract:
Owing to high evolutionary divergence, it is not always possible to identify distantly related protein domains by sequence search techniques. Intermediate sequences possess sequence features of more than one protein and facilitate the detection of remotely related proteins. We have recently demonstrated the Cascade PSI-BLAST approach, in which PSI-BLAST is performed for many 'generations', initiating searches from new homologues as well. Such rigorous propagation through generations of PSI-BLAST effectively exploits the role of intermediates in detecting distant similarities between proteins. This approach has been tested on a large number of folds, and its performance in detecting superfamily-level relationships is ∼35% better than simple PSI-BLAST searches. We present a web server for this search method that permits users to perform Cascade PSI-BLAST searches against the Pfam, SCOP and SwissProt databases. The URL for this server is http://crick.mbu.iisc.ernet.in/~CASCADE/CascadeBlast.html.
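As a hedged sketch of the cascaded search idea described above (not the server's implementation), the loop below seeds successive PSI-BLAST 'generations' from newly detected homologues using the NCBI BLAST+ command-line tools; the database name, E-value cut-off, iteration count and file handling are illustrative assumptions.

```python
import subprocess

def psiblast_hits(query_fasta: str, db: str, evalue: float = 1e-3) -> set[str]:
    """Single PSI-BLAST run; returns the subject sequence IDs of all hits."""
    out = subprocess.run(
        ["psiblast", "-query", query_fasta, "-db", db,
         "-num_iterations", "5", "-evalue", str(evalue),
         "-outfmt", "6 sseqid"],
        capture_output=True, text=True, check=True).stdout
    # keep only bare IDs (skips e.g. the 'Search has CONVERGED!' message)
    return {ln.strip() for ln in out.splitlines() if ln.strip() and " " not in ln}

def fetch_sequence(seq_id: str, db: str) -> str:
    """Extract a hit sequence to a FASTA file via blastdbcmd; returns the path."""
    path = seq_id.replace("|", "_") + ".fasta"
    subprocess.run(["blastdbcmd", "-db", db, "-entry", seq_id, "-out", path], check=True)
    return path

def cascade_psiblast(start_query: str, db: str, generations: int = 3) -> set[str]:
    """Propagate PSI-BLAST searches through several 'generations' of homologues."""
    known: set[str] = set()
    frontier = {start_query}
    for _ in range(generations):
        new_hits: set[str] = set()
        for query in frontier:
            new_hits |= psiblast_hits(query, db) - known
        known |= new_hits
        # every newly detected homologue becomes a query of the next generation
        frontier = {fetch_sequence(h, db) for h in new_hits}
    return known
```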
Correlation between enhanced lattice polarizability and high piezoelectric response in BiScO3-PbTiO3
Abstract:
Piezoelectric and ex situ electric-field-induced structural studies were carried out on closely spaced compositions in the morphotropic phase boundary region of (1 - x)PbTiO3-(x)BiScO3. While the common approach of zero-field structural analysis failed to provide a unique relationship between the anomalous piezoresponse of x = 0.3725 and structural factor(s), the ex situ study of electric-field-induced structural changes revealed that the composition exhibiting the highest piezoelectric response is the one that also exhibits significantly enhanced polarizability of the lattices of both coexisting (monoclinic and tetragonal) phases. The enhanced lattice polarizability manifests as a significant fraction of the monoclinic phase transforming irreversibly to the tetragonal phase after electric poling. DOI: 10.1103/PhysRevB.87.064106
Abstract:
A water-soluble third-generation poly(alkyl aryl ether) dendrimer was examined for its ability to solubilize hydrophobic polyaromatic molecules in water and facilitate non-radiative resonance energy transfer between them. One to two orders of magnitude higher aqueous solubilities of pyrene (PY), perylene (PE), acridine yellow (AY) and acridine orange (AO) were observed in the presence of a defined concentration of the dendrimer. A reduction in the quantum yield of the donor PY* emission and a partial decrease in the lifetime of the donor excited state revealed the occurrence of energy transfer from dendrimer-solubilized excited PY to ground-state PE molecules, both present within a dendrimer. The energy transfer efficiency was estimated to be ∼61%. A cascade resonance energy transfer in a three-component system, PY*-to-PE-to-AY and PY*-to-PE-to-AO, was demonstrated through incorporation of AY or AO in the two-component PY-PE system. In the three-component system, excitation of PY resulted in emission from AY or AO via a cascade energy transfer process. Careful choice of dye molecules with good spectral overlap and the employment of the dendrimer as the medium enabled us to expand the absorption-emission wavelength range from ∼330 nm to ∼600 nm in aqueous solution. (C) 2015 Elsevier B.V. All rights reserved.
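For reference, the donor-quenching relations commonly used to estimate such an energy transfer efficiency from the drop in donor quantum yield (or intensity) and in donor lifetime are given below; the abstract does not state which form produced the ∼61% figure, so this is only the standard textbook expression.

```latex
% Standard FRET efficiency estimates from donor quenching (textbook relations,
% not quoted from the paper): subscript DA = donor in the presence of acceptor,
% D = donor alone.
E \;=\; 1 - \frac{\Phi_{DA}}{\Phi_{D}} \;=\; 1 - \frac{\tau_{DA}}{\tau_{D}}
```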
Abstract:
Geographical and ethnological descriptions of the Indies in sixteenth-century Spanish authors. Analysis of the writings of Pedro Mártir de Anglería and Gonzalo Fernández de Oviedo in the following respects: the idea of a New World; criticism and rejection of ancient ideas (mirabilia, monsters); the description of plants and animals; the application of the notions of "golden age" and "natural goodness" to indigenous societies; and the comparative analysis and assessment of indigenous cultures in relation to European culture.
Abstract:
[ES] Tobacco is recognised today as a very important risk factor in a multitude of diseases, and it represents a serious public health risk. Among the measures envisaged by the World Health Organization and the European Commission is the use of health warnings about the effects of tobacco on people. In Spain, these warnings currently contain only text.
Abstract:
This paper first reviews methods for treating low-speed rarefied gas flows: the linearised Boltzmann equation, the Lattice Boltzmann method (LBM), the Navier-Stokes equations with slip boundary conditions, and the DSMC method, and discusses the difficulties in simulating low-speed transitional MEMS flows, especially internal flows. In particular, the present version of the LBM is shown to be unfeasible for simulating MEMS flows in the transitional regime. The information preservation (IP) method overcomes the difficulty that the small information-to-noise ratio poses for statistical simulation of low-speed flows by preserving the average information of the enormous number of molecules that a simulated molecule represents. A kind of validation of the method is given in this paper. The specific features of internal MEMS flows, namely the low speed and the large length-to-width ratio, give the problem an elliptic nature: the inlet and outlet boundary conditions influence each other and must be regulated together. Through the example of the IP calculation of a microchannel (thousands long) flow, it is shown that adopting a conservative scheme for the mass conservation equation together with the super-relaxation method resolves this problem successfully. With the same measures, the IP method solves the thin-film air bearing problem in the transitional regime for an authentic hard disc write/read head length ( ) and provides a pressure distribution in full agreement with the generalized Reynolds equation, whereas previously the DSMC check of the validity of the Reynolds equation had been done only for a short ( ) drive head. The author suggests degenerating the Reynolds equation to solve the microchannel flow problem in the transitional regime, thus providing a means, with the merit of strict kinetic theory, for testing the various methods intended to treat internal MEMS flows.
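For orientation, the generalized Reynolds equation referred to above is commonly written in the steady, one-dimensional, nondimensional form used in molecular gas film lubrication; the notation and this particular form are standard in the literature rather than quoted from the paper.

```latex
% Common nondimensional form (assumed, not quoted from the paper):
% P = p/p_a, H = h/h_0, X = x/L, bearing number \Lambda = 6 \mu U L / (p_a h_0^2),
% Q_P = Poiseuille flow-rate coefficient accounting for gas rarefaction.
\frac{\mathrm{d}}{\mathrm{d}X}\!\left( Q_P \, P H^{3} \, \frac{\mathrm{d}P}{\mathrm{d}X} \right)
  \;=\; \Lambda \, \frac{\mathrm{d}(P H)}{\mathrm{d}X}
```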
Abstract:
The co-organized Alliance for Coastal Technologies (ACT) and National Data Buoy Center (NDBC) "Meteorological Buoy Sensors Workshop" convened in Solomons, Maryland, April 19 to 21, 2006, sponsored by the University of Maryland Center for Environmental Science (UMCES) Chesapeake Bay Laboratory (CBL), an ACT partner institution. Participants from various sectors, including resource managers and industry representatives, collaborated to focus on technologies and sensors that measure the near-surface variables of wind speed and direction, barometric pressure, humidity and air temperature. The vendor list was accordingly targeted at companies that produce these types of sensors. The managers represented a cross-section of federal, regional and academic marine observing interests from around the country. Workshop discussions focused on the challenges associated with making marine meteorological observations in general and on problems specific to particular variables. Discussions also explored methods to mitigate these challenges through the adoption of best practices, improved technologies and increased standardization. Some of the key workshop outcomes and recommendations included: Ocean.US should establish a committee devoted to observations, which would have a key role in developing observing standards; the community should adopt the target cost, reliability and performance standards drafted for a typical meteorological package to be used by a regional observing system; a forum, hosted on the ACT website, should be established to allow users and manufacturers to share best practices for the employment of marine meteorological sensors; federal activities that evaluate meteorological sensors should make their results publicly available; ACT should extend its evaluation process to include meteorological sensors; and a follow-on workshop should be conducted that covers meteorological variables not addressed by this workshop. (pdf contains 18 pages)
Abstract:
[ES] This research proposes a simple, reasonably inexpensive and easy-to-implement method for studying the brand image of a city using associative maps (Henderson, Iacobucci and Calder, 1998). Using a combination of statistical and interpretive criteria, a consensus map is constructed that reflects the identity of the city, that is, the main associations its inhabitants make when they think of their locality. Clear guidance is also provided on how to apply this method, which should be carried out periodically in order to track the dynamics of these associations as a function of the strategic and tactical actions taken by the city's governing authorities and of the moment in time at which it is applied. Its use is also illustrated with a real study, applied to the city of Cartagena and carried out with a random sample of 195 citizens.
Abstract:
For most Brazilian municipalities, installing a sanitary landfill is a challenge, one of the difficulties being its high cost. There are ways to mitigate these costs, and one of them is through the emissions market. With sufficient prior planning, the methane generated by waste degradation can be burned, potentially benefiting the landfill both through its use (power generation or direct sale) and through the receipt of some kind of tradable emissions certificate. Included in this prior planning is the ex-ante estimate of methane emissions, carried out to know in advance which use will be most suitable and what revenue may come from the burning. When the CDM projects implemented at sanitary landfills are analysed, it can be seen that these estimates are often poorly done, producing estimated values far above what is actually observed during operation. This error undermines the credibility of this type of project, since the expected figure is rarely reached. Several factors contribute to this discrepancy; operational problems (for example, deficiencies in the biogas capture system and problems in leachate collection and recirculation) and modelling problems (such as the use of experimental input values obtained under conditions very different from those found in Brazilian landfills) are the likely main culprits. This work presents and discusses the main problems in producing prior estimates of methane emissions at sanitary landfills, using registered Brazilian CDM projects that are currently issuing carbon credits as the basis for analysing the quality of the estimates made today. In addition, professionals in the field were interviewed to obtain different points of view on this issue. It is clear that the estimated values are, in general, 40 to 50% higher than those observed. Half of the specialists point to various operational problems as the main contributors to this difference, but modelling problems seem to have a decisive influence on the estimates. The input values used in the model need to be carefully analysed, and figures obtained from research that represents the reality of the landfill in question should be used.
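To make the discussion of ex-ante estimates concrete, the sketch below implements a generic first-order decay (FOD) methane generation model of the kind commonly used in such baseline estimates; the abstract does not name a specific model, and the decay constant k and methane generation potential L0 used here are purely illustrative and must be replaced by values representative of the landfill in question.

```python
# Hedged sketch of a generic first-order decay (FOD) landfill methane model
# (IPCC/LandGEM-style family); parameter values below are illustrative only.
import math

def methane_generation(waste_by_year: dict[int, float], year: int,
                       k: float = 0.05, L0: float = 100.0) -> float:
    """Estimated CH4 generation in `year` (m3/yr).

    waste_by_year: mass of waste landfilled in each year (Mg)
    k:  first-order decay constant (1/yr), illustrative value
    L0: methane generation potential (m3 CH4 / Mg waste), illustrative value
    """
    total = 0.0
    for t_i, mass in waste_by_year.items():
        if t_i <= year:
            total += k * L0 * mass * math.exp(-k * (year - t_i))
    return total

# Example: constant disposal of 50,000 Mg/yr from 2010, generation estimated for 2025
waste = {y: 50_000.0 for y in range(2010, 2026)}
print(f"{methane_generation(waste, 2025):,.0f} m3 CH4 in 2025")
```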
Abstract:
This thesis is part of the body of research that seeks to understand how elections work in Brazil. Specifically, the objective is to investigate negative advertising during presidential elections. To this end, five chapters were developed. The first situates the reader in the normative debate on the role of negative advertising in electoral democracy, discussing the importance of attacks in a range of circumstances, such as political mobilisation, the informational environment and vote choice. The second chapter is a broad content analysis of the negative advertising aired during the free electoral advertising time (Horário Gratuito de Propaganda Eleitoral) in the presidential elections of 1989, 1994, 1998, 2002, 2006 and 2010, first and second rounds. The methodology followed the guidelines formulated by Figueiredo et al. (1998), adapted to the specificities of negative advertising. Interesting trends were discovered; the most interesting is undoubtedly the low rate of attacks between candidates. The third chapter investigates the strategic use of spot advertisements during presidential campaigns, discussing the regulated character of the Brazilian advertising model. Even so, divergent strategies are identified in the use of negative spots, with the evening slot being the predominant locus of attacks. The fourth chapter builds a model of negative campaigning based on game theory. The model addresses the following questions: who attacks whom, when, and why? It is argued that negative advertising is the last resort used by candidates in the pursuit of votes; its central purpose is to change the opponent's trajectory, and for this reason it is used mainly by candidates trailing in the vote-intention polls. The fifth and final chapter develops a statistical model to measure the impact of negative advertising on vote-intention figures.
Abstract:
The central theme of this study is teaching work in private higher education institutions, in a context in which workers are subsumed under the order of flexible capitalism and its various forms of domination. The modes of work organisation in the educational field, and their repercussions on higher education, are analysed. Considerations are offered on the meanings of the public and private spheres within the current bourgeois State. Some moments in the history of Brazilian higher education are highlighted, along with the development trajectory of this level of education in Brazil and, in particular, in Maranhão, the state chosen as the empirical field of the thesis. Along the way, emphasis is given to some post-LDB (1996) legal instruments that facilitated the expansion of the private/commercial sector. The theoretical-methodological framework rests on Marxian and Marxist theorisations, articulated with empirical research, which makes it necessary to go beyond the limits of phenomenal manifestations in order to reach their roots, which are not immediately observable. The Social Theory of Discourse elaborated by Norman Fairclough is also used for this purpose. From the field study carried out, it is possible to identify a context of intense precarisation in the working relations of teachers in these institutions, combining many objective and subjective elements in the complex daily life of these workers, among them: controls and pressure to meet deadlines, lowered salaries, demands, embarrassments, suffering, pain, the absence of democracy and of recognition from hierarchical superiors, work overload and discouragement, but also transgressions of rules and norms, confrontations, satisfactions, pleasures, and moments of creativity and motivation, the latter experienced especially in the relationship with students. These situations and feelings, which move within the pleasure-suffering duality, have many different repercussions on teachers' health, and authors from the Psychodynamics of Work are drawn upon for this discussion. It is concluded that, in order to consider the possibility of a humanising education opposed to the pragmatic and mercantilist perspective so in vogue today, it is necessary to overcome the neoliberal model, to restore the public sphere as central and strategic, and to defend teaching work permeated by dignity, meaning and recognition.