939 results for Term of protection
Abstract:
The study of intuition is an emerging area of research in psychology, social sciences, and business studies. It is increasingly of interest to the study of management, for example in decision-making as a counterpoint to structured approaches. Recently, work has been undertaken to conceptualize a construct for the intuitive nature of technology. However, to date there is no common understanding of the term intuition in information systems (IS) research. This paper extends the study of intuition in IS research by using exploratory research to categorize the use of the word “intuition” and related terms in papers published in two prominent IS journals over a ten-year period. The entire text of MIS Quarterly and Information Systems Research was reviewed for the years 1999 through 2008 using searchable PDF versions of these publications. As far as could be determined, this is the first application of this approach in the analysis of the text of IS academic journals. The use of the word “intuition” and related terms was categorized using coding consistent with Grounded Theory. The focus of this research was on the first two stages of Grounded Theory analysis - the development of codes and constructs. Saturation of coding was not reached: an extended review of these publications would be required to enable theory development. Over 400 incidents of the use of “intuition” and related terms were found in the articles reviewed. The most prominent use of the term “intuition” was coded as “Intuition as Authority”, in which intuition was used to validate a research objective or finding, representing approximately 37 per cent of codes assigned. The second most common coding occurred in research articles with mathematical analysis, representing about 19 per cent of the codes assigned, for example where a mathematical formulation or result was “intuitive”. The possibly most impactful use of the term “intuition” was “Intuition as Outcome”, representing approximately 7 per cent of all coding, which characterized research results as adding to the intuitive understanding of a research topic or phenomenon. This research contributes to a greater theoretical understanding of intuition, enabling insight into the use of intuition and the eventual development of a theory on the use of intuition in academic IS research publications. It also provides potential benefits to practitioners by providing insight into, and validation of, the use of intuition in IS management. Research directions include the creation of reflective and/or formative constructs for intuition in information systems research.
Abstract:
This article considers whether, in the context of armed conflicts, certain non-refoulement obligations of non-belligerent States can be derived from the 1949 Geneva Conventions. According to Common Article 1 (CA1) thereof, all High Contracting Parties (HCPs) undertake to ‘respect and to ensure respect’ for the four conventions ‘in all circumstances’. It is contended that CA1 applies both in international armed conflicts (IACs) and in non-international armed conflicts (NIACs). In turn, it is suggested that Common Article 3 (CA3), which regulates conduct in NIACs, serves as a ‘minimum yardstick’ also applicable in IACs. It is widely (though not uniformly) acknowledged that the undertaking to ‘ensure respect’ in a given armed conflict extends to HCPs that are not parties to it; nevertheless, the precise scope of this undertaking is subject to scholarly debate. This article concerns situations where, in the course of an (international or non-international) armed conflict, persons ‘taking no active part in hostilities’ flee from States where violations of CA3 are (likely to be) occurring to a non-belligerent State. Based on the undertaking in CA1, the central claim of this article is that, as long as the risk of exposure to these violations persists, such persons should not be refouled, regardless of whether they qualify as refugees under the 1951 Refugee Convention definition or could be eligible for complementary or subsidiary forms of protection regulated in regional arrangements. The analysis does not affect the explicit protection from refoulement that the Fourth Geneva Convention accords to ‘protected persons’ (as defined in Article 4 thereof). It is submitted that CA1 should be read in tandem with other obligations of non-belligerent States under the 1949 Geneva Conventions. Most pertinently, all HCPs are required to take specific measures to repress ‘grave breaches’ and to take measures necessary for the suppression of all acts contrary to the 1949 Geneva Conventions other than grave breaches. An HCP that is capable of protecting displaced persons from exposure to risks of violations of CA3 and nonetheless refoules them to face such risks is arguably failing to take lawful measures at its disposal to suppress acts contrary to the conventions and, consequently, fails to ‘ensure respect’ for the conventions. Keywords: Non-refoulement; International Armed Conflict; Non-International Armed Conflict; Common Article 1; Common Article 3
Abstract:
∆14Catm has been estimated at 420 ± 80‰ (IntCal09) during the Last Glacial Maximum (LGM) compared to preindustrial times (0‰), but the mechanisms explaining this difference are not yet resolved. ∆14Catm is a function both of cosmogenic production in the high atmosphere and of carbon cycling and partitioning in the Earth system. 10Be-based reconstructions show a contribution of the cosmogenic production term of only 200 ± 200‰ in the LGM. The remaining 220‰ must therefore be explained by changes in the carbon cycle. Recently, Bouttes et al. (2010, 2011) proposed to explain most of the difference in pCO2atm and δ13C between glacial and interglacial times as a result of brine-induced ocean stratification in the Southern Ocean. This mechanism involves the formation of very saline water masses that contribute to high carbon storage in the deep ocean. During glacial times, the sinking of brines is enhanced and more carbon is stored in the deep ocean, lowering pCO2atm. Moreover, the sinking of brines induces increased stratification in the Southern Ocean, which keeps the deep ocean well isolated from the surface. Such an isolated ocean reservoir would be characterized by a low ∆14C signature. Evidence of such 14C-depleted deep waters during the LGM has recently been found in the Southern Ocean (Skinner et al. 2010). The degassing of this carbon with low ∆14C would then reduce ∆14Catm throughout the deglaciation. We have further developed the CLIMBER-2 model to include cosmogenic production of 14C as well as an interactive atmospheric 14C reservoir. We investigate the role of both the sinking of brines and cosmogenic production, alongside iron fertilization mechanisms, in explaining changes in ∆14Catm during the last deglaciation. In our simulations, not only is the sinking-of-brines mechanism consistent with past ∆14C data, but it also explains most of the differences in pCO2atm and ∆14Catm between the LGM and preindustrial times. Finally, this study represents, to our knowledge, the first time that a model experiment explains glacial-interglacial differences in pCO2atm, δ13C, and ∆14C together with a coherent LGM climate.
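For orientation only, the budget implied by the figures quoted in this abstract (and only those figures) can be written as a simple decomposition of the LGM-to-preindustrial atmospheric ∆14C anomaly into a production term and a carbon-cycle term:

```latex
% Decomposition of the LGM atmospheric Delta-14C anomaly, restating the
% numbers quoted above (IntCal09 total; 10Be-based production estimate):
\underbrace{\Delta^{14}\mathrm{C}_{\mathrm{atm}}^{\mathrm{LGM}}}_{420 \pm 80\ \text{per mil}}
\;=\;
\underbrace{\Delta^{14}\mathrm{C}_{\mathrm{production}}}_{200 \pm 200\ \text{per mil}}
\;+\;
\underbrace{\Delta^{14}\mathrm{C}_{\mathrm{carbon\ cycle}}}_{\approx 220\ \text{per mil}}
```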
Abstract:
Single-column models (SCMs) are useful test beds for investigating the parameterization schemes of numerical weather prediction and climate models. The usefulness of SCM simulations is limited, however, by the accuracy of the prescribed best-estimate large-scale observations. Errors in estimating the observations result in uncertainty in the modeled simulations. One method to address this modeled uncertainty is to simulate an ensemble whose members span the observational uncertainty. This study first derives an ensemble of large-scale data for the Tropical Warm Pool International Cloud Experiment (TWP-ICE) based on an estimate of a possible source of error in the best-estimate product. These data are then used to carry out simulations with 11 SCMs and two cloud-resolving models (CRMs). Best-estimate simulations are also performed. All models show that moisture-related variables are close to observations and that there are limited differences between the best-estimate and ensemble-mean values. The models, however, show different sensitivities to changes in the forcing, particularly when weakly forced. The ensemble simulations highlight important differences in the surface evaporation term of the moisture budget between the SCMs and CRMs. Differences are also apparent between the models in the ensemble-mean vertical structure of cloud variables, while for each model, cloud properties are relatively insensitive to the forcing. The ensemble is further used to investigate relationships between cloud variables and precipitation, and identifies differences between the CRMs and SCMs, particularly for relationships involving ice. This study highlights the additional analysis that can be performed using ensemble simulations, enabling a more complete model investigation than the more traditional single best-estimate simulation alone.
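The ensemble-generation step described above can be illustrated with a minimal sketch. The single scaling-factor error model, the array shapes, and the function name below are assumptions for illustration, not the TWP-ICE forcing product or its actual error estimate.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def make_forcing_ensemble(best_estimate, error_std, n_members=100):
    """Generate ensemble forcing profiles spanning an assumed observational
    uncertainty around a best-estimate large-scale forcing.

    best_estimate : array (ntime, nlev), e.g. large-scale vertical velocity
    error_std     : array broadcastable to best_estimate, the assumed 1-sigma
                    error of the best-estimate product (an assumption here)
    """
    members = []
    for _ in range(n_members):
        # One scaling factor per member keeps the perturbation coherent in
        # time and height; richer error models could be substituted.
        scale = rng.normal(0.0, 1.0)
        members.append(best_estimate + scale * error_std)
    return np.stack(members)

# Usage: 120 six-hourly times, 40 levels, 10% assumed relative error
omega = rng.standard_normal((120, 40))            # placeholder best-estimate forcing
ensemble = make_forcing_ensemble(omega, 0.1 * np.abs(omega), n_members=50)
print(ensemble.shape)                             # (50, 120, 40)
```

Each ensemble member would then drive one SCM or CRM run, and the spread of the resulting simulations measures the sensitivity to forcing uncertainty.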
Abstract:
Intuition is an important and under-researched concept in information systems. Prior exploratory research has shown that there is potential to characterize the use of intuition in academic information systems research. This paper extends this research to all of the available issues of two leading IS journals with the aim of reaching an approximation of theoretical saturation. Specifically, the entire text of MISQ and ISR was reviewed for the years 1990 through 2009 using searchable PDF versions of these publications. All references to intuition were coded on a basis consistent with Grounded Theory, interpreted as a gestalt and represented as a mind-map. In the period 1990-2009, 681 incidents of the use of "intuition" and related terms were found in the articles reviewed, representing a greater range of codes than prior research. In addition, codes were assigned to all issues of MIS Quarterly from commencement of publication to the end of the 2012 publication year to support the conjecture that coding saturation has been approximated. The most prominent use of the term "intuition" was coded as "Intuition as Authority", in which intuition was used to validate a statement, research objective or finding, representing approximately 34 per cent of codes assigned. In research articles where mathematical analysis was presented, researchers not infrequently commented on the degree to which a mathematical formulation was "intuitive"; this was the second most common coding, representing approximately 16 per cent of the codes. The possibly most impactful use of the term "intuition" was "Intuition as Outcome", representing approximately 7 per cent of all coding, which characterized research results as adding to the intuitive understanding of a research topic or phenomenon. This research aims to contribute to a greater theoretical understanding of the use of intuition in academic IS research publications. It provides potential benefits to practitioners by providing insight into the use of intuition in IS management, for example, emphasizing the emerging importance of "intuitive technology". Research directions include the creation of reflective and/or formative constructs for intuition in information systems research and the expansion of this novel research method to additional IS academic publications and topics.
Abstract:
The old scholastic principle of the "convertibility" of being and goodness strikes nearly all moderns as either barely comprehensible or plain false. "Convertible" is a term of art meaning "interchangeable" in respect of predication, where the predicates can be exchanged salva veritate albeit not salva sensu: their referents are, as the maxim goes, really the same albeit conceptually different. The principle seems, at first blush, absurd. Did the scholastics literally mean that every being is good? Is that supposed to include a cancer, a malaria parasite, an earthquake that kills millions? If every being is good, then no being is bad—but how can that be? To the contemporary philosophical mind, such bafflement is understandable. It derives from the systematic dismantling of the great scholastic edifice that took place over half a millennium. With the loss of the basic concepts out of which that edifice was built, the space created by those concepts faded out of existence as well. The convertibility principle, like virtually all the other scholastic principles (not all, since some do survive and thrive in analytic philosophy), could not persist in a post-scholastic space wholly alien to it.
Abstract:
The aim of this essay is to examine the two most-read magazines in Sweden covering exercise, fitness, bodybuilding, diets and “wellness”: Fitness and Body. Fitness's target group is predominantly women, while Body is almost exclusively read by men. The analysis is first carried out quantitatively, by systematically categorising the contents of the magazines, and is then followed by a qualitative analysis. Using two different theories, Anja Hirdman's gender concept along with her constructivist media perspective and the theory of Symbolic Interactionism, I try to answer the following questions: Do the two magazines' terms of address and language differ from one another, and if so, in what way? Taking the contents and subject areas as a point of departure, how are the two magazines composed? How can the underlying message in the different articles be interpreted? The analysis shows that both magazines followed their stated purpose of writing about exercise in general, diet, fitness and bodybuilding. However, the magazine Fitness writes more often than Body about matters outside that purpose, and the language in Body is more informative and general than the language used in Fitness. Still, the messages sent out by the different articles are in most respects similar. Both magazines portray body ideals that can be understood as extreme. In Body the message is fairly direct, “Build bigger muscles and burn more fat”, whereas Fitness readily uses the concept of “wellness” as a cover for what the message really is, namely “get yourself a rock-hard body through dieting and hard training”!
Abstract:
Most previous studies have focused on entire trips in a geographic region, while only a few have addressed trips induced by a city landmark. This paper therefore explores trips and their CO2 emissions induced by a shopping center from a time-space perspective, and their use in relocation planning. This is done by means of a case study in the city of Borlänge in mid-Sweden, where trips to the city's largest shopping mall, located in its center, are examined. We use GPS tracking data of car trips that end and start at the shopping center. Thereafter, (1) we analyze the traffic emission patterns from a time-space perspective, where temporal patterns reveal hourly traffic emission dynamics and spatial patterns uncover a heterogeneous distribution of traffic emissions across spatial areas and individual street segments. Further, (2) this study reports that most of the observed trips follow an optimal route in terms of CO2 emissions. In this respect, (3) we evaluate how well placed the current shopping center is through a comparison with two competing locations. We conclude that the two suggested locations, which are close to the current shopping center, do not show a significant improvement in terms of CO2 emissions.
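The route-optimality comparison in point (2) can be sketched on a small street graph. The toy network, edge weights, and node names below are illustrative assumptions, not the Borlänge GPS data or road network.

```python
import networkx as nx

# Toy street network: nodes are intersections, edge attribute "co2_g" is the
# assumed CO2 cost (grams) of traversing that street segment.
G = nx.DiGraph()
G.add_edge("home", "a", co2_g=120)
G.add_edge("a", "mall", co2_g=150)
G.add_edge("home", "b", co2_g=90)
G.add_edge("b", "mall", co2_g=200)

# CO2-optimal route between an origin and the shopping center
optimal = nx.shortest_path(G, "home", "mall", weight="co2_g")
optimal_cost = nx.shortest_path_length(G, "home", "mall", weight="co2_g")

# An observed (GPS-tracked) route can then be compared with the optimum
observed = ["home", "b", "mall"]
observed_cost = sum(G[u][v]["co2_g"] for u, v in zip(observed, observed[1:]))

print(optimal, optimal_cost)          # ['home', 'a', 'mall'] 270
print(observed_cost / optimal_cost)   # ratio > 1 means excess emissions
```

Ratios close to 1 across many trips would correspond to the abstract's finding that most observed trips are near-optimal in terms of CO2.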
Abstract:
Background. Nurses' research utilization (RU) as part of evidence-based practice is strongly emphasized in today's nursing education and clinical practice. The primary aim of RU is to provide high-quality nursing care to patients. Data on newly graduated nurses' RU are scarce, but a predominance of low use has been reported in recent studies. Factors associated with nurses' RU have previously been identified among individual and organizational/contextual factors, but there is a lack of knowledge about how these factors, including educational ones, interact with each other and with RU, particularly in nurses during the first years after graduation. The purpose of this study was therefore to identify factors that predict the probability for low RU among registered nurses two years after graduation. Methods. Data were collected as part of the LANE study (Longitudinal Analysis of Nursing Education), a Swedish national survey of nursing students and registered nurses. Data on nurses' instrumental, conceptual, and persuasive RU were collected two years after graduation (2007, n = 845), together with data on work contextual factors. Data on individual and educational factors were collected in the first year (2002) and last term of education (2004). Guided by an analytic schedule, bivariate analyses, followed by logistic regression modeling, were applied. Results. Of the variables associated with RU in the bivariate analyses, six were found to be significantly related to low RU in the final logistic regression model: work in the psychiatric setting, role ambiguity, sufficient staffing, low work challenge, being male, and low student activity. Conclusions. A number of factors associated with nurses' low extent of RU two years postgraduation were found, most of them potentially modifiable. These findings illustrate the multitude of factors related to low RU extent and take their interrelationships into account. This knowledge might serve as useful input in planning future studies aiming to improve nurses', specifically newly graduated nurses', RU.
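As a hedged illustration of the final modeling step (bivariate screening followed by logistic regression on low RU), here is a minimal sketch on simulated data. The variable names mirror the predictors listed in the abstract, but the data, coefficients, and library choice are assumptions, not the LANE analysis.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical data frame: one row per nurse, binary outcome "low_ru" and
# predictors of the kinds named in the abstract (values are simulated).
rng = np.random.default_rng(1)
n = 845
df = pd.DataFrame({
    "psychiatric_setting":  rng.integers(0, 2, n),
    "role_ambiguity":       rng.normal(0, 1, n),
    "sufficient_staffing":  rng.integers(0, 2, n),
    "low_work_challenge":   rng.integers(0, 2, n),
    "male":                 rng.integers(0, 2, n),
    "low_student_activity": rng.integers(0, 2, n),
})
# Simulated outcome with an arbitrary dependence on two predictors
logit = -1.0 + 0.8 * df["role_ambiguity"] + 0.6 * df["psychiatric_setting"]
df["low_ru"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X, y = df.drop(columns="low_ru"), df["low_ru"]
model = LogisticRegression(max_iter=1000).fit(X, y)

# Odds ratios for each predictor (exponentiated coefficients)
odds_ratios = pd.Series(np.exp(model.coef_[0]), index=X.columns)
print(odds_ratios.round(2))
```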
Abstract:
The objective of this work is to develop and computationally implement efficient numerical procedures for determining the moment-curvature diagram corresponding to: a typical cross-section of a reinforced concrete beam subjected to short-term monotonic or cyclic loading; and a generic point on the middle surface of a reinforced concrete plate subjected to short-term monotonic loading. In light of the results obtained, it also aims to propose a simplified model in terms of stress resultants and generalized strains. First, a layered model for beams is described, in which the load is applied incrementally and, at each step, the nonlinear equilibrium equations are solved iteratively. As a consequence, a moment-curvature relation in terms of resultants is proposed. In order to verify the validity and applicability of the methods and algorithms studied, and to compare the results with experimental data and responses obtained by other researchers, a series of numerical examples is presented. Next, the same procedure is applied to slab models free of membrane forces. Finally, through a parametric study of the various factors that affect the moment-curvature diagram, a simplified relation is proposed.
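A minimal layered-section sketch of the incremental-iterative idea described above is given below; the cross-section, material laws, and numerical values are illustrative assumptions, not the models or data of the original work.

```python
import numpy as np

# For each imposed curvature, the neutral-axis depth is iterated (bisection)
# until axial equilibrium N = 0 is satisfied, then the resisting moment is
# integrated over the concrete layers and the tension steel.

B, H = 0.20, 0.40                            # rectangular section (m)
fc, eps_c0, eps_cu = 25e6, 0.002, 0.0035     # concrete strength (Pa), peak/ultimate strain
fy, Es = 500e6, 200e9                        # steel yield stress and modulus (Pa)
As, d = 8e-4, 0.36                           # tension steel area (m^2) and depth (m)
n_layers = 100
z = np.linspace(0.0, H, n_layers)            # layer distances from the top fibre
dA = B * H / n_layers

def sigma_c(eps):                            # parabola-rectangle in compression, no tension
    eps = np.clip(eps, 0.0, eps_cu)
    return np.where(eps < eps_c0, fc * (2 * eps / eps_c0 - (eps / eps_c0) ** 2), fc)

def sigma_s(eps):                            # elastic-perfectly-plastic steel
    return np.clip(Es * eps, -fy, fy)

def section_forces(kappa, x):                # plane sections: strain varies linearly with depth
    eps = kappa * (x - z)                    # compression positive above neutral axis x
    eps_s = kappa * (x - d)
    N = np.sum(sigma_c(eps) * dA) + sigma_s(eps_s) * As
    M = np.sum(sigma_c(eps) * (x - z) * dA) + sigma_s(eps_s) * (x - d) * As
    return N, M

curvatures = np.linspace(1e-4, 0.03, 60)
moments = []
for kappa in curvatures:                     # incremental curvature, iterative equilibrium
    lo, hi = 0.0, H
    for _ in range(60):                      # bisection on the neutral-axis depth x
        x = 0.5 * (lo + hi)
        N, M = section_forces(kappa, x)
        lo, hi = (x, hi) if N < 0 else (lo, x)
    moments.append(M)
print(f"approx. ultimate moment: {max(moments) / 1e3:.0f} kNm")
```

The resulting (curvature, moment) pairs trace the moment-curvature diagram for the section; a cyclic or plate version would extend the same equilibrium-iteration loop.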
Abstract:
The structure of protection across sectors is usually interpreted as the result of competition among lobbies to influence politicians, but little attention has been devoted to the importance of individual firms in this process. This paper builds a model incorporating firm heterogeneity into a lobbying setup à la Grossman and Helpman (1994), in a monopolistically competitive environment. We find that increased sectoral dispersion causes a fall in the equilibrium tariff provided that the exporter's cutoff is above the mean of the distribution. Also, higher average productivity brings about a fall in the equilibrium tariff, whereas an increase in export costs causes an increase in the tariff. JEL Classification codes: D43, D7, F12, F13, L11
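For reference, the benchmark protection formula of the Grossman and Helpman (1994) setup that the paper extends (the standard "protection for sale" result, not the heterogeneous-firm version derived in the paper) can be written as:

```latex
% Grossman-Helpman (1994) benchmark equilibrium tariff for sector i:
%   I_i     : 1 if sector i is represented by an organized lobby, 0 otherwise
%   alpha_L : fraction of the population that is organized
%   a       : weight the government places on aggregate welfare
%   z_i     : inverse import-penetration ratio (domestic output over imports)
%   e_i     : price elasticity of import demand
\frac{t_i}{1+t_i} \;=\; \frac{I_i - \alpha_L}{a + \alpha_L} \cdot \frac{z_i}{e_i}
```

In the heterogeneous-firm, monopolistically competitive extension described in the abstract, the comparative statics on dispersion, average productivity, and export costs presumably operate through how the productivity distribution and the exporting cutoff shape these sufficient statistics.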
Abstract:
The WTO established two rules concerning the international protection of TRIPs (trade-related intellectual property rights), which include patents and copyrights. One of these rules is non-discrimination, which has been shown to be efficiency-enhancing in the context of trade tariff reductions. The other is the national-treatment commitment rule. In this paper we develop a simple framework to show that the extended version of this rule, which is nowadays being imposed on members, brings about a loss of economic efficiency and a reduction in the levels of protection of intellectual property rights worldwide. As a consequence, it tends to reduce investment in research and development throughout the world. This directly contradicts the objectives of the Agreement.
Abstract:
The objective of this work is to study the correlations between the strategic use of information and the formulation and implementation of defense and national security policies in the Legal Amazon. To that end, the study was developed from an analysis of the Amazon protection and monitoring systems (SIPAM/SIVAM), through which we seek to establish how these systems have contributed to the definition and implementation of those policies. For the Amazon, with its natural wealth, threats and vulnerabilities, the prospects of integration, security and national defense, and sustainable development constitute great challenges, in which the efficient use of technology is a basic reference that must be incorporated into strategies and public policies in these areas. SIPAM/SIVAM is a strategic project, conceived with a vision of the future, for the protection and development of the Amazon. Its objective is the defense and guarantee of Brazilian sovereignty in the Legal Amazon, together with the systematization and execution of governmental actions in the region through the intensive use of technological resources. In turn, the systems reflect the priority that the Amazon region has in terms of defense and security for the country, and symbolize the State's strategy to protect it. SIPAM/SIVAM adopts an approach in which the guarantee of national sovereignty also involves care for the development of the local population, within an educational and integrative proposal. In conclusion, we affirm that SIPAM/SIVAM creates a new paradigm for public administration, in which organizations work with a shared set of information and begin to act in an integrated fashion. Thus, by permanently seeking the rationalization of efforts and resources, and by attempting a previously unknown form of institutional relationship in which infrastructure and products are shared, SIPAM/SIVAM establishes a new premise for Brazilian public administration and contributes to giving a new direction to the development of the Amazon.
Abstract:
This work analyzes the formulation of cultural policies in Brazil through the analysis of two quite distinct cases: the incentive laws, formulated in the early 1990s in the wake of neoliberalism, and the Cultura Viva Program, formulated in 2004 during President Lula's first term. Based on a detailed analysis of the context in which each of these cultural policies was formulated, as well as of the publics actually served and the amounts made available, we show that they constitute two forms of cultural policy that point toward different horizons in terms of cultural citizenship. Regarding the incentive laws, we analyze the transition from the Fordist model of accumulation to flexible accumulation, relating the importance of branding strategies to the new forms of consumer culture. In the case of the Cultura Viva Program, we analyze which groups are privileged, delimiting the scope and limits of this policy. In our approach we draw on the Gramscian framework of hegemony, relating it strongly to culture in a class society. Given the singularity of the concept of civil society in the Italian thinker's approach, as well as the evident relevance that this sphere assumes under the neoliberal agenda, a historical analysis of its evolution is necessary, in search of evidence pointing toward an emancipatory politics arising from actions in this sphere and in its relationship with the State and the market.
Abstract:
Brazil has a substantial share - about 60% by some measures - of its employees working without labor registry, and 62% of its private-sector workers do not contribute to social security. Informality is important because of its consequences in terms of job precariousness and lack of social protection, and because it is also strongly correlated with poverty and other measures of social welfare assessed at the family level. 58% of the country's population below the indigence line live in families headed by informal workers. The complexity of the informal sector derives from the multiple relevant dimensions of job quality. The basis used for guiding policy interventions depends on which effect of informality one is interested in, such as: lowering job precariousness, reducing occupational risks, increasing the degree of protection against adverse shocks, allowing good opportunities to be taken through the provision of credit, improving the living conditions of informal workers' families, implementing affirmative actions, reducing tax evasion, etc. This report gauges various aspects of informal sector activity in Brazil over the last decades. Our main constraint is the available sources of information. The final purpose is to help the design of policies aimed at assisting those who hold "indecent" jobs.