890 results for Justification of right to know under freedom of speech doctrine
Abstract:
A novel method is presented for obtaining rigorous upper bounds on the finite-amplitude growth of instabilities to parallel shear flows on the beta-plane. The method relies on the existence of finite-amplitude Liapunov (normed) stability theorems, due to Arnol'd, which are nonlinear generalizations of the classical stability theorems of Rayleigh and Fjørtoft. Briefly, the idea is to use the finite-amplitude stability theorems to constrain the evolution of unstable flows in terms of their proximity to a stable flow. Two classes of general bounds are derived, and various examples are considered. It is also shown that, for a certain kind of forced-dissipative problem with dissipation proportional to vorticity, the finite-amplitude stability theorems (which were originally derived for inviscid, unforced flow) remain valid (though they are no longer strictly Liapunov); the saturation bounds therefore continue to hold under these conditions.
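(Aside, not part of the abstract: a minimal sketch of the normed-stability argument the method rests on, under the standard assumption that the deviation q' of the flow from a chosen stable basic state carries a conserved invariant A[q'] that is norm-equivalent to the squared disturbance norm.)

c_1 \,\lVert q' \rVert^2 \;\le\; \mathcal{A}[q'] \;\le\; c_2 \,\lVert q' \rVert^2, \qquad 0 < c_1 \le c_2,

\frac{d\mathcal{A}}{dt} = 0 \quad\Longrightarrow\quad \lVert q'(t) \rVert^2 \;\le\; \frac{c_2}{c_1}\,\lVert q'(0) \rVert^2 \quad \text{for all } t.

Choosing the stable basic state so as to make c_2/c_1 as small as possible then yields saturation bounds on the finite-amplitude growth.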
Abstract:
This paper presents a preliminary assessment of the relative effects of the rate of climate change (four Representative Concentration Pathways, RCPs), assumed future population (five Shared Socio-economic Pathways, SSPs), and pattern of climate change (19 CMIP5 climate models) on regional and global exposure to water resources stress and river flooding. Uncertainty in the projected future impacts of climate change on exposure to water stress and river flooding is dominated by uncertainty in the projected spatial and seasonal pattern of change in climate. There is little clear difference in impact between RCP2.6, RCP4.5 and RCP6.0 in 2050, or between RCP4.5 and RCP6.0 in 2080. Impacts under RCP8.5 are greater than under the other RCPs in both 2050 and 2080. For a given RCP, the absolute numbers of people exposed to increased water resources stress or increased river flood frequency differ between the five SSPs. With the ‘middle-of-the-road’ SSP2, climate change by 2050 would increase exposure to water resources stress for between approximately 920 and 3400 million people under the highest RCP, and increase exposure to river flood risk for between 100 and 580 million people. Under RCP2.6, exposure to increased water scarcity in 2050 would be reduced by 22-24% compared to impacts under RCP8.5, and exposure to increased flood frequency would be reduced by around 16%. The implications of climate change for actual future losses and adaptation depend not only on the numbers of people exposed to changes in risk, but also on the qualitative characteristics of the future worlds described in the different SSPs. The difference in ‘actual’ impact between SSPs will therefore be greater than the difference in the numbers of people exposed to impact.
Abstract:
Useful probabilistic climate forecasts on decadal timescales should be reliable (i.e. forecast probabilities should match the observed relative frequencies), but this is seldom examined. This paper assesses a necessary condition for reliability, namely that the ratio of ensemble spread to forecast error be close to one, for seasonal to decadal sea surface temperature retrospective forecasts from the Met Office Decadal Prediction System (DePreSys). Factors which may affect reliability are diagnosed by comparing this spread-error ratio for an initial condition ensemble and two perturbed physics ensembles, for initialized and uninitialized predictions. At lead times of less than 2 years, the initialized ensembles tend to be under-dispersed, and hence produce overconfident and therefore unreliable forecasts. For longer lead times, all three ensembles are predominantly over-dispersed. Such over-dispersion is primarily related to excessive inter-annual variability in the climate model. These findings highlight the need to carefully evaluate simulated variability in seasonal and decadal prediction systems.
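(Aside, not part of the abstract: a minimal sketch of how the spread-error diagnostic can be computed from a hindcast ensemble; the array names and shapes below are assumptions for illustration, not the DePreSys analysis itself.)

import numpy as np

def spread_error_ratio(forecasts, observations):
    """Spread-error ratio for an ensemble of retrospective forecasts.

    forecasts:    array of shape (n_starts, n_members) at one lead time
    observations: array of shape (n_starts,)

    A ratio near 1 is a necessary condition for reliability; < 1 indicates
    under-dispersion (overconfidence), > 1 indicates over-dispersion.
    """
    ens_mean = forecasts.mean(axis=1)
    # Intra-ensemble variance, averaged over start dates.
    spread = np.sqrt(forecasts.var(axis=1, ddof=1).mean())
    # RMSE of the ensemble mean against observations.
    rmse = np.sqrt(((ens_mean - observations) ** 2).mean())
    return spread / rmse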
Social equality in the number of choice options is represented in the ventromedial prefrontal cortex
Abstract:
A distinct aspect of the sense of fairness in humans is that we care not only about equality in material rewards but also about equality in non-material values. One such value is the opportunity to choose freely among many options, often regarded as a fundamental right to economic freedom. In modern developed societies, equal opportunities in work, living, and lifestyle are enforced by anti-discrimination laws. Despite the widespread endorsement of equal opportunity, no studies have explored how people assign value to it. We used functional magnetic resonance imaging to identify the neural substrates for subjective valuation of equality in choice opportunity. Participants performed a two-person choice task in which the number of choices available was varied across trials independently of choice outcomes. By using this procedure, we manipulated the degree of equality in choice opportunity between players and dissociated it from the value of reward outcomes and their equality. We found that activation in the ventromedial prefrontal cortex (vmPFC) tracked the degree to which the number of options between the two players was equal. In contrast, activation in the ventral striatum tracked the number of options available to participants themselves but not the equality between players. Our results demonstrate that the vmPFC, a key brain region previously implicated in the processing of social values, is also involved in valuation of equality in choice opportunity between individuals. These findings may provide valuable insight into the human ability to value equal opportunity, a characteristic long emphasized in politics, economics, and philosophy.
Abstract:
This is the second half of a two-part paper dealing with the social theoretic assumptions underlying system dynamics. In the first half it was concluded that analysing system dynamics using traditional, paradigm-based social theories is highly problematic. An innovative and potentially fruitful resolution is now proposed to these problems. In the first section it is argued that in order to find an appropriate social theoretic home for system dynamics it is necessary to look to a key exchange in contemporary social science: the agency/structure debate. This debate aims to move beyond both the theories based only on the actions of individual human agents, and those theories that emphasise only structural influences. Emerging from this debate are various theories that instead aim to unite the human agent view of the social realm with views that concentrate solely on system structure. It is argued that system dynamics is best viewed as being implicitly grounded in such theories. The main conclusion is therefore that system dynamics can contribute to an important part of social thinking by providing a formal approach for explicating social mechanisms. This conclusion is of general significance for system dynamics. However, the over-arching aim of the two-part paper is to increase the understanding of system dynamics in related disciplines. Four suggestions are therefore offered for how the system dynamics method might be extended further into the social sciences. It is argued that, presented in the right way, the formal yet contingent feedback causality thinking of system dynamics should diffuse widely in the social sciences and make a distinctive and important contribution to them. Felix qui potuit rerum cognoscere causas ("Happy is he who comes to know the causes of things"). Virgil, Georgics, Book II, line 490 (29 BCE).
Abstract:
This article is concerned with the liability of search engines for algorithmically produced search suggestions, such as through Google’s ‘autocomplete’ function. Liability in this context may arise when automatically generated associations have an offensive or defamatory meaning, or may even induce infringement of intellectual property rights. The increasing number of cases brought before courts all over the world raises questions on the conflict between the fundamental freedoms of speech and access to information on the one hand, and the personality rights of individuals (under a broader right of informational self-determination) on the other. In the light of the recent judgment of the Court of Justice of the European Union (EU) in Google Spain v AEPD, this article concludes that many requests for removal of suggestions including private individuals’ information will be successful on the basis of EU data protection law, even absent prejudice to the person concerned.
Abstract:
In Andrea Sangiovanni’s words, practice-dependent theorists hold that “[t]he content, scope, and justification of a conception of [a given value] depends on the structure and form of the practices that the conception is intended to govern”. They have tended to present this as methodologically innovative, but here I point to the similarities between the methodological commitments of contemporary practice-dependent theorists and others, particularly P. F. Strawson in his Freedom and Resentment and Bernard Williams in general. I suggest that by looking at what Strawson and Williams did, we can add to the reasons for adopting one form or another of practice-dependence. The internal complexity of the practices we hope our principles will govern may require it. However, this defence of practice-dependence also puts pressure on self-identified practice-dependence theorists, suggesting that they need to do more work to justify the interpretations of the practices their theories rely on.
Abstract:
Background: Serotonin is under-researched in attention deficit hyperactivity disorder (ADHD), despite accumulating evidence for its involvement in impulsiveness and the disorder. Serotonin further modulates temporal discounting (TD), which is typically abnormal in ADHD relative to healthy subjects, underpinned by reduced fronto-striato-limbic activation. This study tested whether a single acute dose of the selective serotonin reuptake inhibitor (SSRI) fluoxetine up-regulates and normalizes reduced fronto-striato-limbic neurofunctional activation in ADHD during TD. Method: Twelve boys with ADHD were scanned twice in a placebo-controlled randomized design under either fluoxetine (between 8 and 15 mg, titrated to weight) or placebo while performing an individually adjusted functional magnetic resonance imaging TD task. Twenty healthy controls were scanned once. Brain activation was compared between patients under either drug condition and controls to test for normalization effects. Results: Repeated-measures whole-brain analysis in patients revealed significant up-regulation with fluoxetine in a large cluster comprising right inferior frontal cortex, insula, premotor cortex and basal ganglia, which correlated at trend level with TD performance; TD performance was impaired relative to controls under placebo but normalized under fluoxetine. Fluoxetine further down-regulated default mode areas of posterior cingulate and precuneus. Comparisons between controls and patients under either drug condition revealed normalization with fluoxetine of right premotor-insular-parietal activation, which was reduced in patients under placebo. Conclusions: The findings show that a serotonin agonist up-regulates activation in typical ADHD dysfunctional areas in right inferior frontal cortex, insula and striatum, as well as down-regulating default mode network regions, in the context of impulsivity and TD.
Abstract:
Photogem® is a hematoporphyrin derivative that has been used as a photosensitizer in experimental and clinical Photodynamic Therapy (PDT) in Brazil. Photosensitizers are degraded under illumination. This process, usually called photobleaching, can be monitored through decreasing fluorescence intensities and includes the following photoprocesses: photodegradation, phototransformation, and photorelocalization. Photobleaching of hematoporphyrin-type sensitizers during illumination in aqueous solution is related not only to photodegradation but is also accompanied by the formation of photoproducts with a new fluorescence band at around 640-650 nm and with increased light absorption in the red spectral region at 640 nm. In this study, the influence of pH on the phototransformation process was investigated. Photogem® solutions, 40 µg/ml, were irradiated at 514 nm with an intensity of 100 mW/cm² for 20 min in different pH environments. Controls were performed with samples kept in the absence of light. Photogem® photodegradation is pH-dependent. The behavior of photodegradation and photoproduct formation (monitored at 640 nm) is distinct and depends on the photosensitizer concentration. The processes of degradation and photoproduct formation were monitored with Photogem® at a concentration of 40 µg/ml, since this gave the best visualization of both processes. Below pH 5 photodegradation occurred, but there was no detectable presence of photoproducts. Increasing the pH increased the rate of photoproduct formation, with photodegradation reaching its highest value at pH 10. The increase in photoproduct formation and the instability of Photogem® from pH 6 to pH 10 are in agreement with the desired properties of an ideal photosensitizer, since there are significant differences in pH between normal (7.0 < pH < 8.6) and tumor (5.8 < pH < 7.9) tissues. It is important to know the effect of pH on the phototransformation process (degradation and photoproduct formation) of the molecule, since low pH values promote an increase in the proportion of aggregated species in solution and high pH values promote an increase in the proportion of monomeric species. There should be an ideal pH interval which favors the phototransformation process, which is correlated with the formation of singlet oxygen, responsible for the photodynamic effect. These differences in pH between normal and tumor cells may explain the presence of photosensitizers in target tumor cells, making PDT a selective therapy.
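(Aside, not from the study: photobleaching followed through decreasing fluorescence is commonly summarized by fitting a first-order decay; the data values and rate constant below are invented for illustration only.)

import numpy as np
from scipy.optimize import curve_fit

def first_order_decay(t, f0, k):
    """Single-exponential photobleaching model: F(t) = F0 * exp(-k t)."""
    return f0 * np.exp(-k * t)

# Hypothetical fluorescence readings during illumination (minutes, a.u.).
t = np.array([0, 2, 5, 10, 15, 20], dtype=float)
fluorescence = np.array([1.00, 0.85, 0.66, 0.44, 0.29, 0.19])

(f0, k), _ = curve_fit(first_order_decay, t, fluorescence, p0=(1.0, 0.1))
print(f"fitted photobleaching rate constant k = {k:.3f} per minute")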
Abstract:
The hydrogenation of benzene and benzene derivatives was studied using Ru(0) nanoparticles prepared by a very simple method based on the in situ reduction of the commercially available precursor ruthenium dioxide under mild conditions (75 °C and 4 atm hydrogen pressure) in imidazolium ionic liquids. Total turnovers (TTO) of 2700 mol/mol Ru were obtained for the conversion of benzene to cyclohexane under solventless conditions, and TTO of 1200 mol/mol Ru were observed under ionic liquid biphasic conditions. When corrected for exposed ruthenium atoms, TTO values of 7940 (solventless) and 3530 (biphasic) were calculated for benzene hydrogenation. These reaction rates are higher than those observed for Ru nanoparticles prepared by decomposition of an organometallic precursor under similar conditions. The presence of the partially hydrogenated product cyclohexene was also detected at low conversions.
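(Aside, not stated in the abstract: the correction for exposed atoms amounts to dividing the total turnover by the metal dispersion D, i.e. the fraction of Ru atoms at the particle surface, so the quoted figures imply a dispersion of roughly one third.)

\mathrm{TTO}_{\mathrm{exposed}} = \frac{\mathrm{TTO}_{\mathrm{total}}}{D}, \qquad D \approx \frac{2700}{7940} \approx \frac{1200}{3530} \approx 0.34.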
Abstract:
Cytochrome P450 (CYP450) is a class of enzymes for which substrate identification is particularly important. It would help medicinal chemists design drugs with fewer side effects arising from drug-drug interactions and from extensive genetic polymorphism. Herein, we discuss the application of 2D and 3D similarity searches in identifying reference structures with a higher capacity to retrieve substrates of three important CYP enzymes (CYP2C9, CYP2D6, and CYP3A4). On the basis of the complementarities of multiple reference structures selected by different similarity search methods, we proposed the fusion of their individual Tanimoto scores into a consensus Tanimoto score (T_consensus). Using this new score, true positive rates of 63% (CYP2C9) and 81% (CYP2D6) were achieved with false positive rates of 4% for the CYP2C9-CYP2D6 data set. Extended similarity searches were carried out on a validation data set, and the results showed that by using the T_consensus score, not only did the area under the ROC curve increase, but also more substrates were recovered at the beginning of a ranked list.
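(Aside, not the paper's exact protocol: a minimal sketch of a consensus Tanimoto score, assuming binary fingerprints and a mean fusion rule; the fingerprint size and fusion choice here are illustrative assumptions.)

import numpy as np

def tanimoto(fp_a, fp_b):
    """Tanimoto similarity between two binary fingerprints (0/1 arrays)."""
    both = np.logical_and(fp_a, fp_b).sum()
    either = np.logical_or(fp_a, fp_b).sum()
    return both / either if either else 0.0

def consensus_tanimoto(candidate, references, fuse=np.mean):
    """Fuse Tanimoto scores against multiple reference structures.

    `fuse` is the fusion rule (mean here; max is another common choice).
    """
    scores = [tanimoto(candidate, ref) for ref in references]
    return fuse(scores)

# Hypothetical 16-bit fingerprints for a candidate and two references.
rng = np.random.default_rng(0)
candidate = rng.integers(0, 2, 16)
references = [rng.integers(0, 2, 16) for _ in range(2)]
print(f"T_consensus = {consensus_tanimoto(candidate, references):.2f}")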
Abstract:
The thesis focuses on, and tries to evaluate, the role that the African Union (AU) plays in protecting peace and security on the African continent. The thesis takes an interdisciplinary approach to the topic, utilizing both international relations and international law theories. The two disciplines are combined in an attempt to understand the evolution of the AU’s commitment to the pragmatist doctrine of the responsibility to protect (R2P). The AU charter is considered to be the first international law document to cover R2P, as it allows the AU to interfere in the internal affairs of its member states. The R2P doctrine evolved around the notion of a need to arrive at a consensus regarding the right to intervene in the face of humanitarian emergencies. Part of the post-Cold War shift in UN behaviour has been to support local solutions to local problems. Hereby the UN acts in collaboration with regional organizations, such as the AU, to achieve the shared aspiration of maintaining international peace and security without getting directly involved on the ground. R2P takes a more holistic and long-term approach to interventions by including an awareness of the need to address the root causes of a crisis in order to prevent future resurrections of conflicts. The doctrine also acknowledges the responsibility of the international community and the intervening parties to participate actively in the rebuilding of the post-conflict state. This requires sustained and well-planned support to ensure the development of a stable society. While the AU is committed to implementing R2P, many of the AU’s members are struggling, both ideologically and practically, to uphold the foundations on which legitimate intervention rests, such as the protection of human rights and good governance. The fact that many members are also among the poorest countries in the world adds to the challenges facing the AU. A lack of human and material resources leads to a situation where few countries are willing, or able, to support a long-term commitment to humanitarian interventions. Bad planning and unclear mandates also limit the effectiveness of the interventions. This leaves the AU strongly dependent on regional powerbrokers such as Nigeria and South Africa, which in itself creates new problems in regard to the motivations behind interventions. The current AU charter does not provide sufficient checks and balances to ensure that national interests are not furthered through humanitarian interventions. The lack of resources within the AU also generates worries over what pressure foreign nations and other international actors apply through donor funding. It is impossible for the principle of “local solutions for local problems” to gain ground while this donor conditionality exists. The future of the AU peace and security regime is not yet settled, since it is still a work in progress. The direction these developments will take depends on a wide variety of factors, many of which are beyond the immediate control of the AU.
Abstract:
The core concepts of the capability approach (CA): In the theoretical framework of CA, well-being is constituted by a person’s unique way of functioning and their capabilities. This means that a person's well-being is personal and involves freedom of choice, which in turn means having a number of options. Although many people may have the same resources, it is important to study how these resources are converted into functionings. Thus, well-being is about the person's freedom to achieve in general and the capabilities to function in particular (Sen, 1995). Strengths of the capability approach: The capability approach is a useful tool for matching objective evaluations with subjective metrics. Furthermore, although one’s individual abilities are in focus, contextual factors and subjective perceptions and experiences are taken into consideration. Critiques of the CA: The capability approach has been criticized for being too individual-centered and for not taking sufficient account of social structures in society. It is difficult to know what a person would choose to do if other options were available. Therefore, operationalizing capabilities involves uncertainties.
Abstract:
This research aimed to identify the main factors that lead technology startups to fail. The study focused on companies located in the Southeast region of Brazil that operated between 2009 and 2014. First, a review of the literature was carried out to build a better understanding of basic concepts of entrepreneurship, as well as of modern techniques for developing entrepreneurship. Furthermore, an analysis of the entrepreneurial scenario in Brazil, with a focus on the Southeast, was also carried out. After this phase, the qualitative study began, in which 24 specialists from startups were interviewed and asked which factors were crucial in leading a technology startup to fail. After analyzing the results, four main factors were identified, and these factors were validated through a quantitative survey. A questionnaire was then formulated based on the answers from the respondents and distributed to founders and executives of startups, both failed and successful. The questionnaire was answered by 56 companies, and their answers were treated with factor analysis to check the validity of the questionnaire. Finally, logistic regression was used to estimate the extent to which the factors led to the startups’ failure. In the end, the results obtained suggest that the most significant factor leading technology startups in southeastern Brazil to fail is problems with the interpersonal relationship between partners or investors.
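(Aside, not the study's data: the final step, relating the identified factors to failure, corresponds to fitting a logistic regression of a binary failed/survived outcome on factor scores; all variable names and values below are invented for illustration.)

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical factor scores for surveyed startups (rows = companies,
# columns = the four factors identified in the qualitative phase).
rng = np.random.default_rng(42)
X = rng.normal(size=(56, 4))
# Hypothetical outcome: 1 = failed, 0 = survived.
y = (X[:, 0] + 0.5 * rng.normal(size=56) > 0).astype(int)

model = LogisticRegression().fit(X, y)
# Coefficients indicate how strongly each factor is associated with failure.
print("factor coefficients:", model.coef_.round(2))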