752 results for Scientists in government
Abstract:
The purpose of this thesis is to investigate whether some positions in democratic theory should be adjusted or abandoned in view of internationalisation; and if adjusted, how. More specifically, it pursues three different aims: to evaluate various attempts to explain levels of democracy as consequences of internationalisation; to investigate whether taking internationalisation into account reveals any reason to reconsider what democracy is or means; and to suggest normative interpretations that cohere with the adjustments of conceptual and explanatory democratic theory made in the course of meeting the other two aims. When empirical methods are used, the scope of the study is restricted to West European parliamentary democracies and their international affairs. More particularly, the focus is on the making of budget policy in Britain, France, and Sweden after the Second World War, and recent budget policy in the European Union. The aspects of democracy empirically analysed are political autonomy, participation, and deliberation. The material considered includes parliamentary debates, official statistics, economic forecasts, election manifestos, shadow budgets, general election turnouts, regulations of budget decision-making, and staff numbers in government and parliament budgetary divisions. The study reaches the following conclusions, among others. (i) The fact that internationalisation increases the divergence between those who make and those who are affected by decisions is not by itself a democratic problem that calls for political reform. (ii) That international organisations may have authorities delegated to them from democratic states is not sufficient to justify them democratically. Democratisation still needs to be undertaken. (iii) The fear that internationalisation dissolves a social trust necessary for political deliberation within nations seems to be unwarranted.
If anything, views argued by others in domestic budgetary debate are taken increasingly seriously during internationalisation. (iv) The major difficulty with deliberation seems to be its inability to transcend national boundaries. International deliberation at state level has not evolved in response to internationalisation, and it remains undeveloped in international institutions. (v) Democratic political autonomy diminishes during internationalisation with regard to income redistribution and policy areas taken over by international organisations, but it seems to increase in public spending. (vi) In the area of budget policy-making there are no signs that governments gain power at the expense of parliaments during internationalisation. (vii) To identify crucial democratic issues in a time of internationalisation and to make room for theoretical virtues like general applicability and normative fruitfulness, democracy may be defined as a kind of politics where as many as possible decide as much as possible.
Abstract:
Master's in Sustainable Management of Fishery Resources
Abstract:
Our generation of computational scientists is living in an exciting time: not only do we get to pioneer important algorithms and computations, we also get to set standards on how computational research should be conducted and published. From Euclid's reasoning and Galileo's experiments, it took hundreds of years for the theoretical and experimental branches of science to develop standards for publication and peer review. Computational science, rightly regarded as the third branch, can walk the same road much faster. The success and credibility of science are anchored in the willingness of scientists to expose their ideas and results to independent testing and replication by other scientists. This requires the complete and open exchange of data, procedures and materials. The idea of "replication by other scientists" in reference to computations is more commonly known as "reproducible research". In this context the journal "EAI Endorsed Transactions on Performance & Modeling, Simulation, Experimentation and Complex Systems" had the exciting and original idea of allowing scientists to submit an article simultaneously with the computational materials (software, data, etc.) used to produce its contents. The goal of this procedure is to allow the scientific community to verify the content of the paper by reproducing it on the platform, independently of the chosen OS, to confirm or invalidate it, and especially to allow its reuse to produce new results. This procedure is of little help, however, without a minimum of methodological support: the raw data sets and the software are difficult to exploit without the logic that guided their use or production. This led us to think that, in addition to the data sets and the software, an additional element must be provided: the workflow that ties them all together.
Abstract:
The advances in computational biology have made simultaneous monitoring of thousands of features possible. High-throughput technologies not only bring about a much richer information context in which to study various aspects of gene function, but they also present the challenge of analyzing data with a large number of covariates and few samples. As an integral part of machine learning, classification of samples into two or more categories is almost always of interest to scientists. In this paper, we address the question of classification in this setting by extending partial least squares (PLS), a popular dimension reduction tool in chemometrics, to the context of generalized linear regression, based on a previous approach, Iteratively ReWeighted Partial Least Squares, i.e. IRWPLS (Marx, 1996). We compare our results with two-stage PLS (Nguyen and Rocke, 2002A; Nguyen and Rocke, 2002B) and other classifiers. We show that by phrasing the problem in a generalized linear model setting and by applying bias correction to the likelihood to avoid (quasi)separation, we often obtain lower classification error rates.
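The PLS dimension reduction at the heart of this approach can be illustrated with a minimal sketch. This is not the paper's IRWPLS procedure, only the classic single-response PLS1 (NIPALS) step applied to a 0/1-coded outcome with a simple threshold classifier; the function names and the simulated "few samples, many covariates" setting are illustrative assumptions.

```python
import numpy as np

def pls1_fit(X, y, n_components=2):
    # Classic PLS1 (NIPALS) for a single response: extract a few latent
    # components from centred data, then map back to regression
    # coefficients in the original covariate space.
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)            # weight vector
        t = Xc @ w                        # component scores
        tt = t @ t
        p = Xc.T @ t / tt                 # X loadings
        qk = (yc @ t) / tt                # y loading
        Xc = Xc - np.outer(t, p)          # deflate X
        yc = yc - qk * t                  # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)   # coefficients for centred X
    return B, x_mean, y_mean

def pls1_classify(X, B, x_mean, y_mean, threshold=0.5):
    # Binary decision: threshold the continuous PLS prediction of a
    # 0/1-coded class label.
    scores = (X - x_mean) @ B + y_mean
    return (scores > threshold).astype(int)
```

With many more covariates than samples (say 100 features, 20 samples), a couple of components usually suffice to capture a strong class signal; in the paper's setting this dimension-reduction step is embedded in an iteratively reweighted GLM fit rather than used directly as above.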
Abstract:
The increasing deployment of mobile communication base stations has led to increasing demand for epidemiological studies on possible health effects of radio frequency emissions. The methodological challenges of such studies have been critically evaluated by a panel of scientists in the fields of radiofrequency engineering/dosimetry and epidemiology. Strengths and weaknesses of previous studies have been identified. Dosimetric concepts and crucial aspects of exposure assessment were evaluated in terms of epidemiological studies on different types of outcomes. We conclude that base station epidemiological studies are in principle feasible. However, the exposure contributions from all relevant radio frequency sources have to be taken into account. The applied exposure assessment method should be piloted and validated. Short- to medium-term effects on physiology or health-related quality of life are best investigated by cohort studies. For long-term effects, groups with a potential for high exposure first need to be identified; for immediate effects, human laboratory studies are the preferred approach.
Abstract:
As environmental problems become more complex, policy and regulatory decisions become far more difficult to make. The use of science has become an important practice in the decision-making process of many federal agencies. Many different types of scientific information are used to make decisions within the EPA, with computer models becoming especially important. Environmental models are used throughout the EPA in a variety of contexts, and their predictive capacity has become highly valued in decision making. The main focus of this research is to examine the EPA's Council for Regulatory Environmental Modeling (CREM) as a case study in addressing science issues, particularly models, in government agencies. Specifically, the goal was to answer the following questions: What is the history of the CREM, and how can this information shed light on the process of science policy implementation? What were the goals of implementing the CREM? Were these goals reached, and how have they changed? What impediments has the CREM faced, and why did these impediments occur? The three main sources of information for this research were observations during summer employment with the CREM, document review, and supplemental interviews with CREM participants and other members of the modeling community. Examining the history of modeling at the EPA, as well as the history of the CREM, provides insight into the many challenges faced when implementing science policy and science policy programs. After examining the many impediments that the CREM has faced in implementing modeling policies, it was clear that they fall into two separate categories: classic and paradoxical. The classic impediments include the more standard impediments to science policy implementation that might be found in any regulatory environment, such as lack of resources and changes in administration.
Paradoxical impediments are cyclical in nature, with no clear solution, such as balancing top-down versus bottom-up initiatives and coping with differing perceptions. When not properly addressed, these impediments severely hinder organizations' ability to implement science policy successfully.
Abstract:
Community educators have long known the value of direct experience in the learning process. Participatory action research extends this philosophy to the realm of research. This article examines the value of involving front-line camp staff, members of a camp community in Appalachia, as practitioner researchers working with university scientists to study the type and conditions of transformative learning in young adult camp staff. A young adult who was a camp community member assisted the researchers with methodology, data analysis, data interpretation, and dissemination of findings. This resulted in a more accurate, richer, and thicker description of the camp community member's transformative learning experience. The benefits of involving practitioner researchers are examined, as well as promising practices for conducting participatory action research in community education environments.
Abstract:
This article examines the issue of climate change in the context of ecocriticism. It analyzes some of the narrative forms employed in the mediation of climate change science, focusing on those used by mediators who are not themselves scientists in the transmission of scientific information to a nonspecialist readership or audience. It reviews four relevant works that combine the communication of scientific theories and facts with pedagogical and motivational impulses. These include Davis Guggenheim's documentary film An Inconvenient Truth, Fred Pearce's book The Last Generation: How Nature Will Take Her Revenge for Climate Change, and the climate change manuals The Live Earth Global Warming Survival Handbook and How to Save the Climate.
Abstract:
This is the first coherent description of all levels of communication of ciliates. Ciliates are highly sensitive organisms that actively compete for environmental resources. They assess their surroundings, estimate how much energy they need for particular goals, and then realise the optimum variant. They take measures to control certain environmental resources. They perceive themselves and can distinguish between 'self' and 'non-self'. They process and evaluate information and then modify their behaviour accordingly. These highly diverse competences show us that this is possible owing to sign(aling)-mediated communication processes within ciliates (intra-organismic), between the same, related and different ciliate species (inter-organismic), and between ciliates and non-ciliate organisms (trans-organismic). This is crucial in coordinating growth and development, shape and dynamics. This book further serves as a learning tool for research aspects of biocommunication in ciliates. It will guide scientists in further investigations of ciliate behavior and of how ciliates mediate signaling processes between themselves and the environment.
Abstract:
The premise of this study is that changes in the agency's organizational structure reflect changes in government public health policy. Based on this premise, this study tracks the changes in the organizational structure and the overall expansion of the Texas Department of Health to understand the evolution of changing public health priorities in state policy from September 1, 1946 through June 30, 1994, a period of growth and new responsibilities. It includes thirty-seven observations of organizational structure as depicted by organizational charts of the agency and/or adapted from public documents. The major questions answered are: what are the changes in the organizational structure, why did they occur, and what policy priorities are reflected in these changes within and across the various time periods. The analysis of the study included a thorough review of the organizational structure of the agency for the time-span of the study, the formulation of the criteria used in ascertaining the changes, the delineation of the changes in the organizational structure and the sequential comparison of the observations to characterize the change, the discovery of reasons for the structural changes (financial, statutory (federal and state), social and political factors), and the determination of policy priorities for each time period and their relation to the expansion and evolution of the agency. The premise that the organizational structure of the agency and its changes over time reflect government public health policy and agency expansion was found to be true.
Abstract:
The most important tool in Germany's polar research program is the research and supply vessel Polarstern. The ship was commissioned in 1982, and the maiden voyage started at the end of 1982. The owner of the ship is the Alfred Wegener Institute for Polar and Marine Research in Bremerhaven, Germany. Within the last 25 years Polarstern has performed a total of 44 expeditions to the Arctic and Antarctic. The ship is well equipped for meteorological research as well as for routine meteorological services. The meteorological office is permanently manned by a weather technician/observer from the German Weather Service (DWD), who performs the routine 3-hourly synoptic observations and the daily upper-air soundings. Additionally, a weather forecaster is responsible for advising the ship's captain, the helicopter pilots, and all scientists in any weather-related question. The forecaster is assisted by the weather technician, who handles satellite picture reception and manages the near-real-time data flow.
Abstract:
Mobile and wireless communications systems have become an important part of our everyday lives. These ubiquitous technologies have a profound effect on how we live. People predict a bright future for wireless technologies, but it would not be possible without the hard work of thousands of scientists in the wireless innovation research arena. My Marie Curie project is investigating enabling technologies for future mobile and wireless communications systems.
Abstract:
The algorithms and graphical user interface software package "OPT-PROx" were developed to meet food engineering needs related to canned food thermal processing simulation and optimization. The adaptive random search algorithm and its modification coupled with a penalty-function approach, and the finite difference methods with cubic spline approximation, are utilized by the "OPT-PROx" package (http://tomakechoice.com/optprox/index.html). The diversity of thermal food processing optimization problems with different objectives and required constraints is solvable by the developed software. The geometries supported by "OPT-PROx" are the following: (1) cylinder, (2) rectangle, (3) sphere. The mean square error minimization principle is utilized in order to estimate the heat transfer coefficient of food to be heated under optimal conditions. The user-friendly dialogue and the numerical procedures used make the "OPT-PROx" software useful to food scientists in research and education, as well as to engineers involved in the optimization of thermal food processing.
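The adaptive random search with a penalty-function approach mentioned above can be sketched in a few lines. This is a generic illustration, not the OPT-PROx implementation: the step-adaptation constants, the penalty weight, and the example objective are all assumptions.

```python
import numpy as np

def adaptive_random_search(f, constraints, x0, bounds,
                           iters=2000, mu=1e3, seed=0):
    # Minimise f subject to g(x) <= 0 for each g in `constraints`,
    # using a quadratic penalty and a step size that widens after
    # accepted moves and contracts after rejected ones.
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T

    def penalised(x):
        pen = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return f(x) + mu * pen

    x = np.clip(np.array(x0, float), lo, hi)
    fx = penalised(x)
    step = (hi - lo) / 4.0                # initial search radius
    for _ in range(iters):
        cand = np.clip(x + rng.normal(scale=step), lo, hi)
        fc = penalised(cand)
        if fc < fx:                       # accept: widen the search
            x, fx = cand, fc
            step = np.minimum(step * 1.1, (hi - lo) / 2)
        else:                             # reject: contract the search
            step = np.maximum(step * 0.95, (hi - lo) * 1e-4)
    return x, fx
```

For instance, minimising (x1 - 1)^2 + (x2 - 2)^2 subject to x1 + x2 <= 2.5 drives the search toward the constrained optimum near (0.75, 1.75); in OPT-PROx the objective would instead come from the finite-difference heat-transfer simulation.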
Abstract:
On 22 February 1996, space mission STS-75 started from the NASA facilities at Cape Canaveral. The mission consisted of launching the shuttle Columbia to carry out two experiments in space: the TSS-1R (Tethered Satellite System 1 Reflight) and the USMP (United States Microgravity Payload). The TSS-1R is a replica of the similar 1992 mission TSS-1. The TSS space programme is a bilateral scientific cooperation between the US space agency NASA (National Aeronautics and Space Administration) and ASI (the Italian Space Agency). The TSS-1R system consists of the shuttle Columbia, which deploys upward, by means of a conducting tether 20 km long, a spherical satellite (1.5 m diameter) containing scientific instrumentation. This system, orbiting at about 300 km above the Earth's surface, presently represents the largest experimental space structure. Due to its dimensions and to the flexibility and conducting properties of the tether, the system interacts in a quite complex manner with the Earth's magnetic field and the ionospheric plasma, such that the total system behaves as an electromagnetic radiating antenna as well as an electric power generator. Twelve scientific experiments have been assessed by US and Italian scientists in order to study the electrodynamic behaviour of the structure orbiting in the ionosphere. Two experiments have been prepared in an attempt to receive on the Earth's surface possible electromagnetic events radiated by the TSS-1R. The project EMET (Electro Magnetic Emissions from Tether), USA, and the project OESEE (Observations on the Earth Surface of Electromagnetic Emissions), Italy, consist of a coordinated programme of passive detection of such possible EM emissions.
This detection will supply the verification of some theoretical hypotheses on the electrodynamic interactions between the orbiting system, the Earth's magnetic field and the ionospheric plasma, with two principal aims: the technological assessment of the system concept as well as a deeper knowledge of ionosphere properties for future space applications. A theoretical model that captures the peculiarities of tether emissions is being developed for signal prediction at constant tether current. As a step prior to the calculation of the expected ground signal, the Alfven-wave signature left by the tether far back in the ionosphere has been determined. The scientific expectations from the combined effort to measure the magnitude of those perturbations will be outlined, taking into account the ground track sensor systems used.
Abstract:
Recommender systems in e-learning have proved to be powerful tools for finding suitable educational material during the learning experience. But traditional user request-response patterns are still being used to generate these recommendations. By including contextual information derived from the use of ubiquitous learning environments, the possibility of incorporating proactivity into the recommendation process has arisen. In this paper we describe methods to push proactive recommendations to e-learning system users when the situation is appropriate, without requiring their explicit request. As a result, interesting learning objects can be recommended according to the user's needs in every situation. The impact of the proactive recommendations generated has been evaluated among teachers and scientists in a real e-learning social network called Virtual Science Hub, related to the GLOBAL excursion European project. Outcomes indicate that the proposed methods are valid for generating this kind of recommendation in e-learning scenarios. The results also show that users' perceived appropriateness of having proactive recommendations is high.
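The core idea of pushing recommendations only when the situation is appropriate can be sketched with a minimal context gate. The `Context` fields, the catalogue, and the trigger rule below are all hypothetical placeholders; the paper's actual methods operate on richer contextual signals from the ubiquitous learning environment.

```python
from dataclasses import dataclass, field

@dataclass
class Context:
    activity: str                 # e.g. "idle", "in_lecture", "browsing"
    device: str                   # e.g. "desktop", "mobile"
    interests: list = field(default_factory=list)

# Hypothetical catalogue of learning objects tagged by topic.
CATALOGUE = [
    {"title": "Intro to Genomics", "topic": "biology"},
    {"title": "Graph Algorithms", "topic": "cs"},
    {"title": "Bayesian Basics", "topic": "statistics"},
]

def should_push(ctx):
    # Proactivity gate: only interrupt the user when the situation
    # is judged appropriate (here, an idle desktop session).
    return ctx.activity == "idle" and ctx.device == "desktop"

def proactive_recommend(ctx, top_n=2):
    # Push recommendations without an explicit user request,
    # filtered by the user's contextual interests.
    if not should_push(ctx):
        return []
    hits = [o for o in CATALOGUE if o["topic"] in ctx.interests]
    return hits[:top_n]
```

The separation between the gate (`should_push`) and the ranking step mirrors the request-free pattern described above: the system decides both *when* to recommend and *what* to recommend from context alone.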