853 results for Truth and Reconciliation Commissions


Relevance:

30.00%

Publisher:

Abstract:

This manual presents geographic information by state of occurrence, along with descriptions of the socio-economic impact created by the invasion of non-indigenous and native transplanted animal species in the Laurentian Great Lakes and the coastal waters of the United States. It is not a comprehensive literature review, but rather is intended as a primer for those unfamiliar with the socio-economic impacts of invasive aquatic and marine animals. Readers should also note that the information contained in this manual is current as of its publication date; new information and new species are routinely being added to the wider literature base. Most of the information was gathered from a number of web sites maintained by government agencies, commissions, academic institutions, and museums. Additional information was taken from the primary and secondary literature. This manual focuses on the socio-economic consequences of invasive species; thus, ecological impacts, when noted in the literature, are not discussed unless a connection to socio-economic factors can be made. For a majority of the species listed, either the impact of their invasion is not understood, or it is not published in the sources surveyed. In the species summaries, sources of information are cited except for information from the U.S. Geological Survey’s (USGS) Nonindigenous Aquatic Species Database (http://nas.er.usgs.gov). This website provided the base information used in creating the tables on geographic distribution and in many of the species summaries. Thus, whenever information is given without a specific author/source and date citation, it has come from this comprehensive source. (PDF contains 90 pages)

Relevance:

30.00%

Publisher:

Abstract:

Habermas, a leading critical theorist, grounds his argument in the unequal distribution of wealth across society. He states that in an advanced capitalist society, the possibility of a crisis has shifted from the economic and political spheres to the legitimation system. Legitimation crises increase the more government intervenes in the economy (market), alongside the "simultaneous political enfranchisement of almost the entire adult population" (Holub, 1991, p. 88). The reason for this increase is that policymakers in advanced capitalist democracies are caught between conflicting imperatives: they are expected to serve the interests of their nation as a whole, but they must prop up an economic system that benefits the wealthy at the expense of most workers and the environment. Habermas argues that the driving force in history is an expectation, built into the nature of language, that norms, laws, and institutions will serve the interests of the entire population and not just those of a special group. In his view, policymakers in capitalist societies must fend off this expectation by simultaneously correcting some of the inequities of the market, denying that they have control over people's economic circumstances, and defending the market as an equitable allocator of income (deHaven-Smith, 1988, p. 14). Critical theory suggests that this contradiction will be reflected in Everglades policy through communicative narratives that suppress and conceal tensions between environmental and economic priorities. Habermas's Legitimation Crisis states that political actors use various symbols, ideologies, narratives, and language to engage the public and avoid a legitimation crisis. These influences not only manipulate the general population into desiring what has been manufactured for them, but also leave them feeling unfulfilled and alienated. In what is also known as false reconciliation, the public's view of society as rational and "conducive to human freedom and happiness" is altered to become deeply irrational and an obstacle to the desired freedom and happiness (Finlayson, 2005, p. 5). These obstacles and irrationalities give rise to potential crises in the society. Government's increasing involvement in the Everglades under advanced capitalism leads to Habermas's four crises: economic/environmental, rationality, legitimation, and motivation. These crises occur simultaneously, work in conjunction with each other, and arise when a principle of organization is challenged by increased production needs (deHaven-Smith, 1988). Habermas states that governments use narratives in an attempt to rationalize, legitimize, obscure, and conceal their actions under advanced capitalism. Although many narratives have been told throughout the history of the Everglades (such as that the Everglades was a wilderness valued as a wasteland in its natural state), the most recent narrative, "Everglades Restoration", is the focus of this paper. (PDF contains 4 pages)

Relevance:

30.00%

Publisher:

Abstract:

This thesis explores the problem of mobile robot navigation in dense human crowds. We begin by considering a fundamental impediment to classical motion planning algorithms called the freezing robot problem: once the environment surpasses a certain level of complexity, the planner decides that all forward paths are unsafe, and the robot freezes in place (or performs unnecessary maneuvers) to avoid collisions. Since a feasible path typically exists, this behavior is suboptimal. Existing approaches have focused on reducing predictive uncertainty by employing higher fidelity individual dynamics models or heuristically limiting the individual predictive covariance to prevent overcautious navigation. We demonstrate that both the individual prediction and the individual predictive uncertainty have little to do with this undesirable navigation behavior. Additionally, we provide evidence that dynamic agents are able to navigate in dense crowds by engaging in joint collision avoidance, cooperatively making room to create feasible trajectories. We accordingly develop interacting Gaussian processes, a prediction density that captures cooperative collision avoidance, and a "multiple goal" extension that models the goal driven nature of human decision making. Navigation naturally emerges as a statistic of this distribution.
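As an illustration of the idea, the following is a minimal numpy sketch of interacting-Gaussian-process-style navigation: candidate futures are sampled for the robot and each pedestrian, joint samples are reweighted by a cooperative collision-avoidance potential, and the robot's plan is read off the highest-weight joint sample. The constant-velocity Gaussian predictor, the form of the potential, and all parameter values are simplifications chosen for this sketch, not the thesis's actual model.

import numpy as np

rng = np.random.default_rng(0)

def sample_paths(pos, vel, horizon=8, n_samples=200, noise=0.15):
    # Stand-in for a GP posterior over an agent's future path: the mean is a
    # constant-velocity extrapolation and the spread grows with the horizon.
    # Returns an array of shape (n_samples, horizon, 2).
    steps = np.arange(1, horizon + 1)[None, :, None]            # (1, T, 1)
    mean = pos + steps * vel                                    # (1, T, 2)
    spread = noise * np.sqrt(steps)
    return mean + spread * rng.standard_normal((n_samples, horizon, 2))

def interaction_weight(robot_path, ped_paths, h=0.4, alpha=0.99):
    # Cooperative collision-avoidance potential: joint samples in which the
    # robot and a pedestrian pass close together get exponentially low weight.
    w = 1.0
    for ped in ped_paths:
        d2 = np.sum((robot_path - ped) ** 2, axis=-1)           # squared gap per step
        w *= np.prod(1.0 - alpha * np.exp(-d2 / (2.0 * h ** 2)))
    return w

# Current position (m) and velocity (m per step) of the robot and two pedestrians.
robot = sample_paths(np.array([0.0, 0.0]), np.array([0.5, 0.0]))
peds = [sample_paths(np.array([4.0, 0.3]), np.array([-0.5, 0.0])),
        sample_paths(np.array([4.0, -0.3]), np.array([-0.5, 0.0]))]

# "Navigation emerges as a statistic of the distribution": here we simply take
# the robot path from the highest-weight joint sample.
weights = np.array([interaction_weight(robot[i], [p[i] for p in peds])
                    for i in range(robot.shape[0])])
plan = robot[np.argmax(weights)]
print("first planned waypoint:", plan[0])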

Most importantly, we empirically validate our models in the Chandler dining hall at Caltech during peak hours and, in the process, carry out the first extensive quantitative study of robot navigation in dense human crowds (collecting data on 488 runs). The multiple goal interacting Gaussian processes algorithm performs comparably with human teleoperators at crowd densities nearing 1 person/m², while a state-of-the-art noncooperative planner exhibits unsafe behavior more than 3 times as often as the multiple goal extension, and twice as often as the basic interacting Gaussian process approach. Furthermore, a reactive planner based on the widely used dynamic window approach proves insufficient for crowd densities above 0.55 people/m². For inclusive validation purposes, we also show that either our noncooperative planner or our reactive planner captures the salient characteristics of nearly any existing dynamic navigation algorithm. Based on these experimental results and theoretical observations, we conclude that a cooperation model is critical for safe and efficient robot navigation in dense human crowds.

Finally, we produce a large database of ground-truth pedestrian crowd data. We make this database publicly available for further scientific study of crowd prediction models, learning-from-demonstration algorithms, and human-robot interaction models in general.

Relevance:

30.00%

Publisher:

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and the more ambitiously we extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioral economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioral theories against evidence from lab and field experiments.

We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioral theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn informs the next most informative test to run. BROAD uses the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that these popular criteria can, surprisingly, perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
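To make the adaptive loop concrete, here is a minimal Python sketch of noiseless EC2-style test selection over a toy hypothesis space: hypotheses (theory-parameter pairs) carry priors and deterministic predictions, edges connect hypotheses from different theories with weight equal to the product of their priors, and the greedy rule picks the test with the largest expected weight of edges cut. The hypothesis space, prediction table, and all numbers are invented for illustration; the actual BROAD implementation handles noisy responses and much larger design spaces.

import numpy as np
from itertools import combinations

# Toy hypothesis space: each hypothesis is a (theory, parameterization) with a
# prior and a deterministic 0/1 prediction for every candidate test.
rng = np.random.default_rng(1)
n_hyp, n_tests = 12, 30
theory_of = rng.integers(0, 3, size=n_hyp)            # three competing theories
prior = np.full(n_hyp, 1.0 / n_hyp)
predict = rng.integers(0, 2, size=(n_hyp, n_tests))   # hypothesis x test -> choice

def ec2_score(alive, test):
    # Expected weight of edges cut by running `test` (noiseless EC2). Edges join
    # surviving hypotheses from *different* theories, weighted by the product of
    # their priors; an outcome cuts an edge if it rules out either endpoint.
    p_alive = prior[alive].sum()
    score = 0.0
    for outcome in (0, 1):
        p_outcome = prior[alive][predict[alive, test] == outcome].sum() / p_alive
        cut = sum(prior[h] * prior[g]
                  for h, g in combinations(alive, 2)
                  if theory_of[h] != theory_of[g]
                  and (predict[h, test] != outcome or predict[g, test] != outcome))
        score += p_outcome * cut
    return score

def adaptive_loop(true_h, budget=6):
    # Greedy adaptive design: pick the highest-scoring test, observe the
    # (simulated) choice, prune inconsistent hypotheses, repeat.
    alive, asked = list(range(n_hyp)), []
    for _ in range(budget):
        test = max((t for t in range(n_tests) if t not in asked),
                   key=lambda t: ec2_score(alive, t))
        outcome = predict[true_h, test]
        alive = [h for h in alive if predict[h, test] == outcome]
        asked.append(test)
        if len({theory_of[h] for h in alive}) == 1:    # a single theory remains
            break
    return asked, sorted({int(theory_of[h]) for h in alive})

print(adaptive_loop(true_h=5))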

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA), and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favor of the CRRA and moments models. Classifying the subjects into types shows that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e., mask their true preferences and choose differently in order to obtain more favorable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out because it is infeasible in practice and because we find no signatures of it in our data.

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, "present bias" models (quasi-hyperbolic (α, β) discounting and fixed-cost discounting), and generalized-hyperbolic discounting. Forty subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for the present bias models and hyperbolic discounting; most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.
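For reference, the competing discount functions have simple closed forms; the snippet below evaluates textbook versions of them at a few delays. The parameter values are illustrative, not those estimated in the thesis, and the (α, β) parameterization used there may differ from the standard forms written here.

import numpy as np

t = np.array([0, 1, 7, 30, 180])            # delays in days (illustrative)
delta, k, beta = 0.99, 0.05, 0.8            # illustrative parameter values
a, b = 0.1, 0.6                             # generalized-hyperbolic parameters

exponential = delta ** t                                       # D(t) = delta^t
hyperbolic = 1.0 / (1.0 + k * t)                               # D(t) = 1 / (1 + k t)
quasi_hyperbolic = np.where(t == 0, 1.0, beta * delta ** t)    # "present bias"
generalized_hyperbolic = (1.0 + a * t) ** (-b / a)             # D(t) = (1 + a t)^(-b/a)
# (Fixed-cost discounting is omitted: it penalizes any delayed amount by a fixed
# cost rather than scaling it by a pure discount factor.)

for name, d in [("exponential", exponential),
                ("hyperbolic", hyperbolic),
                ("quasi-hyperbolic", quasi_hyperbolic),
                ("generalized hyperbolic", generalized_hyperbolic)]:
    print(f"{name:>22}: " + "  ".join(f"{x:.3f}" for x in d))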

In these models the passage of time is linear. We instead consider a psychological model where the perception of time is subjective. We prove that when the biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.

We also test the predictions of behavioral theories in the "wild". We pay attention to prospect theory, which emerged as the dominant theory in our lab experiments of risky choice. Loss aversion and reference dependence predict that consumers will behave in a distinctly different way than the standard rational model predicts. Specifically, loss aversion predicts that when an item is being offered at a discount, the demand for it will be greater than that explained by its price elasticity. Even more importantly, when the item is no longer discounted, demand for its close substitute will increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications that consumer loss aversion entails, and strategies for competitive pricing.
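The following toy discrete-choice calculation illustrates the mechanism: with a reference-dependent, loss-averse price utility inside a logit model, a promotion on one item pulls its reference price down, so that when the regular price returns the item is coded as a loss and choice share moves excessively toward its substitute. Utilities, parameters, and prices are invented for illustration; the model estimated on retailer data in the thesis is, of course, richer.

import numpy as np

def logit_shares(utilities):
    # Multinomial logit choice probabilities.
    e = np.exp(utilities - utilities.max())
    return e / e.sum()

def utility(price, ref_price, base, beta=1.0, eta=0.5, lam=2.25):
    # Reference-dependent price utility: gains/losses are measured against the
    # reference price, and lam > 1 makes losses loom larger than gains.
    gain = max(ref_price - price, 0.0)
    loss = max(price - ref_price, 0.0)
    return base - beta * price + eta * gain - eta * lam * loss

regular = 10.0                       # regular price of both substitutes A and B

# During a promotion on A, demand for A is boosted beyond the pure price effect...
during = logit_shares(np.array([utility(8.0, ref_price=regular, base=1.0),       # A on sale
                                utility(regular, ref_price=regular, base=1.0)]))  # B

# ...and once the promotion ends, the reference for A has adapted to the sale
# price, so the regular price is coded as a loss and share shifts to B.
after = logit_shares(np.array([utility(regular, ref_price=8.0, base=1.0),         # A back at 10
                               utility(regular, ref_price=regular, base=1.0)]))   # B

print("shares during promotion:", during)
print("shares after promotion :", after)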

In future work, BROAD could be widely applied to test different behavioral models, e.g., in social preferences and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to more rapidly eliminate hypotheses and speed up model comparison. Discrete choice models also provide a framework for testing behavioral models with field data and encourage combined lab-field experiments.

Relevance:

30.00%

Publisher:

Abstract:

Based on the two-step modified signed-digit (MSD) algorithm, we present a one-step algorithm for the parallel addition and subtraction of two MSD numbers. The algorithm is obtained by classifying the three neighboring digit pairs into 10 groups and then making a decision based on the group. It requires only a single look-up truth table and can be further formulated as eight computation rules. A joint spatial encoding technique is developed to represent both the input data and the computation rules. Furthermore, an optical correlation architecture is suggested to implement the MSD adder in parallel. An experimental demonstration is also given. (C) 1996 Society of Photo-Optical Instrumentation Engineers.
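For readers unfamiliar with MSD arithmetic, the sketch below implements in Python the underlying two-step, carry-free MSD addition on which the one-step optical algorithm is based (subtraction follows by negating each digit of the subtrahend). It is a software illustration of the digit-level rules only, not of the joint spatial encoding or the optical correlator.

def msd_add(a, b):
    # Carry-free two-step addition of modified signed-digit (MSD) numbers.
    # a, b: lists of digits in {-1, 0, 1}, least-significant digit first.
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))
    b = b + [0] * (n - len(b))
    w = [0] * n          # interim weight digits
    t = [0] * (n + 1)    # transfer digits; t[i + 1] is generated at position i
    for i in range(n):
        s = a[i] + b[i]
        lo_a = a[i - 1] if i > 0 else 0     # next lower-order digit pair
        lo_b = b[i - 1] if i > 0 else 0
        if s == 2:
            t[i + 1], w[i] = 1, 0
        elif s == -2:
            t[i + 1], w[i] = -1, 0
        elif s == 1:
            # If the lower pair may send up a +1 transfer, absorb it here.
            t[i + 1], w[i] = (1, -1) if (lo_a >= 0 and lo_b >= 0) else (0, 1)
        elif s == -1:
            # Symmetric rule for a possible -1 transfer from below.
            t[i + 1], w[i] = (-1, 1) if (lo_a <= 0 and lo_b <= 0) else (0, -1)
    # Second step: digit-wise sum, guaranteed to stay within {-1, 0, 1}.
    return [w[i] + t[i] for i in range(n)] + [t[n]]

def msd_subtract(a, b):
    # Subtraction is addition of the digit-wise negated subtrahend.
    return msd_add(a, [-d for d in b])

def msd_value(digits):
    # Decode an LSB-first MSD digit list to an integer.
    return sum(d * (1 << i) for i, d in enumerate(digits))

# Example: 5 + 3 and 5 - 3 in MSD form (least-significant digit first).
five, three = [1, 0, 1], [1, 1]
print(msd_value(msd_add(five, three)), msd_value(msd_subtract(five, three)))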

Relevance:

30.00%

Publisher:

Abstract:

Optical Coherence Tomography (OCT) is a popular, rapidly growing imaging technique with an increasing number of biomedical applications due to its noninvasive nature. However, there are three major challenges in understanding and improving an OCT system. (1) Obtaining an OCT image is not easy: it either takes a real medical experiment or requires days of computer simulation. Without much data, it is difficult to study the physical processes underlying OCT imaging of different objects, simply because there are not many imaged objects. (2) Interpretation of an OCT image is also hard. This challenge is more profound than it appears. For instance, it would require a trained expert to tell from an OCT image of human skin whether there is a lesion or not. This is expensive in its own right, but even the expert cannot be sure about the exact size of the lesion or the width of the various skin layers. The take-away message is that analyzing an OCT image even at a high level usually requires a trained expert, and pixel-level interpretation is simply unrealistic. The reason is simple: we have OCT images but not their underlying ground-truth structure, so there is nothing to learn from. (3) The imaging depth of OCT is very limited (millimeter or sub-millimeter in human tissues). While OCT uses infrared light for illumination to stay noninvasive, the downside is that photons at such long wavelengths can only penetrate a limited depth into the tissue before being back-scattered. To image a particular region of a tissue, photons first need to reach that region. As a result, OCT signals from deeper regions of the tissue are both weak (since few photons reach that deep) and distorted (due to multiple scattering of the contributing photons). This fact alone makes OCT images very hard to interpret.

This thesis addresses the above challenges by developing an advanced Monte Carlo simulation platform that is 10000 times faster than the state-of-the-art simulator in the literature, bringing the simulation time down from 360 hours to a single minute. This powerful simulation tool not only enables us to efficiently generate as many OCT images of objects of arbitrary structure and shape as we want on a common desktop computer, but also provides the underlying ground truth of the simulated images at the same time, because we dictate that structure at the beginning of the simulation. This is one of the key contributions of this thesis. Building such a powerful simulation tool relies on a thorough understanding of the signal formation process, a careful implementation of the importance sampling/photon splitting procedure, efficient use of a voxel-based mesh system for determining photon-mesh interception, and parallel computation of the different A-scans that constitute a full OCT image, among other programming and mathematical tricks, which are explained in detail later in the thesis.

Next we turn to the inverse problem: given an OCT image, predict/reconstruct its ground-truth structure at the pixel level. By solving this problem we would be able to interpret an OCT image completely and precisely without help from a trained expert. It turns out that we can do this remarkably well: for simple structures we are able to reconstruct the ground truth of an OCT image more than 98% correctly, and for more complicated structures (e.g., a multi-layered brain structure) we reach 93%. We achieve this through extensive use of machine learning. The success of the Monte Carlo simulation already puts us in a strong position by providing a great deal of data (effectively unlimited) in the form of (image, truth) pairs. Through a transformation of the high-dimensional response variable, we convert the learning task into a multi-output multi-class classification problem and a multi-output regression problem. We then build a hierarchical architecture of machine learning models (a committee of experts) and train different parts of the architecture with specifically designed data sets. In prediction, an unseen OCT image first goes through a classification model to determine its structure (e.g., the number and types of layers present in the image); the image is then handed to a regression model, trained specifically for that particular structure, to predict the length of the different layers and thereby reconstruct the ground truth of the image. We also demonstrate that ideas from Deep Learning can be useful for further improving the performance.
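A minimal sketch of the classify-then-regress hierarchy is given below, using random forests on toy simulated A-scans: a classifier first predicts the structure (here, just the number of layers), and a structure-specific multi-output regressor then predicts the layer parameters. The toy signal generator, model choices, and all parameters are stand-ins; the thesis trains its models on data produced by the Monte Carlo simulator described above.

import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(3)

def simulate_ascan(thicknesses, n_pix=64, noise=0.05):
    # Toy stand-in for a simulated A-scan: each layer contributes a band of
    # roughly constant reflectivity. Real training data would come from the
    # Monte Carlo simulator, together with its known ground truth.
    signal, edge = np.zeros(n_pix), 0
    for i, t in enumerate(thicknesses):
        nxt = min(n_pix, edge + int(t * n_pix))
        signal[edge:nxt] = 0.3 + 0.2 * i
        edge = nxt
    return signal + noise * rng.standard_normal(n_pix)

# Generate (image, truth) pairs for structures with 2 or 3 layers.
X, n_layers, truths = [], [], {2: [], 3: []}
for _ in range(2000):
    k = int(rng.integers(2, 4))
    t = rng.dirichlet(np.ones(k)) * 0.9
    X.append(simulate_ascan(t)); n_layers.append(k); truths[k].append(t)
X, n_layers = np.array(X), np.array(n_layers)

# Stage 1: classify the structure (here, simply the number of layers).
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, n_layers)

# Stage 2: one multi-output regressor per structure, trained on matching examples.
regs = {k: RandomForestRegressor(n_estimators=100, random_state=0)
            .fit(X[n_layers == k], np.array(truths[k])) for k in (2, 3)}

# Prediction: route an unseen A-scan through the classifier, then hand it to the
# regressor trained for that structure to recover the layer thicknesses.
x_new = simulate_ascan([0.4, 0.3, 0.2])[None, :]
k_hat = int(clf.predict(x_new)[0])
print(k_hat, regs[k_hat].predict(x_new)[0])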

It is worth pointing out that solving the inverse problem automatically improves the effective imaging depth, since the lower half of an OCT image (i.e., greater depth), which previously could hardly be seen, now becomes fully resolved. Interestingly, although the OCT signals making up the lower half of the image are weak, messy, and uninterpretable to human eyes, they still carry enough information that a well-trained machine learning model, when fed those signals, recovers precisely the true structure of the object being imaged. This is just another case in which Artificial Intelligence (AI) outperforms humans. To the best of the author's knowledge, this thesis is not only a success but also the first attempt to reconstruct OCT images at the pixel level. Even attempting this kind of task would require fully annotated OCT images, and a lot of them (hundreds or even thousands). This is clearly impossible without a powerful simulation tool like the one developed in this thesis.

Relevance:

30.00%

Publisher:

Abstract:

The 9th International Test Commission (ITC) Conference took place at the Miramar Palace in San Sebastian, Spain, between the 2nd and 5th of July, 2014. The Conference was titled "Global and Local Challenges for Best Practices in Assessment." The International Test Commission (www.intestcom.org) is an association of national psychological associations, test commissions, publishers, and other organizations, as well as individuals, committed to the promotion of effective testing and assessment policies and to the proper development, evaluation, and use of educational and psychological instruments. The ITC facilitates the exchange of information among members and stimulates their cooperation on problems related to the construction, distribution, and use of psychological and educational tests and other psychodiagnostic tools. This volume contains the abstracts of the contributions presented at the 9th International Test Commission Conference. The four themes of the Conference were closely linked to the goals of the ITC:
- Challenges and Opportunities in International Assessment.
- Application of New Technologies and New Psychometric Models in Testing.
- Standards and Guidelines for Best Testing Practices.
- Testing in Multilingual and Multicultural Contexts.

Relevance:

30.00%

Publisher:

Abstract:

From 1947 to 1973, the U.S.S.R. conducted a huge campaign of illegal whaling worldwide. We review Soviet catches of humpback whales, Megaptera novaeangliae, in the Southern Ocean during this period, with an emphasis on the International Whaling Commission’s Antarctic Management Areas IV, V, and VI (the principal regions of illegal Soviet whaling on this species, south of Australia and western Oceania). Where possible, we summarize legal and illegal Soviet catches by year, Management Area, and factory fleet, and also include information on takes by other nations. Soviet humpback catches between 1947 and 1973 totaled 48,702, broken down as follows: 649 (Area I), 1,412 (Area II), 921 (Area III), 8,779 (Area IV), 22,569 (Area V), and 7,195 (Area VI), with 7,177 catches not currently assignable to area. In all, at least 72,542 humpback whales were killed by all operations (Soviet plus other nations) after World War II in Areas IV (27,201), V (38,146), and VI (7,195). More than one-third of these (25,474 whales, of which 25,192 came from Areas V and VI) were taken in just two seasons, 1959–60 and 1960–61. The impact of these takes, and of those from Area IV in the late 1950’s, is evident in the sometimes dramatic declines in catches at shore stations in Australia, New Zealand, and Norfolk Island. When compared to recent estimates of abundance and initial population size, the large removals from Areas IV and V indicate that the populations in these regions remain well below pre-exploitation levels despite reported strong growth rates off eastern and western Australia. Populations in many areas of Oceania continue to be small, indicating that the catches from Area VI and eastern Area V had long-term impacts on recovery.

Relevance:

30.00%

Publisher:

Abstract:

In November 1993, Professor Alexei Yablokov, who at the time was the Science Advisor to Russian President Boris Yeltsin, stood on a podium in Galveston, Tex., and delivered a speech to the Society for Marine Mammalogy’s biennial conference, the premier international event in the field of marine mammal science. Addressing the 1,500 scientists present, he made what amounted to a national confession: that, beginning in 1948, the U.S.S.R. had begun a huge campaign of illegal whaling. Despite being a signatory to the International Convention for the Regulation of Whaling (signed in Washington, D.C., just 2 years before, in 1946), the Soviets set out to pillage the world’s oceans.

Relevance:

30.00%

Publisher:

Abstract:

I have always condemned (and to do anything more was not within our power or abilities) the illegal and sometimes destructive whaling by the Soviet Union. This opinion was expressed in numerous documents, including reports and records of presentations at scientific and other meetings; these documents are the witnesses to this condemnation. However, none of these documents ever saw the light of day: all of them were marked with the sinister stamp “secret.” When necessary in this memoir, my opinion of the whaling will be supported by data drawn from these documents.

Relevance:

30.00%

Publisher:

Abstract:

John Otterbein Snyder (1867–1943) was an early student of David Starr Jordan at Stanford University and subsequently rose to become an assistant professor there. During his 34 years with the university he taught a wide variety of courses in various branches of zoology and advised numerous students, eventually mentoring 8 M.A. and 4 Ph.D. students to completion at Stanford. He also assisted in the collection of tens of thousands of fish specimens from the western Pacific, central Pacific, and West Coast of North America, part of the time while stationed as “Naturalist” aboard the U.S. Fish Commission’s Steamer Albatross (1902–06). Although his early publications dealt mainly with fish groups and descriptions (often as a junior author with Jordan), after 1910 he became more autonomous and eventually became one of the West Coast’s experts on Pacific salmon, Oncorhynchus spp. Throughout his career, he was especially esteemed by colleagues as “a stimulating teacher,” “an excellent biologist,” and “a fine man.”

Relevance:

30.00%

Publisher:

Abstract:

William Francis Thompson (1888–1965), an early fishery biologist, joined the California Fish and Game Commission in 1917 with a mandate to investigate the marine fisheries of the state. He initiated studies on the albacore tuna, Thunnus alalunga, and the Pacific sardine, Sardinops sagax, as well as studies on other economically important marine organisms. Thompson built up a staff of fishery scientists, many of whom later attained considerable renown in their field, and he helped develop, and then direct, the commission’s first marine fisheries laboratory. During his tenure in California, he developed a personal philosophy of research that he outlined in several publications. Thompson based his approach on yield-based analysis of the fisheries as opposed to large-scale environmental studies. He left the state agency in 1925 to direct the newly formed International Fisheries Commission (now the International Pacific Halibut Commission). William Thompson became a major figure in fisheries research in the United States, and particularly in the Pacific Northwest and Alaska, during the first half of the 20th century.

Relevance:

30.00%

Publisher:

Abstract:

Spencer Fullerton Baird, a noted systematic zoologist and builder of scientific institutions in 19th century America, persuaded the U.S. Congress to establish the United States Commission of Fish and Fisheries in March 1871. At that time, Baird was Assistant Secretary of the Smithsonian Institution. Following the death of Joseph Henry in 1878, he became head of the institution, a position he held until his own demise in 1887. In addition to his many duties as a Smithsonian official, including his prominent role in developing the Smithsonian’s Federally funded National Museum as the repository for governmental scientific collections, Baird directed the Fish Commission from 1871 until 1887. The Fish Commission’s original mission was to determine the reasons and remedies for the apparent decline of American fisheries off southern New England as well as other parts of the United States. In 1872, Congress further directed the Commission to begin a large fish-hatching program aimed at increasing the supply of American food fishes.