4 results for Airport zoning.

at Duke University


Relevance: 20.00%

Abstract:

Protecting public health is the most legitimate use of zoning, and yet there has been minimal progress in applying it to the obesity problem. Zoning could potentially be used to address both unhealthy and healthy food retailers, but a lack of evidence regarding the impact of zoning, along with public opinion on zoning changes, is a barrier to implementing zoning restrictions on fast food on a larger scale. My dissertation addresses these gaps in our understanding of health zoning as a policy option for altering built food environments.

Chapter 1 examines the relationship between food swamps and obesity, and whether spatial mapping might be useful in identifying priority geographic areas for zoning interventions. I employ an instrumental variables (IV) strategy to correct for the endogeneity problems associated with food environments, namely that individuals may self-select into certain neighborhoods and may consider food availability in their decision process; I use highway exits as a source of exogenous variation. Using secondary data from the USDA Food Environment Atlas, I estimate ordinary least squares (OLS) and IV regression models to analyze cross-sectional associations between local food environments and the prevalence of obesity. I find that, even after controlling for food desert effects, food swamps have a positive, statistically significant effect on adult obesity rates.
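As a rough illustration of this estimation strategy, the sketch below runs an OLS baseline and then a two-stage least squares model, instrumenting a food-swamp score with highway-exit density. The file and column names (food_environment_atlas_counties.csv, food_swamp_score, highway_exit_density, and the controls) are hypothetical stand-ins, not the dissertation's actual variables.

```python
# Illustrative sketch of the Chapter 1 strategy: OLS baseline, then 2SLS
# with highway-exit density instrumenting the food-swamp score.
# File and column names are hypothetical stand-ins.
import pandas as pd
import statsmodels.formula.api as smf
from linearmodels.iv import IV2SLS

df = pd.read_csv("food_environment_atlas_counties.csv")  # hypothetical file

# OLS baseline: adult obesity rate on the food-swamp score plus controls.
ols = smf.ols(
    "adult_obesity_rate ~ food_desert_score + median_income + pct_rural"
    " + food_swamp_score",
    data=df,
).fit()

# 2SLS: instrument the endogenous food-swamp score with highway-exit
# density, on the logic that exits shift fast-food siting but do not
# otherwise affect residents' obesity risk.
iv = IV2SLS.from_formula(
    "adult_obesity_rate ~ 1 + food_desert_score + median_income + pct_rural"
    " + [food_swamp_score ~ highway_exit_density]",
    df,
).fit()

print(ols.params["food_swamp_score"], iv.params["food_swamp_score"])
```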

Chapter 2 applies message framing and prospect theory to the emerging discussion around health zoning policies targeting food environments, and explores public opinion toward a list of potential zoning restrictions on fast-food restaurants (beyond moratoriums on new establishments). To explore causality, I employ an online survey experiment that manipulates exposure to vignettes with different message frames about health zoning restrictions, fielded with two national samples of adult Americans aged 18 and over (N1 = 2,768 and N2 = 3,236). The second sample oversamples Black Americans (N = 1,000) and individuals whose highest level of education is high school. Respondents were randomly assigned to one of six conditions in which they were primed with different message frames about the benefits of zoning restrictions on fast food retailers. Participants were then asked to indicate their support for six zoning policies on a Likert scale. Subjects also answered questions about their food store access, eating behaviors, health status, and perceptions of food stores by type.
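The sketch below mocks up the core of this design: random assignment to frame conditions, Likert-scale outcomes, and a difference-in-means test. The frame labels and the effect size are invented for illustration and are not the study's actual conditions or results.

```python
# Toy mock-up of the Chapter 2 design: random assignment to message-frame
# conditions, then a difference-in-means test on Likert support.
# Frame labels and the effect size are invented for illustration.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(0)
n = 3000  # roughly the size of one national sample
frames = ["control", "health", "economic", "youth",
          "nutrition", "nutrition_equity"]  # hypothetical condition labels

df = pd.DataFrame({"frame": rng.choice(frames, size=n)})  # random assignment
effect = dict.fromkeys(frames, 0.0)
effect["nutrition_equity"] = 0.4           # assumed treatment effect
latent = 3 + rng.normal(0, 1, n) + df["frame"].map(effect)
df["support"] = latent.round().clip(1, 5)  # 5-point Likert response

treated = df.loc[df.frame == "nutrition_equity", "support"]
control = df.loc[df.frame == "control", "support"]
print(stats.ttest_ind(treated, control, equal_var=False))
```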

I find that a message frame combining Nutrition with increasing Equity in the food system was particularly effective at increasing support for health zoning policies targeting fast food outlets, across policy categories (Conditional, Youth-related, Performance, and Incentive) and across racial groups. This finding is consistent with an influential environmental justice scholar's description of "injustice frames" as effective in mobilizing supporters around environmental issues (Taylor 2000). I extend this rationale to obesity prevention efforts in the food environment and identify Nutrition combined with Equity frames as an arguably universal campaign strategy for bolstering public support of zoning restrictions on fast food retailers.

Bridging my findings from Chapters 1 and 2: using food swamps as a spatial metaphor may work to identify priority areas for policy intervention, but only if there is an equitable distribution of resources and mobilization efforts to improve consumer food environments. If the structural forces that ration access to land-use planning persist (arguably including the media, as gatekeepers of information and producers of message frames), disparities in obesity are likely to widen.

Relevance: 10.00%

Abstract:

Failing to find a tumor in an x-ray scan or a gun in an airport baggage screening can have dire consequences, making it fundamentally important to elucidate the mechanisms that hinder performance in such visual searches. Recent laboratory work has indicated that low target prevalence can lead to disturbingly high miss rates in visual search. Here, however, we demonstrate that misses in low-prevalence searches can be readily abated. When targets are rarely present, observers adapt by responding more quickly, and miss rates are high. Critically, though, these misses are often due to response-execution errors, not perceptual or identification errors: Observers know a target was present, but just respond too quickly. When provided an opportunity to correct their last response, observers can catch their mistakes. Thus, low target prevalence may not be a generalizable cause of high miss rates in visual search.
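A toy simulation can make the mechanism concrete: if a fixed share of misses are motor slips that the observer notices, a correction opportunity should recover most of them. All rates below are assumptions for illustration, not the study's estimates.

```python
# Toy simulation of the claimed mechanism: under low prevalence, many
# misses are response-execution slips, so a chance to correct the last
# response recovers them. All rates are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(1)
trials, prevalence = 100_000, 0.02   # targets appear on 2% of trials
p_perceive = 0.95                    # target is actually seen when present
p_slip = 0.25                        # habitual "absent" keypress despite seeing it
p_catch = 0.90                       # slips the observer notices and corrects

present = rng.random(trials) < prevalence
seen = present & (rng.random(trials) < p_perceive)
slip = seen & (rng.random(trials) < p_slip)      # saw it, pressed "absent"
hit_plain = seen & ~slip
hit_corrected = hit_plain | (slip & (rng.random(trials) < p_catch))

for label, hits in [("no correction", hit_plain), ("with correction", hit_corrected)]:
    print(f"{label}: miss rate = {1 - hits[present].mean():.1%}")
```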

Relevance: 10.00%

Abstract:

Allocating resources optimally is a nontrivial task, especially when multiple self-interested agents with conflicting goals are involved. This dissertation uses techniques from game theory to study two classes of such problems: allocating resources to catch agents that attempt to evade them, and allocating payments to agents in a team in order to stabilize it. Besides discussing what allocations are optimal from various game-theoretic perspectives, we also study how to efficiently compute them and, if no such algorithms are found, what computational hardness results can be proved.

The first class of problems is inspired by real-world applications such as the TOEFL iBT test, course final exams, driver's license tests, and airport security patrols. We call them test games and security games. This dissertation first studies test games separately, and then proposes a framework of Catcher-Evader games (CE games) that generalizes both test games and security games. We show that the optimal test strategy can be efficiently computed for scored test games, but it is hard to compute for many binary test games. Optimal Stackelberg strategies are hard to compute for CE games, but we give an empirically efficient algorithm for computing their Nash equilibria. We also prove that the Nash equilibria of a CE game are interchangeable.
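For intuition about the Stackelberg side, the sketch below computes an optimal leader strategy for a tiny standard security game using the textbook multiple-LPs method, solving one linear program per candidate attacked target. This is a generic illustration with made-up payoffs, not the dissertation's CE-game algorithm.

```python
# Textbook multiple-LPs computation of an optimal Stackelberg (leader)
# strategy in a tiny security game; a generic illustration with made-up
# payoffs, not the dissertation's CE-game algorithm.
import numpy as np
from scipy.optimize import linprog

# Per-target utilities when the attacked target is covered/uncovered.
Ud_cov = np.array([ 0.0,  1.0,  0.5])   # defender, target covered
Ud_unc = np.array([-5.0, -2.0, -4.0])   # defender, target uncovered
Ua_cov = np.array([-2.0, -1.0, -3.0])   # attacker, target covered
Ua_unc = np.array([ 3.0,  1.0,  4.0])   # attacker, target uncovered
n, m = 3, 1                             # three targets, one resource

best_value, best_cov = -np.inf, None
for t_star in range(n):                 # one LP per candidate attacked target
    obj = np.zeros(n)                   # linprog minimizes, so negate
    obj[t_star] = -(Ud_cov[t_star] - Ud_unc[t_star])
    A_ub, b_ub = [], []
    for t in range(n):                  # attacking t_star must be optimal
        if t == t_star:
            continue
        row = np.zeros(n)
        row[t] = Ua_cov[t] - Ua_unc[t]
        row[t_star] = -(Ua_cov[t_star] - Ua_unc[t_star])
        A_ub.append(row)
        b_ub.append(Ua_unc[t_star] - Ua_unc[t])
    A_ub.append(np.ones(n)); b_ub.append(m)   # resource budget
    res = linprog(obj, A_ub=np.array(A_ub), b_ub=b_ub,
                  bounds=[(0, 1)] * n, method="highs")
    if res.success and Ud_unc[t_star] - res.fun > best_value:
        best_value, best_cov = Ud_unc[t_star] - res.fun, res.x

print("defender value:", best_value, "coverage:", best_cov)
```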

The second class of problems involves how to split a reward that is collectively obtained by a team: for example, how should a startup distribute its shares, and what salaries should an enterprise pay its employees? Several stability-based solution concepts in cooperative game theory, such as the core, the least core, and the nucleolus, are well suited to this purpose when the goal is to keep coalitions of agents from breaking off. We show that some of these solution concepts can be justified as the most stable payments under noise. Moreover, by adjusting the noise models (to be arguably more realistic), we obtain new solution concepts, including the partial nucleolus, the multiplicative least core, and the multiplicative nucleolus. We then study the computational complexity of these solution concepts under the constraint of superadditivity. Our result is based on what we call Small-Issues-Large-Team games, and it applies to popular representation schemes such as MC-nets.
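As a concrete baseline for these stability concepts, the sketch below computes the classical least core of a toy three-player game as a linear program: minimize eps subject to every coalition receiving at least its value minus eps. The coalition values are made up, and the dissertation's multiplicative variants are not implemented here.

```python
# Least-core computation for a toy 3-player game via linear programming:
# minimize eps s.t. every coalition S receives at least v(S) - eps.
# The classical concept only; coalition values are made up.
from itertools import combinations
import numpy as np
from scipy.optimize import linprog

n = 3
v = {(): 0, (0,): 0, (1,): 0, (2,): 0,
     (0, 1): 4, (0, 2): 3, (1, 2): 2, (0, 1, 2): 6}

c = np.zeros(n + 1); c[-1] = 1             # variables x_0..x_2, eps; min eps
A_ub, b_ub = [], []
for r in range(1, n):                      # proper, nonempty coalitions
    for S in combinations(range(n), r):
        row = np.zeros(n + 1)
        row[list(S)] = -1                  # -sum_{i in S} x_i
        row[-1] = -1                       # -eps
        A_ub.append(row)
        b_ub.append(-v[S])                 # i.e., sum x_S >= v(S) - eps
A_eq = [np.append(np.ones(n), 0.0)]        # efficiency: sum x = v(N)
b_eq = [v[(0, 1, 2)]]

res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub, A_eq=np.array(A_eq),
              b_eq=b_eq, bounds=[(None, None)] * (n + 1), method="highs")
print("payments:", res.x[:n], "least-core eps:", res.x[-1])
# Here the unique least-core allocation is (3, 2, 1) with eps = -1: every
# coalition has slack of at least 1, so no group gains by breaking off.
```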

Relevance: 10.00%

Abstract:

Highlights of Data Expedition:
• Students explored daily observations of local climate data spanning the past 35 years.
• Topological Data Analysis (TDA) provides cutting-edge tools for studying the geometry of data in arbitrarily high dimensions.
• Using TDA tools, students discovered intrinsic dynamical features of the data and learned how to quantify periodic phenomena in a time series.
• Since nature invariably produces noisy data that rarely has exact periodicity, students also considered the theoretical basis of almost-periodicity and even invented and tested new mathematical definitions of almost-periodic functions.

Summary

The dataset we used for this data expedition comes from the Global Historical Climatology Network: "GHCN (Global Historical Climatology Network)-Daily is an integrated database of daily climate summaries from land surface stations across the globe." Source: https://www.ncdc.noaa.gov/oa/climate/ghcn-daily/ We focused on the daily maximum and minimum temperatures from January 1, 1980 to April 1, 2015 collected at RDU International Airport.

Through a guided series of exercises designed to be performed in Matlab, students explore these time series, initially by direct visualization and basic statistical techniques. Then students are guided through a special sliding-window construction which transforms a time series into a high-dimensional geometric curve. These high-dimensional curves can be visualized by projecting down to lower dimensions (Figure 1); however, our focus here was to use persistent homology to study the high-dimensional embedding directly. The shape of these curves carries meaningful information, but how one describes the "shape" of data depends on the scale at which the data is considered, and choosing the appropriate scale is rarely obvious. Persistent homology overcomes this obstacle by allowing us to quantitatively study geometric features of the data across multiple scales.

Through this data expedition, students are introduced to numerically computing persistent homology using the Rips collapse algorithm and to interpreting the results. In the specific context of sliding-window constructions, 1-dimensional persistent homology can reveal the nature of periodic structure in the original data. I created a special technique to study how these high-dimensional sliding-window curves form loops in order to quantify the periodicity. Students are guided through this construction and learn how to visualize and interpret this information.

Climate data is extremely complex (as anyone who has suffered from a bad weather prediction can attest), and numerous variables play a role in determining our daily weather and temperatures. This complexity, coupled with imperfections of measuring devices, results in very noisy data, which makes the annual seasonal periodicity far from exact. To this end, I have students explore existing theoretical notions of almost-periodicity and test them on the data. They find that some existing definitions are inadequate in this context, so I challenged them to invent new mathematics by proposing and testing their own definitions. These students rose to the challenge and suggested a number of creative definitions. While autocorrelation and spectral methods based on Fourier analysis are often used to explore periodicity, the construction here provides an alternative paradigm for quantifying periodic structure in almost-periodic signals using tools from topological data analysis.
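For readers who want to experiment outside Matlab, the sketch below mirrors the sliding-window construction on a synthetic noisy annual signal and computes 1-dimensional persistent homology with the ripser Python package; the window length, delay, and subsampling are illustrative choices, not the expedition's exact parameters.

```python
# Sliding-window embedding of a synthetic noisy annual signal, with
# 1-dimensional persistent homology computed via the ripser package.
# Window length, delay, and subsampling are illustrative choices only.
import numpy as np
from ripser import ripser

rng = np.random.default_rng(0)
days = np.arange(3650)                      # ~10 years of daily readings
temps = 15 + 10 * np.sin(2 * np.pi * days / 365.25) + rng.normal(0, 2, days.size)

def sliding_window(f, dim, tau):
    """Map f(t) to the point (f(t), f(t+tau), ..., f(t+dim*tau))."""
    n_points = len(f) - dim * tau
    return np.stack([f[i : i + dim * tau + 1 : tau] for i in range(n_points)])

# A window spanning roughly one period makes the annual loop most visible.
X = sliding_window(temps, dim=12, tau=30)   # ~360-day window in 13 dimensions
X = X[::7]                                  # subsample to keep ripser fast

h1 = ripser(X, maxdim=1)["dgms"][1]         # birth/death pairs of 1-cycles
lifetimes = h1[:, 1] - h1[:, 0]
print("most persistent 1-cycle lifetime:", lifetimes.max())
# One long-lived 1-cycle is the topological signature of (near-)periodicity.
```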