938 results for Crash cushions.
Abstract:
This book challenges the assumption that it is bad news when the economy doesn’t grow. For decades, it has been widely recognized that there are ecological limits to continuing economic growth and that different ways of living, working and organizing our economies are urgently required. This urgency has increased since the financial crash of 2007–2008, but mainstream economists and politicians are unable to think differently. The authors of this book demonstrate why our economic system demands ecologically unsustainable growth and the pursuit of more ‘stuff’. They believe that what matters is quality, not quantity – a better life based on having fewer material possessions, less production and less work. Such a way of life will emphasize well‑being, community, security and ‘conviviality’. That is, more real wealth. The book will therefore appeal to everyone curious as to how a new post-growth economics can be conceived and enacted. It will be of particular interest to policy makers, politicians, businesspeople, trade unionists, academics, students, journalists and a wide range of people working in the not-for-profit sector. All of the contributors are leading thinkers on green issues and members of the new think-tank Green House.
Abstract:
Society depends on complex IT systems created by integrating and orchestrating independently managed systems. The incredible increase in their scale and complexity over the past decade means that new software-engineering techniques are needed to help us cope with them. The key characteristic of these systems is that they are assembled from other systems that are independently controlled and managed. While there is increasing awareness in the software engineering community of these issues, the most relevant background work comes from systems engineering. The interacting algorithms that led to the Flash Crash represent an example of a coalition of systems, serving the purposes of their owners and cooperating only because they have to. The owners of the individual systems were competing finance companies that were often mutually hostile. Each system jealously guarded its own information and could change without consulting any other system.
Abstract:
This paper will argue that the American economy could and will absorb the recent shocks, and that in the longer run (within a matter of years) it will somehow convert the revealed weaknesses to its advantage. America has a long record of learning from its excesses to improve the working of its particular brand of capitalism, dating back to the imposition of antitrust controls on the robber barons in the late 1800s and the enhancement of investor protection after the 1929 crash. The American economy has experienced market imperfections of all kinds, but it has almost always found regulatory answers that are not perfect yet fairly reliable, and it has managed to adapt to change (e.g., the Dodd-Frank Act on financial stability). The U.S. has many times pioneered the elaboration of both theoretical and policy-oriented solutions to conflicts between markets and government in order to increase economic welfare (Bernanke, 2008, p. 425). There is no reason why it should not turn the latest financial calamities to its advantage. At the same time, to regain confidence in capitalism as a global system, global efforts are indispensable. To identify some of the global economic conflicts that have much to do with U.S. markets in particular, we seek answers to global systemic questions.
Abstract:
Extreme stock price movements are of great concern to both investors and the entire economy. For investors, a single negative return, or a combination of several smaller returns, can possibly wipe out so much capital that the firm or portfolio becomes illiquid or insolvent. If enough investors experience this loss, it could shock the entire economy. An example of such a case is the stock market crash of 1987. Furthermore, there has been a lot of recent interest regarding the increasing volatility of stock prices. This study presents an analysis of extreme stock price movements. The data utilized were the daily returns for the Standard and Poor's 500 index from January 3, 1978 to May 31, 2001. Research questions were analyzed using the statistical models provided by extreme value theory. One of the difficulties in examining stock price data is that there is no consensus regarding the correct shape of the distribution function generating the data. An advantage of extreme value theory is that no detailed knowledge of this distribution function is required to apply the asymptotic theory. We focus on the tail of the distribution. Extreme value theory allows us to estimate a tail index, which we use to derive bounds on the returns for very low exceedance probabilities. Such information is useful in evaluating the volatility of stock prices. There are three possible limit laws for the maximum: Gumbel (thin-tailed), Fréchet (heavy-tailed), or Weibull (bounded tail). Results indicated that extreme returns during the time period studied follow a Fréchet distribution. Thus, this study finds that extreme value analysis is a valuable tool for examining stock price movements and can be more efficient than the usual variance in measuring risk.
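The tail-index idea in this abstract can be illustrated with a small, hedged sketch: the Hill estimator is one standard way to estimate the index from the largest observed losses. The abstract does not specify which estimator the study used, and the data below are simulated stand-ins, not the S&P 500 series.

```python
# Hypothetical sketch (not the study's actual code): estimating the tail index of
# daily returns with the Hill estimator, a common EVT approach when the return
# distribution is unknown but heavy (Frechet-type) tails are suspected.
import numpy as np

def hill_tail_index(returns, k=100):
    """Hill estimator of the tail index from the k largest losses."""
    losses = np.sort(-np.asarray(returns))[::-1]   # largest losses first
    top = losses[:k + 1]
    # alpha_hat = k / sum(log(X_(i) / X_(k+1))) over the k largest order statistics
    return k / np.sum(np.log(top[:k] / top[k]))

# Simulated heavy-tailed (Student-t) returns standing in for real index data
rng = np.random.default_rng(0)
simulated_returns = 0.01 * rng.standard_t(df=3, size=5000)
alpha = hill_tail_index(simulated_returns, k=200)
print(f"Estimated tail index: {alpha:.2f}")   # finite alpha -> Frechet-type tail
```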
Abstract:
The rate of fatal crashes in Florida has remained significantly higher than the national average for the last several years. The 2003 statistics from the National Highway Traffic Safety Administration (NHTSA), the latest available, show a fatality rate in Florida of 1.71 per 100 million vehicle-miles traveled, compared to the national average of 1.48 per 100 million vehicle-miles traveled. The objective of this research is to better understand the driver, environmental, and roadway factors that affect the probability of injury severity in Florida. In this research, the ordered logit model was used to develop six injury severity models: single-vehicle and two-vehicle crashes on urban freeways and urban principal arterials, and two-vehicle crashes at urban signalized and unsignalized intersections. The data used in this research included all crashes that occurred on the state highway system from 2001 to 2003 in the Southeast Florida region, which comprises Miami-Dade, Broward, and Palm Beach counties. The results of the analysis indicate that the age group and gender of the at-fault driver were significant predictors of injury severity across all models. The greatest risk of severe injury was observed for the age groups 55 to 65 and 66 and older. A positive association between injury severity and the race of the at-fault driver was also found. An at-fault driver of Hispanic origin was associated with a higher risk of severe injury in both freeway models and in the two-vehicle crash model on arterial roads. A higher risk of more severe injury was also found when an African-American was the at-fault driver in two-vehicle crashes on freeways. In addition, arterial class was found to be positively associated with a higher risk of severe crashes; six-lane divided arterials exhibited the highest injury severity risk of all arterial classes, while the lowest risk was found for one-way roads. Alcohol involvement by the at-fault driver was also found to be a significant risk factor for severe injury in the single-vehicle crash model on freeways.
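As a hedged illustration of the model family used here, the sketch below fits an ordered logit (proportional odds) model of injury severity with statsmodels. The variable names and the synthetic data-generating process are invented for demonstration; they are not the study's crash records or estimated coefficients.

```python
# Illustrative sketch only: ordered logit model of crash injury severity on toy data.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 500
driver_age = rng.integers(16, 90, n)
male_driver = rng.integers(0, 2, n)
alcohol = rng.integers(0, 2, n)

# Toy latent-severity process: severity rises with age and alcohol involvement
latent = 0.03 * driver_age + 0.8 * alcohol + 0.2 * male_driver + rng.logistic(size=n)
severity = np.digitize(latent, bins=[1.5, 2.5, 3.5])   # 0 = no injury ... 3 = fatal

X = pd.DataFrame({"driver_age": driver_age, "male_driver": male_driver, "alcohol": alcohol})
model = OrderedModel(pd.Series(severity), X, distr="logit")   # ordered logit
result = model.fit(method="bfgs", disp=False)
print(result.summary())   # positive coefficients -> higher odds of a more severe outcome
```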
Abstract:
Run-off-road (ROR) crashes have increasingly become a serious concern for transportation officials in the State of Florida. These types of crashes have increased proportionally in recent years statewide and have been the focus of the Florida Department of Transportation. The goal of this research was to develop statistical models that can be used to investigate the possible causal relationships between roadway geometric features and ROR crashes on Florida's rural and urban principal arterials. In this research, Zero-Inflated Poisson (ZIP) and Zero-Inflated Negative Binomial (ZINB) regression models were used to better model the excessive number of roadway segments with no ROR crashes. Since Florida covers a diverse area comprising sixty-seven counties, it was divided into four geographical regions to minimize possible unobserved heterogeneity. Three years of crash data (2000–2002) for principal arterials on the Florida State Highway System were used. Several statistical models based on the ZIP and ZINB regression methods were fitted to predict the expected number of ROR crashes on urban and rural roads in each region. Each region was further divided into urban and rural areas, resulting in a total of eight crash models. A best-fit predictive model was identified for each of the eight cases in terms of AIC values. The ZINB regression was found to be appropriate for seven of the eight models, and the ZIP regression was found to be more appropriate for the remaining one. To achieve model convergence, some explanatory variables that were not statistically significant had to be included; therefore, strong conclusions cannot be drawn from some of these models. Given the complex nature of crashes, recommendations for additional research are made. The interaction of weather and human condition would be quite valuable in discerning additional causal relationships for these types of crashes. Additionally, roadside data should be considered and incorporated into future research on ROR crashes.
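A hedged sketch of the ZIP/ZINB comparison described above, using statsmodels on synthetic segment-level data. The roadway variables, coefficients, and counts are made up rather than taken from the Florida crash files; only the modeling workflow (fit both zero-inflated forms, compare by AIC) mirrors the dissertation.

```python
# Illustrative only: ZIP vs. ZINB regression for zero-heavy ROR crash counts.
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import (
    ZeroInflatedPoisson,
    ZeroInflatedNegativeBinomialP,
)

rng = np.random.default_rng(1)
n = 1000
lane_width = rng.normal(12, 1.5, n)       # hypothetical geometric features (ft)
shoulder_width = rng.normal(6, 2, n)
aadt_log = rng.normal(9, 0.5, n)          # log of segment AADT

X = sm.add_constant(np.column_stack([lane_width, shoulder_width, aadt_log]))

# Toy data-generating process: many structural zeros plus overdispersed counts
structural_zero = rng.random(n) < 0.4
mu = np.exp(-6 + 0.6 * aadt_log - 0.05 * shoulder_width)
counts = np.where(structural_zero, 0, rng.negative_binomial(1, 1 / (1 + mu)))

zip_fit = ZeroInflatedPoisson(counts, X, exog_infl=np.ones((n, 1))).fit(
    method="bfgs", maxiter=500, disp=False)
zinb_fit = ZeroInflatedNegativeBinomialP(counts, X, exog_infl=np.ones((n, 1)), p=2).fit(
    method="bfgs", maxiter=500, disp=False)

# Lower AIC indicates the better-fitting model for a given region/area type
print(f"ZIP AIC:  {zip_fit.aic:.1f}")
print(f"ZINB AIC: {zinb_fit.aic:.1f}")
```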
Abstract:
Annual Average Daily Traffic (AADT) is a critical input to many transportation analyses. By definition, AADT is the average 24-hour volume at a highway location over a full year. Traditionally, AADT is estimated using a mix of permanent and temporary traffic counts. Because field collection of traffic counts is expensive, it is usually done only for the major roads, leaving most local roads without any AADT information. However, AADTs are needed for local roads in many applications. For example, AADTs are used by state Departments of Transportation (DOTs) to calculate the crash rates of all local roads in order to identify the top five percent of hazardous locations for annual reporting to the U.S. DOT. This dissertation develops a new method for estimating AADTs for local roads using travel demand modeling. A major component of the new method is a parcel-level trip generation model that estimates the trips generated by each parcel. The model uses tax parcel data together with the trip generation rates and equations provided by the ITE Trip Generation Report. The generated trips are then distributed to existing traffic count sites using a parcel-level gravity model for trip distribution. The all-or-nothing assignment method is then used to assign the trips to the roadway network and estimate the final AADTs. The entire process was implemented in the Cube demand modeling system with extensive spatial data processing in ArcGIS. To evaluate the performance of the new method, data from several study areas in Broward County, Florida, were used. The estimated AADTs were compared with those from two existing methods, using actual traffic counts as the ground truth. The results show that the new method performs better than both existing methods. One limitation of the new method is that it relies on Cube, which limits the number of zones to 32,000; accordingly, a study area exceeding this limit must be partitioned into smaller areas. Because AADT estimates for roads near the boundary areas were found to be less accurate, further research could examine the best way to partition a study area to minimize this impact.
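The trip distribution step can be illustrated with a minimal gravity-model sketch. The zone totals, travel costs, and impedance exponent below are assumptions for demonstration rather than values from the Broward County study, and the full method would follow this step with an all-or-nothing assignment in Cube.

```python
# Simplified gravity-model trip distribution: parcel-generated trips are split among
# destination sites in proportion to attractions weighted by an inverse power of cost.
import numpy as np

productions = np.array([120.0, 80.0, 200.0])        # trips generated per parcel group
attractions = np.array([150.0, 100.0, 150.0])       # e.g., trips observed at count sites
cost = np.array([[2.0, 5.0, 9.0],                   # travel time (min), parcel -> site
                 [4.0, 3.0, 7.0],
                 [8.0, 6.0, 2.0]])

beta = 2.0                                           # impedance exponent (assumed)
friction = cost ** -beta

# Singly constrained gravity model: each row of trips sums to that parcel's productions
weights = attractions * friction
trips = productions[:, None] * weights / weights.sum(axis=1, keepdims=True)

print(np.round(trips, 1))
print("row sums:", np.round(trips.sum(axis=1), 1))   # matches productions
```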
Abstract:
Fueled by the increasing human appetite for high computing performance, semiconductor technology has now marched into the deep sub-micron era. As transistor size keeps shrinking, more and more transistors are integrated into a single chip. This has tremendously increased the power consumption and heat generation of IC chips. The rapidly growing heat dissipation greatly increases packaging and cooling costs, and adversely affects the performance and reliability of a computing system. It also reduces the processor's life span and may even crash the entire computing system. Therefore, dynamic thermal management (DTM) is becoming a critical problem in modern computer system design. Extensive theoretical research has been conducted to study the DTM problem. However, most of it is based on theoretically idealized assumptions or simplified models. While these models and assumptions greatly simplify a complex problem and make it theoretically tractable, practical computer systems and applications must deal with many factors and details that they do not capture. The goal of our research was to develop a test platform that can be used to validate theoretical results on DTM under well-controlled conditions, to identify the limitations of existing theoretical results, and to develop new and practical DTM techniques. This dissertation details the background and our research efforts in this endeavor. Specifically, we first developed a customized test platform based on an Intel desktop. We then tested a number of related theoretical works and examined their limitations in a practical hardware environment. With these limitations in mind, we developed a new reactive thermal management algorithm for single-core computing systems that optimizes throughput under a peak temperature constraint. We further extended our research to a multicore platform and developed an effective proactive DTM technique for throughput maximization on multicore processors based on task migration and dynamic voltage and frequency scaling (DVFS). The significance of our research lies in the fact that it complements the extensive existing theoretical research in dealing with increasingly critical thermal problems, enabling the continuous evolution of high-performance computing systems.
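As a hedged illustration of what a reactive DTM policy looks like in practice (not the dissertation's actual algorithm or platform), the sketch below throttles CPU frequency whenever a temperature reading exceeds a peak constraint and restores it once the chip cools. The sysfs paths, the 85 C threshold, and the two frequency levels are Linux-specific assumptions and will differ on real systems.

```python
# Minimal reactive DTM loop (illustrative assumptions: paths, threshold, frequencies).
import time

TEMP_PATH = "/sys/class/thermal/thermal_zone0/temp"
FREQ_PATH = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq"
PEAK_MILLI_C = 85_000                 # peak temperature constraint (85 C, millidegrees)
HIGH_KHZ, LOW_KHZ = 3_000_000, 1_200_000

def read_temp_milli_c() -> int:
    with open(TEMP_PATH) as f:
        return int(f.read().strip())

def set_max_freq(khz: int) -> None:
    # Writing this file typically requires root privileges
    with open(FREQ_PATH, "w") as f:
        f.write(str(khz))

while True:
    if read_temp_milli_c() > PEAK_MILLI_C:
        set_max_freq(LOW_KHZ)         # react: throttle to cool down
    else:
        set_max_freq(HIGH_KHZ)        # recover throughput when under the constraint
    time.sleep(1)                     # control period (assumed)
```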