951 results for Metric Average
Abstract:
2000 Mathematics Subject Classification: 35B40, 35L15.
Abstract:
In this paper a full analytic model for pause intensity (PI), a no-reference metric for video quality assessment, is presented. The model is built upon the video playout buffer behavior at the client side and also encompasses the characteristics of a TCP network. Video streaming via TCP produces impairments in play continuity, which are not typically reflected in current objective metrics such as PSNR and SSIM. Recently the buffer underrun frequency/probability has been used to characterize the buffer behavior and as a measurement for performance optimization. But we show, using subjective testing, that underrun frequency cannot reflect the viewers' quality of experience for TCP-based streaming. We also demonstrate that PI is a comprehensive metric made up of a combination of phenomena observed in the playout buffer. The analytical model in this work is verified with simulations carried out on ns-2, showing that the two results are closely matched. The effectiveness of the PI metric has also been demonstrated by subjective testing on a range of video clips, where PI values exhibit a good correlation with the viewers' opinion scores. © 2012 IEEE.
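As a rough illustration of the buffer behavior such a model captures, the sketch below simulates a client playout buffer fed by a fluctuating TCP-like arrival process and reports the fraction of viewing time spent paused. It assumes PI can be read as aggregate pause duration over total session time; the arrival probability, playback rate, and startup threshold are illustrative choices, not parameters from the paper.

```python
import random

def simulate_pause_intensity(arrival_prob=0.95, startup_threshold=50,
                             n_frames=10_000, seed=1):
    """Toy playout-buffer simulation: frames arrive stochastically (a stand-in
    for a fluctuating TCP delivery rate), playback drains one frame per tick,
    and playback pauses whenever the buffer underruns, resuming only after
    the buffer refills to the startup threshold."""
    random.seed(seed)
    buffered = 0
    played = 0
    paused_ticks = 0
    total_ticks = 0
    playing = False
    while played < n_frames:
        total_ticks += 1
        if random.random() < arrival_prob:   # Bernoulli frame arrivals
            buffered += 1
        if not playing:
            paused_ticks += 1                # initial buffering or rebuffering
            if buffered >= startup_threshold:
                playing = True
        elif buffered >= 1:
            buffered -= 1                    # consume one frame per tick
            played += 1
        else:
            playing = False                  # underrun: playback pauses
            paused_ticks += 1
    # Pause intensity read as aggregate pause time over total session time.
    return paused_ticks / total_ticks

print(f"PI = {simulate_pause_intensity():.3f}")
```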
Reductions of peak-to-average power ratio and optical beat interference in cost-effective OFDMA-PONs
Abstract:
The peak-to-average power ratio (PAPR) and optical beat interference (OBI) effects are examined thoroughly in orthogonal frequency-division multiple access (OFDMA) passive optical networks (PONs) at signal bit rates up to ∼20 Gb/s per channel using cost-effective intensity modulation and direct detection (IM/DD). Single-channel OOFDM and upstream multichannel OFDM-PONs are investigated for up to six users. A number of techniques for mitigating the PAPR and OBI effects are presented and evaluated, including adaptive-loading algorithms such as bit/power-loading, clipping for PAPR reduction, and thermal detuning (TD) for OBI suppression. It is shown that the bit-loading algorithm is a very efficient PAPR reduction technique, reducing the PAPR by about 1.2 dB over 100 km of transmission. It is also revealed that the optimum method for suppressing the OBI is TD combined with bit-loading. For a targeted BER of 1 × 10⁻³, the minimum allowed channel spacing is 11 GHz when employing six users. © 2013 Springer Science+Business Media New York.
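For readers unfamiliar with the PAPR figure of merit, the sketch below computes the PAPR of a randomly modulated OFDM symbol and applies simple amplitude clipping, one of the reduction techniques evaluated above. The subcarrier count, oversampling factor, and clipping ratio are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

def clip(x, clip_ratio_db=6.0):
    """Amplitude clipping: limit |x| to clip_ratio_db above the RMS level,
    preserving each sample's phase."""
    a_max = np.sqrt(np.mean(np.abs(x) ** 2)) * 10 ** (clip_ratio_db / 20)
    scale = np.minimum(1.0, a_max / np.maximum(np.abs(x), 1e-12))
    return x * scale

rng = np.random.default_rng(0)
n_sc, oversample = 256, 4                    # subcarriers, oversampling factor
qpsk = (rng.choice([1.0, -1.0], n_sc)
        + 1j * rng.choice([1.0, -1.0], n_sc)) / np.sqrt(2)
spectrum = np.zeros(n_sc * oversample, dtype=complex)
spectrum[:n_sc] = qpsk                       # zero-padded (oversampled) IFFT
x = np.fft.ifft(spectrum)
print(f"PAPR before clipping: {papr_db(x):.2f} dB")
print(f"PAPR after  clipping: {papr_db(clip(x)):.2f} dB")
```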
Abstract:
This research focuses on automatically adapting a search engine's size in response to fluctuations in query workload. Deploying a search engine in an Infrastructure as a Service (IaaS) cloud facilitates allocating or deallocating computer resources to or from the engine. Our solution is to contribute an adaptive search engine that repeatedly re-evaluates its load and, when appropriate, switches over to a different number of active processors. We focus on three aspects, broken out into three sub-problems: Continually determining the Number of Processors (CNP), the New Grouping Problem (NGP) and the Regrouping Order Problem (ROP). CNP is the problem of determining, in light of changes in the query workload, the ideal number of processors p to keep active at any given time. NGP arises once a change in the number of processors has been decided: it must then be determined how the groups of search data are distributed across the processors. ROP concerns how to redistribute these data across the processors while keeping the engine responsive and minimising both the switchover time and the incurred network load. We propose solutions for these sub-problems. For NGP we propose an algorithm for incrementally adjusting the index to fit the varying number of virtual machines. For ROP we present an efficient method for redistributing data among processors while keeping the search engine responsive. For CNP, we propose an algorithm that determines the new size of the search engine by re-evaluating its load. We tested the solution's performance using a custom-built prototype search engine deployed in the Amazon EC2 cloud. Our experiments show that, compared with computing the index from scratch, our incremental NGP algorithm speeds up the index computation 2–10 times while maintaining similar search performance. The chosen redistribution method is 25% to 50% faster than other methods and reduces the network load by around 30%. For CNP we present a deterministic algorithm that shows a good ability to determine a new size for the search engine. When combined, these algorithms give an adaptive algorithm that is able to adjust the search engine size under a variable workload.
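The abstract does not spell out the CNP decision rule, so the following is only a minimal sketch of one plausible controller: re-evaluate the measured query load, then grow or shrink the processor count when per-processor utilisation leaves a target band. The capacity constant, target utilisation, and hysteresis band are all invented for illustration.

```python
import math

def choose_num_processors(load_qps, p_current, p_min=1, p_max=64,
                          capacity_qps=100.0, target_util=0.6, band=0.15):
    """Re-evaluate the engine's load and pick a new processor count.
    Grows or shrinks only when per-processor utilisation leaves the band
    target_util +/- band, which damps oscillation on small fluctuations.
    capacity_qps (queries/sec one processor sustains) is an illustrative
    constant, not a figure from the thesis."""
    util = load_qps / (p_current * capacity_qps)
    if abs(util - target_util) <= band:
        return p_current                     # within band: keep current size
    # Otherwise size the engine so utilisation returns to the target.
    p_new = math.ceil(load_qps / (target_util * capacity_qps))
    return max(p_min, min(p_max, p_new))

# A 16-node engine under a slump and then a spike in query workload:
print(choose_num_processors(400, p_current=16))    # -> 7  (shrink)
print(choose_num_processors(3000, p_current=16))   # -> 50 (grow)
```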
Abstract:
Annual average daily traffic (AADT) is important information for many transportation planning, design, operation, and maintenance activities, as well as for the allocation of highway funds. Many studies have attempted AADT estimation using the factor approach, regression analysis, time series, and artificial neural networks. However, these methods are unable to account for the spatially variable influence of independent variables on the dependent variable, even though it is well known that spatial context is important to many transportation problems, including AADT estimation. In this study, applications of geographically weighted regression (GWR) methods to estimating AADT were investigated. The GWR-based methods considered the influence of correlations among the variables over space and the spatial non-stationarity of the variables. A GWR model allows different relationships between the dependent and independent variables to exist at different points in space. In other words, model parameters vary from location to location, and the locally linear regression parameters at a point are affected more by observations near that point than by observations further away. The study area was Broward County, Florida. Broward County lies on the Atlantic coast between Palm Beach and Miami-Dade counties. In this study, a total of 67 variables were considered as potential AADT predictors, and six variables (lanes, speed, regional accessibility, direct access, density of roadway length, and density of seasonal households) were selected to develop the models. To investigate the predictive powers of the various AADT predictors over space, statistics including the local R-squared, local parameter estimates, and local errors were examined and mapped. The local variations in relationships among parameters were investigated, measured, and mapped to assess the usefulness of GWR methods. The results indicated that the GWR models were able to better explain the variation in the data and to predict AADT with smaller errors than ordinary linear regression models for the same dataset. Additionally, GWR was able to model the spatial non-stationarity in the data, i.e., the spatially varying relationship between AADT and its predictors, which cannot be modeled in ordinary linear regression.
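The core of GWR is a weighted least-squares fit at each location, with weights that decay with distance from that location. The sketch below fits such a local model at a single point using a Gaussian kernel; the synthetic data, kernel form, and bandwidth are illustrative assumptions rather than the study's specification.

```python
import numpy as np

def gwr_predict(X, y, coords, x0, coord0, bandwidth):
    """Fit a weighted least-squares model at one location, down-weighting
    observations by their distance from that location (Gaussian kernel).

    X: (n, k) predictors; y: (n,) AADT; coords: (n, 2) station locations."""
    d = np.linalg.norm(coords - coord0, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)          # spatial kernel weights
    Xb = np.column_stack([np.ones(len(X)), X])       # add intercept column
    W = np.diag(w)
    beta = np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)  # local coefficients
    return np.append(1.0, x0) @ beta                 # local AADT estimate

rng = np.random.default_rng(42)
coords = rng.uniform(0, 10, (200, 2))                # synthetic count stations
X = rng.uniform(0, 1, (200, 3))        # e.g. lanes, speed, accessibility (scaled)
beta_true = 2 + coords[:, 0]           # spatially varying relationship
y = 1000 + (X * beta_true[:, None]).sum(axis=1) * 500 + rng.normal(0, 50, 200)
print(gwr_predict(X, y, coords, x0=[0.5, 0.5, 0.5],
                  coord0=np.array([5.0, 5.0]), bandwidth=2.0))
```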
Abstract:
As congestion management strategies begin to put more emphasis on person trips than vehicle trips, the need for vehicle occupancy data has become more critical. The traditional methods of collecting these data include the roadside windshield method and the carousel method. These methods are labor-intensive and expensive. An alternative to these traditional methods is to make use of the vehicle occupancy information in traffic accident records. This method is cost effective and may provide better spatial and temporal coverage than the traditional methods. However, this method is subject to potential biases resulting from under- and over-involvement of certain population sectors and certain types of accidents in traffic accident records. In this dissertation, three such potential biases, i.e., accident severity, driver’s age, and driver’s gender, were investigated and the corresponding bias factors were developed as needed. The results show that although multi-occupant vehicles are involved in higher percentages of severe accidents than are single-occupant vehicles, multi-occupant vehicles in the whole accident vehicle population were not overrepresented in the accident database. On the other hand, a significant difference was found between the distributions of the ages and genders of drivers involved in accidents and those of the general driving population. An information system that incorporates adjustments for the potential biases was developed to estimate the average vehicle occupancies (AVOs) for different types of roadways on the Florida state roadway system. A reasonableness check of the results from the system shows AVO estimates that are highly consistent with expectations. In addition, comparisons of AVOs from accident data with the field estimates show that the two data sources produce relatively consistent results. While accident records can be used to obtain the historical AVO trends and field data can be used to estimate the current AVOs, no known methods have been developed to project future AVOs. Four regression models for the purpose of predicting weekday AVOs on different levels of geographic areas and roadway types were developed as part of this dissertation. The models show that such socioeconomic factors as income, vehicle ownership, and employment have a significant impact on AVOs.
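The bias adjustment described above amounts to reweighting occupancies observed in accident records so that the driver mix matches the general driving population. A minimal sketch of that idea for a single bias dimension (driver age) follows; every share and occupancy value is illustrative, not taken from the dissertation.

```python
# Hedged sketch: correct accident-record AVO for driver-age bias by
# reweighting each age group to its share of the general driving population.
# All shares and occupancies below are illustrative, not from the study.
accident_share   = {"16-24": 0.28, "25-64": 0.60, "65+": 0.12}  # in crash records
population_share = {"16-24": 0.15, "25-64": 0.70, "65+": 0.15}  # licensed drivers
avo_by_group     = {"16-24": 1.45, "25-64": 1.28, "65+": 1.35}  # observed AVO

raw_avo = sum(accident_share[g] * avo_by_group[g] for g in avo_by_group)
adjusted_avo = sum(population_share[g] * avo_by_group[g] for g in avo_by_group)
print(f"raw AVO: {raw_avo:.3f}, bias-adjusted AVO: {adjusted_avo:.3f}")
```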
Abstract:
The attempts at carrying out terrorist attacks have become more prevalent. As a result, an increasing number of countries have become particularly vigilant against the means by which terrorists raise funds to finance their heinous acts against human life and property. Among the many counter-terrorism agencies in operation, governments have set up financial intelligence units (FIUs) within their borders for the purpose of tracking down terrorists’ funds. By investigating reported suspicious transactions, FIUs attempt to weed out financial criminals who use these illegal funds to finance terrorist activity. The prominent role played by FIUs means that their performance is always under the spotlight. By interviewing experts and conducting surveys of those associated with the fight against financial crime, this study investigated perceptions of FIU performance on a comparative basis between American and non-American FIUs. The target group of experts included financial institution personnel, civilian agents, law enforcement personnel, academicians, and consultants. Questions for the interviews and surveys were based on Kaplan and Norton’s Balanced Scorecard (BSC) methodology. One of the objectives of this study was to help determine the suitability of the BSC to this arena. While the FIUs in this study have concentrated on performance by measuring outputs such as the number of suspicious transaction reports investigated, this study calls for a focus on outcomes involving all the parties responsible for financial criminal investigations. It is only through such an integrated approach that these various entities will be able to improve performance in solving financial crime. Experts in financial intelligence strongly believed that the quality and timeliness of intelligence were more important than keeping track of the number of suspicious transaction reports. Finally, this study concluded that the BSC can be appropriately applied to the arena of financial crime prevention, even though the emphasis there is markedly different from that in the private sector. While priority in the private sector is given to financial outcomes, in this arena employee growth and internal processes were perceived as most important in achieving a satisfactory outcome.
Abstract:
Global connectivity, for anyone, at any place, at any time, providing high-speed, high-quality, and reliable communication channels for mobile devices, is now becoming a reality. The credit mainly goes to the recent technological advances in wireless communications, which comprise a wide range of technologies, services, and applications designed to fulfill the particular needs of end-users in different deployment scenarios (Wi-Fi, WiMAX, and 3G/4G cellular systems). In such a heterogeneous wireless environment, one of the key ingredients for providing efficient ubiquitous computing with guaranteed quality and continuity of service is the design of intelligent handoff algorithms. Traditional single-metric handoff decision algorithms, such as those based on Received Signal Strength (RSS), are not efficient and intelligent enough to minimize the number of unnecessary handoffs, decision delays, and call-dropping and/or blocking probabilities. This research presented a novel approach to the design and implementation of a multi-criteria vertical handoff algorithm for heterogeneous wireless networks. Several parallel Fuzzy Logic Controllers were utilized in combination with different types of ranking algorithms and metric weighting schemes to implement two major modules: the first module estimated the necessity of handoff, and the second selected the best network as the target of handoff. Simulations based on different traffic classes and utilizing various types of wireless networks were carried out on a wireless test-bed inspired by the concept of the Rudimentary Network Emulator (RUNE). Simulation results indicated that the proposed scheme provided better performance in terms of minimizing unnecessary handoffs and the call-dropping, call-blocking, and handoff-blocking probabilities. When subjected to conversational traffic and compared against the RSS-based reference algorithm, the proposed scheme, utilizing the FTOPSIS ranking algorithm, was able to reduce the average outage probability of MSs moving at high speeds by 17%, the new-call blocking probability by 22%, the handoff blocking probability by 16%, and the average handoff rate by 40%. The significant reduction in the resulting handoff rate gives the MS more efficient power consumption and longer battery life. These figures indicate a higher probability of guaranteed session continuity and quality for the currently utilized service, resulting in higher user satisfaction levels.
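One of the ranking algorithms named above, TOPSIS, is concrete enough to sketch: candidate networks are scored on weighted criteria and ranked by closeness to an ideal alternative. The sketch below is a generic (non-fuzzy) TOPSIS; the candidate networks, criteria, and weights are illustrative assumptions, not the values used in this research.

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.

    scores:  (n_networks, n_criteria) decision matrix
    weights: criteria weights summing to 1
    benefit: True where larger is better (e.g. bandwidth), False where
             smaller is better (e.g. delay, cost)."""
    norm = scores / np.linalg.norm(scores, axis=0)       # vector-normalise
    v = norm * weights                                   # weight the criteria
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_best + d_worst)                  # closeness in [0, 1]

# Candidates: Wi-Fi, WiMAX, 3G; criteria: bandwidth, delay, cost (illustrative).
scores = np.array([[54.0, 120.0, 0.1],
                   [40.0,  80.0, 0.4],
                   [ 2.0,  60.0, 0.6]])
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, False, False])
print(topsis(scores, weights, benefit))  # highest closeness = handoff target
```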
Abstract:
Annual Average Daily Traffic (AADT) is a critical input to many transportation analyses. By definition, AADT is the average 24-hour volume at a highway location over a full year. Traditionally, AADT is estimated using a mix of permanent and temporary traffic counts. Because field collection of traffic counts is expensive, it is usually done only for the major roads, leaving most local roads without any AADT information. However, AADTs are needed for local roads in many applications. For example, AADTs are used by state Departments of Transportation (DOTs) to calculate the crash rates of all local roads in order to identify the top five percent of hazardous locations for annual reporting to the U.S. DOT. This dissertation develops a new method for estimating AADTs for local roads using travel demand modeling. A major component of the new method is a parcel-level trip generation model that estimates the trips generated by each parcel. The model uses tax parcel data together with the trip generation rates and equations provided by the ITE Trip Generation Report. The generated trips are then distributed to existing traffic count sites using a parcel-level trip distribution gravity model. The all-or-nothing assignment method is then used to assign the trips onto the roadway network to estimate the final AADTs. The entire process was implemented in the Cube demand modeling system with extensive spatial data processing in ArcGIS. To evaluate the performance of the new method, data from several study areas in Broward County, Florida were used. The estimated AADTs were compared with those from two existing methods, using actual traffic counts as ground truth. The results show that the new method performs better than both existing methods. One limitation of the new method is its reliance on Cube, which limits the number of zones to 32,000. Accordingly, a study area exceeding this limit must be partitioned into smaller areas. Because AADT estimates for roads near the boundary areas were found to be less accurate, further research could examine the best way to partition a study area so as to minimize this impact.
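The trip distribution step named above is typically a doubly constrained gravity model: trips produced by each parcel are spread over destinations in proportion to attraction and in inverse relation to travel cost, then iteratively balanced so row and column totals match. A minimal sketch follows; the impedance function, decay parameter, and toy volumes are illustrative assumptions, not the dissertation's calibration.

```python
import numpy as np

def gravity_distribution(productions, attractions, cost, beta=0.1, n_iter=50):
    """Doubly constrained gravity model with exponential cost decay,
    balanced by alternating row/column scaling (Furness method)."""
    f = np.exp(-beta * cost)                 # impedance: trips decay with cost
    a = np.ones(len(productions))
    b = np.ones(len(attractions))
    for _ in range(n_iter):                  # alternate row/column balancing
        a = 1.0 / (f @ (b * attractions))
        b = 1.0 / (f.T @ (a * productions))
    # T_ij = a_i P_i * b_j A_j * f_ij; rows sum to P, columns sum to A.
    return (a * productions)[:, None] * (b * attractions)[None, :] * f

# 3 parcels producing trips, 2 traffic-count sites attracting them (toy data).
P = np.array([100.0, 250.0, 50.0])
A = np.array([240.0, 160.0])
cost = np.array([[5.0, 12.0],
                 [8.0,  4.0],
                 [15.0, 6.0]])
T = gravity_distribution(P, A, cost)
print(T.round(1), T.sum(axis=1), T.sum(axis=0))  # rows ~ P, cols ~ A
```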
Abstract:
This case study explores intervention strategies for improving the social capital of ninth-grade students so that they can gain perspective on their grade point averages.
Abstract:
The effects of shade on benthic calcareous periphyton were tested in a short-hydroperiod oligotrophic subtropical wetland (freshwater Everglades). The experiment was a split-plot design set in three sites with similar environmental characteristics. At each site, eight randomly selected 1-m² areas were each isolated individually in a shade house, which did not spectrally change the incident irradiance but reduced it quantitatively by 0, 30, 50, 60, 70, 80, 90 and 98%. Periphyton mat was sampled monthly under each shade house for a 5-month period while the wetland was flooded. Periphyton was analyzed for thickness, DW, AFDW and chlorophyll a (chl a), and incubated in light and dark BOD bottles at five different irradiances to assess its photosynthesis–irradiance (PI) curve and respiration. The PI curve parameters P_max, I_k and, where applicable, the photoinhibition slope (β) were determined by non-linear regression analyses. Taxonomic composition and total algal biovolume were determined at the end of the experiment. The periphyton composition did not change with shade, but the PI curves were significantly affected by it. I_k increased linearly with increasing percent irradiance transmittance (%IT = 1 − %shade). P_max could itself be fitted with a PI-curve equation, as it increased with %IT and leveled off above 10% IT. For each shade level, the PI curve was used to integrate daily photosynthesis for a day of average irradiance. The daily photosynthesis followed a PI-curve equation with the same characteristics as P_max vs. %IT. Thus, periphyton exhibited high irradiance plasticity under 0–80% shade but could not maintain the same photosynthetic level at higher shade, causing a decrease in daily GPP at the 98% shade level. The plasticity was linked to an increase in the chl a content per cell at 60–80% shade, while this increase was not observed at lower shade, likely because it was too demanding energetically. Thus, chl a is not a good metric for periphyton biomass assessment across variously shaded habitats. It is also hypothesized that irradiance plasticity is linked to photosynthetic coupling between differently composed algal layers, arranged vertically within the periphyton mats, that have different PI curves.
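The abstract does not name the PI equation that was fitted, so the sketch below uses one common choice, the Jassby–Platt hyperbolic-tangent model P = P_max · tanh(I/I_k), to show both the non-linear fit and the daily integration step. The incubation data and the sinusoidal daily irradiance are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def pi_curve(I, p_max, i_k):
    """Jassby-Platt form P = P_max * tanh(I / I_k), one common PI model;
    the abstract does not say which equation the authors fitted."""
    return p_max * np.tanh(I / i_k)

# Toy light/dark-bottle incubation data at five irradiances (illustrative).
irradiance = np.array([25.0, 100.0, 400.0, 800.0, 1600.0])  # umol photons m-2 s-1
photosynthesis = np.array([0.9, 3.1, 6.8, 7.6, 7.9])        # mg O2 g AFDW-1 h-1

(p_max, i_k), _ = curve_fit(pi_curve, irradiance, photosynthesis, p0=[8.0, 200.0])
print(f"P_max = {p_max:.2f}, I_k = {i_k:.1f}")

# Integrate daily photosynthesis over a sinusoidal 12 h irradiance curve,
# mirroring the abstract's "day of average irradiance" step.
hours = np.linspace(0.0, 12.0, 241)
I_day = 1800.0 * np.sin(np.pi * hours / 12.0)               # peaks at midday
dt = hours[1] - hours[0]
daily_gpp = np.sum(pi_curve(I_day, p_max, i_k)) * dt
print(f"daily photosynthesis ~ {daily_gpp:.1f} mg O2 g AFDW-1 d-1")
```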
Abstract:
The growth of criminal gangs and organized crime groups has created unprecedented challenges in Central America. Homicide rates are among the highest in the world, countries spend on average close to 10 percent of GDP to respond to the challenges of public insecurity, and the security forces are frequently overwhelmed and at times co-opted by the criminal groups they are increasingly tasked to counter. With some 90 percent of the 700 metric tons of cocaine trafficked from South America to the United States passing through Central America, the lure of aiding illegal traffickers through the provision of arms or intelligence, or by simply withholding or delaying the use of force, is enormous. These conditions raise the question: to what extent are militaries in Central America compromised by illicit ties to criminal groups? The study focuses on three cases: Nicaragua, El Salvador, and Honduras. It finds that although illicit ties between the military and criminal groups have grown in the last decade, militaries in these countries are not yet “lost” to criminal groups. Supplying criminal groups with light arms from military stocks is typical and on the rise, but still not common. In general, the less exposed services, the navies and air forces, are the most reliable and effective in their interdiction roles. Of the three countries in the study, the Honduran military is the most worrying because it is embedded in a context where civilian corruption is extremely common, state institutions are notoriously weak, and the political system remains polarized and lacks the popular legitimacy and political will needed to make necessary reforms. Overall, the armed forces in the three countries remain less compromised than their civilian peers, particularly the police. However, in the worsening crime and insecurity context, there is a limited window of opportunity in which to introduce measures targeted toward the military, and such efforts can only succeed if opportunities for corruption in other sectors of the state, in particular in law enforcement and the justice system, are also addressed. Measures targeted toward the military should include: enhanced material benefits and professional education opportunities that open doors to promising legitimate careers for soldiers once they leave military service; a clear system of rewards and punishments specifically designed to deter collusion with criminal groups; more effective securing of military arsenals; and skills and external oversight leveraged through combined operations, to build cooperation among those sectors of the military that have successful and clean records in countering criminal groups and to expose weaker forces to effective best practices.
Abstract:
The ability to listen and comprehend the intrinsic meaning behind the words people are saying is an important skill for those in the hospitality industry. The author provides some prescriptions for “winning friends and influencing people.”
Abstract:
In their dialogue - An Analysis of Stock Market Performance: The Dow Jones Industrial Average and the Three Top Performing Lodging Firms 1982 – 1988 - by N. H. Ringstrom, Professor and Elisa S. Moncarz, Associate Professor, School of Hospitality Management at Florida International University, Professors Ringstrom and Moncarz state at the outset: “An interesting comparison can be made between the Dow Jones Industrial Average and the three top performing, publicly held lodging firms which had $100 million or more in annual lodging revenues. The authors provide that analytical comparison with Prime Motor Inns Inc., the Marriott Corporation, and Hilton Hotels Corporation.” “Based on a criterion of size, only those with $100 million in annual lodging revenues or more resulted in the inclusion of the following six major hotel firms: Prime Motor Inns, Inc., Marriott Corporation, Hilton Hotels Corporation, Ramada Inc., Holiday Corporation and La Quinta Motor Inns, Inc.,” say Professors Ringstrom and Moncarz in framing this discussion with its underpinnings in the years 1982 to 1988. The article looks at each company’s fiscal and Dow Jones performance for the years in question, and presents a detailed analysis of said performance. Graphic analysis is included. It helps to have a fairly thorough knowledge of stock market and fiscal analysis criteria to digest this material. The Ringstrom and Moncarz analysis of Prime Motor Inns Incorporated by itself occupies the first seven pages of the article. Marriott Corporation also occupies a prominent position in this discussion. “Marriott, a giant in the hospitality industry, is huge and continuing to grow. Its 1987 sales were more than $6.5 billion, and its employees numbered over 200,000 individuals, which places Marriott among the 10 largest private employers in the country,” Ringstrom and Moncarz parse Marriott’s influence as a significant financial player. “The firm has a fantastic history of growth over the past 60 years, starting in May 1927 with a nine-seat A & W Root Beer stand in Washington, D.C.,” offer the authors in initiating Marriott’s portion of the discussion with a brief history lesson. The Marriott firm was officially incorporated as Hot Shoppes Inc. in 1929. As the thesis statement for the discussion suggests, the performance of these hospitality giants is compared and contrasted directly with the Dow Jones Industrial Average performance. Reasons and empirical data are offered by the authors to explain the distinctions. It would be difficult to explain those distinctions without delving deeply into corporate financial history, and the authors willingly do so in an effort to help you understand the growth, as well as some of the setbacks, of these hospitality-based juggernauts. Ringstrom and Moncarz conclude the article with an extensive overview and analysis of the Hilton Hotels Corporation performance for the period outlined. It may well be the most fiscally dynamic of the firms presented for your perusal. “It is interesting to note that Hilton Hotels Corporation maintained a very strong financial position with relatively little debt during the years 1982-1988…the highest among all companies in the study,” the authors note.
Abstract:
Orthogonal Frequency-Division Multiplexing (OFDM) has proved to be a promising technology for enabling the transmission of higher data rates. Multicarrier Code-Division Multiple Access (MC-CDMA) is a transmission technique that combines the advantages of both OFDM and Code-Division Multiple Access (CDMA), so as to allow high transmission rates over severely time-dispersive multipath channels without the need for a complex receiver implementation. MC-CDMA also exploits frequency diversity via the different subcarriers, and therefore allows high-code-rate systems to achieve good Bit Error Rate (BER) performance. Furthermore, the spreading in the frequency domain makes the time-synchronization requirement much looser than in traditional direct-sequence CDMA schemes. There are still some problems with MC-CDMA, however. One is the high Peak-to-Average Power Ratio (PAPR) of the transmit signal. High PAPR leads to nonlinear distortion in the amplifier and results in inter-carrier self-interference plus out-of-band radiation. On the other hand, suppressing the Multiple Access Interference (MAI) is another crucial problem in MC-CDMA systems. Imperfect cross-correlation characteristics of the spreading codes and multipath fading destroy the orthogonality among the users and thereby cause MAI, which produces serious BER degradation in the system. Moreover, in an uplink system the received signals at a base station are always asynchronous. This also destroys the orthogonality among the users and hence generates MAI, which degrades the system performance. Besides those two problems, interference in general must be considered seriously for any communication system. In this dissertation, we design a novel MC-CDMA system which has low PAPR and mitigated MAI. New semi-blind channel estimation and multi-user data detection based on Parallel Interference Cancellation (PIC) have been applied in the system. Low-Density Parity-Check (LDPC) codes have also been introduced into the system to improve its performance. Different interference models are analyzed for multi-carrier communication systems, and effective interference suppression for MC-CDMA systems is then employed in this dissertation. The experimental results indicate that our system not only significantly reduces the PAPR and MAI but also effectively suppresses outside interference with low complexity. Finally, we present a practical cognitive application of the proposed system on a software-defined radio platform.
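PIC, the detection scheme named above, re-estimates every user's symbol in parallel after subtracting the other users' reconstructed signals. The sketch below shows hard-decision PIC for a toy synchronous spread-spectrum uplink with a flat channel; the dissertation's actual receiver is semi-blind and operates per subcarrier, and the codes, user count, and noise level here are illustrative.

```python
import numpy as np

def pic_detect(r, codes, n_stages=3):
    """Hard-decision Parallel Interference Cancellation for a synchronous
    spread-spectrum uplink with a flat channel (a simplification of the
    system described above).

    r:     (L,) received chip/subcarrier vector
    codes: (K, L) spreading codes, one row per user."""
    L = codes.shape[1]
    b = np.sign(codes @ r / L)                 # stage 0: matched-filter bits
    for _ in range(n_stages):
        recon = codes * b[:, None]             # each user's reconstructed signal
        total = recon.sum(axis=0)
        # For every user, cancel all *other* users' signals in parallel.
        cleaned = r[None, :] - (total[None, :] - recon)
        b = np.sign((codes * cleaned).sum(axis=1) / L)
    return b

rng = np.random.default_rng(7)
K, L = 4, 16
codes = rng.choice([-1.0, 1.0], size=(K, L))   # random binary spreading codes
bits = rng.choice([-1.0, 1.0], size=K)
r = codes.T @ bits + rng.normal(0.0, 0.5, L)   # synchronous superposition + noise
print("sent:    ", bits)
print("detected:", pic_detect(r, codes))
```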