914 results for Local Variation Method
Abstract:
Freeway systems are becoming more congested each day. One contributor to freeway traffic congestion is platoons of on-ramp traffic merging into freeway mainlines. As a relatively low-cost countermeasure to the problem, ramp meters are being deployed in both directions of an 11-mile section of I-95 in Miami-Dade County, Florida. The local Fuzzy Logic (FL) ramp metering algorithm implemented in Seattle, Washington, has been selected for deployment. The FL ramp metering algorithm is powered by the Fuzzy Logic Controller (FLC). The FLC depends on a series of parameters that can significantly alter the behavior of the controller, thus affecting the performance of ramp meters. However, the most suitable values for these parameters are often difficult to determine, as they vary with current traffic conditions. Thus, for optimum performance, the parameter values must be fine-tuned. This research presents a new method of fine-tuning the FLC parameters using Particle Swarm Optimization (PSO). PSO attempts to optimize several important parameters of the FLC. The objective function of the optimization model incorporates the METANET macroscopic traffic flow model to minimize delay time, subject to the constraints of reasonable ranges of ramp metering rates and FLC parameters. To further improve performance, a short-term traffic forecasting module using a discrete Kalman filter was incorporated to predict downstream freeway mainline occupancy, which helps to detect the presence of downstream bottlenecks. The CORSIM microscopic simulation model was selected as the platform to evaluate the performance of the proposed PSO tuning strategy. The ramp-metering algorithm incorporating the tuning strategy was implemented using CORSIM's run-time extension (RTE) and was tested on the aforementioned I-95 corridor. The performance of the FLC with PSO tuning was compared with the performance of the existing FLC without PSO tuning. The results show that the FLC with PSO tuning outperforms the existing FL metering, fixed-time metering, and existing conditions without metering in terms of total travel time savings, average speed, and system-wide throughput.
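Below is a minimal, illustrative sketch of the kind of PSO loop such a tuning strategy relies on. The real objective function evaluates delay through the METANET model with candidate FLC parameters; here `evaluate_delay`, the four-parameter search space, and the bounds are placeholders, not values from the study.

```python
# A minimal sketch of a PSO parameter-tuning loop (illustrative only).
import numpy as np

def evaluate_delay(params):
    # Placeholder objective: in the study this would run the METANET
    # macroscopic model with the candidate FLC parameters and return total
    # delay. Here we simply score distance from an arbitrary target vector.
    target = np.array([0.5, 0.3, 0.8, 0.6])
    return float(np.sum((params - target) ** 2))

def pso_tune(bounds, n_particles=20, n_iter=100, w=0.7, c1=1.5, c2=1.5):
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(bounds)
    rng = np.random.default_rng(0)
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([evaluate_delay(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        # Standard velocity update: inertia + cognitive + social terms.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)   # keep parameters within their ranges
        vals = np.array([evaluate_delay(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# Four illustrative FLC parameters, each constrained to [0, 1].
bounds = np.array([[0.0, 1.0]] * 4)
best_params, best_delay = pso_tune(bounds)
print(best_params, best_delay)
```

In the actual workflow the constraint handling (the clip step) would mirror the "reasonable ranges" of metering rates and FLC parameters mentioned in the abstract.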
Abstract:
This dissertation develops a process improvement method for service operations based on the Theory of Constraints (TOC), a management philosophy that has been shown to be effective in manufacturing for decreasing work-in-process (WIP) and improving throughput. While TOC has enjoyed much attention and success in the manufacturing arena, its application to services in general has been limited. The contribution to industry and knowledge is a method for improving global performance measures based on TOC principles. The method proposed in this dissertation is tested using discrete event simulation based on the scenario of the service factory of airline turnaround operations. To evaluate the method, a simulation model of aircraft turn operations of a U.S.-based carrier was built and validated using actual data from airline operations. The model was then adjusted to reflect an application of the Theory of Constraints for determining how to deploy the scarce resource of ramp workers. The results indicate that, given slight modifications to TOC terminology and the development of a method for constraint identification, the Theory of Constraints can be applied with success to services. Bottlenecks in services must be defined as those processes for which the process rates and amount of work remaining are such that completing the process will not be possible without an increase in the process rate. The bottleneck ratio is used to determine to what degree a process is a constraint. Simulation results also suggest that redefining performance measures to reflect a global business perspective (reducing costs related to specific flights) rather than the local-optimum operational approach of turning all aircraft quickly results in significant savings to the company. Simulated savings to the airline's annual operating costs equaled 30% of current expenses for misconnecting passengers, with a modest increase in worker utilization achieved through a more efficient heuristic that deploys workers to the highest-priority tasks. This dissertation contributes to the literature on service operations by describing a dynamic, adaptive dispatch approach to manage service factory operations similar to airline turnaround operations using the management philosophy of the Theory of Constraints.
Abstract:
The elemental (C, N, and P) and isotope (δ13C, δ15N) contents of leaves of the seagrasses Thalassia testudinum, Halodule wrightii, and Syringodium filiforme were measured across a 10,000 km² survey of the seagrass communities of South Florida, USA, in 1999 and 2000. Trends at local and broad spatial scales were compared to examine interspecific variation in the seagrass characteristics often used as ecological indicators. The elemental and stable isotope contents of all species were variable and demonstrated marked interspecific variation. At broad spatial scales, mean N:P ratios were lowest for T. testudinum (36.5 ± 1.1) and S. filiforme (38.9 ± 1.3), and highest for H. wrightii (44.1 ± 1.8). Stable carbon isotope ratios (δ13C) were highest for S. filiforme (–6.2 ± 0.2‰), intermediate for T. testudinum (–8.6 ± 0.2‰), and lowest for H. wrightii (–10.6 ± 0.3‰). Stable nitrogen isotopes (δ15N) were heaviest for T. testudinum (2.0 ± 0.1‰), and lightest for H. wrightii (1.0 ± 0.3‰) and S. filiforme (1.6 ± 0.2‰). Site depth was negatively correlated with δ13C for all species, while δ15N was positively correlated with depth for H. wrightii and S. filiforme. Similar trends were observed in local comparisons, suggesting that taxon-specific physiological/ecological properties strongly control interspecific variation in elemental and stable isotope content. Temporal trends in δ13C were measured and revealed that interspecific variation was displayed throughout the year. This work documents interspecific variation in the nutrient dynamics of 3 common seagrasses in South Florida, indicating that interpretation of elemental and stable isotope values needs to be species specific.
Abstract:
The abundance of calcareous green algae was recorded quarterly at 28 sites within the Florida Keys National Marine Sanctuary (FKNMS) for a period of 7 years as part of a sea grass monitoring program. To evaluate the validity of using the functional-form group approach, we designed a sampling method that included the functional-form group and the component genera. This strategy enabled us to analyze the spatiotemporal patterns in the abundance of calcareous green algae as a group and to describe synchronous behavior among its genera through the application of a nonlinear regression model to both categories of data. Spatial analyses revealed that, in general, all genera displayed long-term trends of increasing abundance at most sites; however, at some sites the long-term trends for genera opposed one another. Strong synchrony in the timing of seasonal changes was found among all genera, possibly reflecting similar reproductive and seasonal growth patterns, but the variability in the magnitude of seasonal changes was very high among genera and sites. No spatial patterns were found in long-term or seasonal changes; the only significant relation detected was for slope, with sites closer to land showing higher values, suggesting that some factors associated with land proximity are affecting this increase. We conclude that the abundances of genera behaved differently from the functional-form group, indicating that the use of the functional-form group approach may be unsuitable to detect changes in sea grass community structure in the FKNMS at the existing temporal and spatial scale of the monitoring program.
Abstract:
Annual Average Daily Traffic (AADT) is a critical input to many transportation analyses. By definition, AADT is the average 24-hour volume at a highway location over a full year. Traditionally, AADT is estimated using a mix of permanent and temporary traffic counts. Because field collection of traffic counts is expensive, it is usually done for only the major roads, thus leaving most of the local roads without any AADT information. However, AADTs are needed for local roads for many applications. For example, AADTs are used by state Departments of Transportation (DOTs) to calculate the crash rates of all local roads in order to identify the top five percent of hazardous locations for annual reporting to the U.S. DOT. This dissertation develops a new method for estimating AADTs for local roads using travel demand modeling. A major component of the new method involves a parcel-level trip generation model that estimates the trips generated by each parcel. The model uses the tax parcel data together with the trip generation rates and equations provided by the ITE Trip Generation Report. The generated trips are then distributed to existing traffic count sites using a parcel-level trip distribution gravity model. The all-or-nothing assignment method is then used to assign the trips onto the roadway network to estimate the final AADTs. The entire process was implemented in the Cube demand modeling system with extensive spatial data processing using ArcGIS. To evaluate the performance of the new method, data from several study areas in Broward County in Florida were used. The estimated AADTs were compared with those from two existing methods using actual traffic counts as the ground truths. The results show that the new method performs better than both existing methods. One limitation of the new method is that it relies on Cube, which limits the number of zones to 32,000. Accordingly, a study area exceeding this limit must be partitioned into smaller areas. Because AADT estimates for roads near the boundary areas were found to be less accurate, further research could examine the best way to partition a study area to minimize the impact.
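A small sketch of the first two steps of such a method (parcel-level trip generation from ITE-style rates, followed by gravity-model distribution to existing count sites) is given below. The land uses, rates, attractions, and distance-decay function are illustrative assumptions; the study itself used the ITE Trip Generation Report, Cube, and ArcGIS and completed the chain with an all-or-nothing network assignment, which is not reproduced here.

```python
# Illustrative parcel-level trip generation and gravity distribution.
import numpy as np

# Hypothetical parcels: (land-use type, size in dwelling units or 1000 sq ft).
parcels = [("single_family", 12), ("retail", 8.5), ("office", 20)]
# Illustrative trip rates per unit; the study used ITE rates and equations.
ite_rates = {"single_family": 9.4, "retail": 42.7, "office": 9.7}

# Step 1: trip generation per parcel.
productions = np.array([ite_rates[use] * size for use, size in parcels])

# Hypothetical attractions at three existing count sites and the
# parcel-to-site distance matrix (miles).
attractions = np.array([500.0, 300.0, 200.0])
dist = np.array([[1.2, 3.5, 2.0],
                 [2.8, 0.9, 4.1],
                 [3.0, 2.2, 1.1]])

# Step 2: gravity-model distribution with a simple inverse-power
# friction function f(d) = d ** -2 (a stand-in for calibrated factors).
friction = dist ** -2.0
weights = attractions * friction
prob = weights / weights.sum(axis=1, keepdims=True)
trips = productions[:, None] * prob   # parcel-to-count-site trip table

print(trips.round(1))
```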
Abstract:
Variation and uncertainty in estimated evaporation was determined over time and between two locations in Florida Bay, a subtropical estuary. Meteorological data were collected from September 2001 to August 2002 at Rabbit Key and Butternut Key within the Bay. Evaporation was estimated using both vapor flux and energy budget methods. The results were placed into a long-term context using 33 years of temperature and rainfall data collected in south Florida. Evaporation also was estimated from this long-term data using an empirical formula relating evaporation to clear sky solar radiation and air temperature. Evaporation estimates for the 12-mo period ranged from 144 to 175 cm yr⁻¹, depending on location and method, with an average of 163 cm yr⁻¹ (± 9%). Monthly values ranged from 9.2 to 18.5 cm, with the highest value observed in May, corresponding with the maximum in measured net radiation. Uncertainty estimates derived from measurement errors in the data were as much as 10%, and were large enough to obscure differences in evaporation between the two sites. Differences among all estimates for any month indicate the overall uncertainty in monthly evaporation, and ranged from 9% to 26%. Over a 33-yr period (1970–2002), estimated annual evaporation from Florida Bay ranged from 148 to 181 cm yr⁻¹, with an average of 166 cm yr⁻¹. Rainfall was consistently lower in Florida Bay than evaporation, with a long-term average of 106 cm yr⁻¹. Rainfall considered alone was uncorrelated with evaporation at both monthly and annual time scales; when the seasonal variation in clear sky radiation was also taken into account both net radiation and evaporation were significantly suppressed in months with high rainfall.
Abstract:
Precipitation data collected from five sites in south Florida indicate a strong seasonal and spatial variation in δ18O and δD, despite the relatively limited geographic coverage and low-lying elevation of each of the collection sites. Based upon the weighted-mean stable isotope values, the sites were classified as coastal Atlantic, inland, and lower Florida Keys. The coastal Atlantic sites had weighted-mean values of δ18O and δD of −2.86‰ and −12.8‰, respectively, and exhibited a seasonal variation with lower δ18O and δD values in the summer wet-season precipitation (δ18O = −3.38‰, δD = −16.5‰) as compared to the winter-time precipitation (δ18O = −1.66‰, δD = −3.2‰). The inland site was characterized as having the highest d-excess value (+13.3‰), signifying a contribution of evaporated Everglades surface water to the local atmospheric moisture. In spite of its lower latitude, the lower Keys site located at Long Key had the lowest weighted-mean stable isotope values (δ18O = −3.64‰, δD = −20.2‰) as well as the lowest d-excess value (+8.8‰). The lower δD and δ18O values observed at the Long Key site reflect the combined effects of oceanic vapor source, fractionation due to local precipitation, and slower equilibration of the larger raindrops nucleated by a maritime aerosol. Very low δ18O and δD values (δ18O < −6‰, δD < −40‰) were observed just prior to the passage of hurricanes from the Gulf of Mexico as well as during cold fronts from the north-west. These results suggest that an oceanic vapor source region to the west may be responsible for the extremely low δD and δ18O values observed during some tropical storms and cold fronts.
Abstract:
This study examines the performance of two geomagnetic index series and of series synthesized from a semi-empirical model of magnetospheric currents in explaining the geomagnetic activity observed at Northern Hemisphere mid-latitude ground-based stations. We analyse data for the 2007 to 2014 period from four magnetic observatories (Coimbra, Portugal; Panagyurishte, Bulgaria; Novosibirsk, Russia; and Boulder, USA), at geomagnetic latitudes between 40° and 50° N. The quiet daily (QD) variation is first removed from the time series of the geomagnetic horizontal component (H) using natural orthogonal components (NOC) tools. We compare the resulting series with series of the storm-time disturbance (Dst) and ring current (RC) indices and with H series synthesized from the Tsyganenko and Sitnov (2005, doi:10.1029/2004JA010798) (TS05) semi-empirical model of the storm-time geomagnetic field. In the analysis, we separate days with low and high local K-index values. Our results show that NOC models are as efficient as standard models of QD variation in preparing raw data for comparison with proxies, but with much less complexity. For the two stations in Europe, we obtain an indication that NOC models could be able to separate ionospheric and magnetospheric contributions. Dst and RC series explain the four observatory H-series successfully, with mean significant correlation coefficients from 0.5 to 0.6 during low geomagnetic activity (K less than 4) and from 0.6 to 0.7 for geomagnetically active days (K greater than or equal to 4). With regard to the performance of TS05, our results show that the four observatories separate into two groups: Coimbra and Panagyurishte, in one group, for which the magnetospheric/ionospheric ratio in the QD variation is smaller, a dominantly ionospheric QD contribution can be removed, and TS05 simulations are the best proxy; and Boulder and Novosibirsk, in the other group, for which the ionospheric and magnetospheric contributions to the QD variation cannot be differentiated and correlations with TS05 series cannot be improved. The main contributors to the magnetospheric QD signal are Birkeland currents. The relatively good success of the TS05 model in explaining ground-based irregular geomagnetic activity at mid-latitudes makes it an effective tool to classify storms according to their main sources. For Coimbra and Panagyurishte in particular, where ionospheric and magnetospheric daily contributions seem easier to separate, we can aspire to use the TS05 model for ensemble generation in space weather (SW) forecasting and for the interpretation of past SW events.
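As an illustration of the quiet-day removal step, the sketch below arranges hourly H values into a day-by-hour matrix, extracts the leading natural orthogonal components (computed here with an SVD, i.e., an EOF/PCA-style decomposition), subtracts the reconstructed QD variation, and correlates the residual with a Dst-like series. All data in the sketch are synthetic, and the number of retained components is an arbitrary choice; this is not the processing chain used in the study.

```python
# Synthetic illustration of NOC-style quiet-day removal and proxy correlation.
import numpy as np

rng = np.random.default_rng(1)
n_days, n_hours = 120, 24
hours = np.arange(n_hours)

# Synthetic H: a diurnal (QD-like) wave plus storm-like disturbances and noise.
qd = 20 * np.sin(2 * np.pi * hours / 24)
storm = -50 * (rng.random((n_days, 1)) < 0.1) * rng.random((n_days, n_hours))
H = qd + storm + rng.normal(0, 2, (n_days, n_hours))

# NOC/EOF-style decomposition of the day-by-hour matrix via SVD.
H_mean = H.mean(axis=0)
U, S, Vt = np.linalg.svd(H - H_mean, full_matrices=False)
k = 2                                      # keep the leading components as the QD model
qd_model = H_mean + (U[:, :k] * S[:k]) @ Vt[:k]

residual = (H - qd_model).ravel()          # irregular (storm-time) part

# Correlate with a synthetic Dst-like series of the same length.
dst = storm.ravel() + rng.normal(0, 5, residual.size)
r = np.corrcoef(residual, dst)[0, 1]
print(f"correlation with Dst-like proxy: {r:.2f}")
```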
Abstract:
An emerging approach to downscaling the projections from General Circulation Models (GCMs) to scales relevant for basin hydrology is to use output of GCMs to force higher-resolution Regional Climate Models (RCMs). With spatial resolution often in the tens of kilometers, however, even RCM output will likely fail to resolve local topography that may be climatically significant in high-relief basins. Here we develop and apply an approach for downscaling RCM output using local topographic lapse rates (empirically-estimated spatially and seasonally variable changes in climate variables with elevation). We calculate monthly local topographic lapse rates from the 800-m Parameter-elevation Regressions on Independent Slopes Model (PRISM) dataset, which is based on regressions of observed climate against topographic variables. We then use these lapse rates to elevationally correct two sources of regional climate-model output: (1) the North American Regional Reanalysis (NARR), a retrospective dataset produced from a regional forecasting model constrained by observations, and (2) a range of baseline climate scenarios from the North American Regional Climate Change Assessment Program (NARCCAP), which is produced by a series of RCMs driven by GCMs. By running a calibrated and validated hydrologic model, the Soil and Water Assessment Tool (SWAT), using observed station data and elevationally-adjusted NARR and NARCCAP output, we are able to estimate the sensitivity of hydrologic modeling to the source of the input climate data. Topographic correction of regional climate-model data is a promising method for modeling the hydrology of mountainous basins for which no weather station datasets are available or for simulating hydrology under past or future climates.
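The elevation correction itself reduces to a simple operation: adjust each regional climate-model value by the local lapse rate times the elevation difference between the model grid cell and the target point. The sketch below illustrates this for temperature; the lapse rate and elevations are hypothetical values, not numbers derived from the PRISM analysis.

```python
# Illustrative lapse-rate elevation correction of RCM temperature output.
def correct_temperature(t_rcm_c, elev_rcm_m, elev_target_m, lapse_c_per_km):
    """Adjust an RCM temperature to a target elevation.

    lapse_c_per_km is the local topographic lapse rate (change in
    temperature per km of elevation gain), estimated month by month
    from a high-resolution climatology such as PRISM.
    """
    dz_km = (elev_target_m - elev_rcm_m) / 1000.0
    return t_rcm_c + lapse_c_per_km * dz_km

# Example: an RCM cell averaged at 1500 m, a point at 2300 m, and a
# January lapse rate of -5.5 degC per km (hypothetical values).
t_point = correct_temperature(t_rcm_c=-2.0, elev_rcm_m=1500.0,
                              elev_target_m=2300.0, lapse_c_per_km=-5.5)
print(f"elevation-corrected temperature: {t_point:.1f} degC")
```

Because the lapse rates vary by month and by location, the same function would be applied with a different coefficient for each month and each grid cell before the corrected series are passed to the hydrologic model.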
Abstract:
The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4 000 000 data points) from 539 papers had been archived. Here we present the developments of this data compilation five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans remains relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance of considerably more data archived on calcification and primary production than on other processes has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables, and to define best practices for archiving ocean acidification data.
Abstract:
In the petroleum industry, the State developed the Local Content policy as a regulatory measure to guarantee preference for the national supply industry. This paper analyzes the Local Content policy in light of the constitutional goal of development as written in the Constituição Federal de 1988. To do so, the hypothetical-deductive method is used to identify the Local Content policy as a State strategy for development, making it the object of critique in a dialectical line of reasoning, and finally to present a conclusion about the policy. The results show that the existing structure of the policy in Brazil is inefficient and calls for a rebuild. In conclusion, because of the inadequate construction of the Local Content policy created within the Agência Nacional do Petróleo (ANP), the full potential of the policy is being held back, something that can only be corrected through a redesign of the policy.
Abstract:
Petroleum exploration activity occurs on the offshore Potiguar Basin, from very shallow water (2-3 m) to about 50 m water depth, extending from Alto de Touros (RN) to Alto de Fortaleza (CE). Taking into account the biological importance and the heterogeneity of sediments in this area, an understanding of the sedimentological dynamics, and especially of the changes generated by petroleum exploration, is necessary to prevent possible damage to the environment. Despite the intense oil exploration activity in this area, research projects of this kind are still rare. To help fill this gap, this study was developed to evaluate sedimentological, mineralogical and geochemical changes in the vicinity of an exploration well, here designated well A, located on the middle continental shelf near the transition to the outer shelf. The well selected for this study was the first one drilled with Riserless Mud Recovery (RMR) technology in Brazil. The main difference between this and the conventional method is the possibility of drilling phase I of the well with return of drilling material to the rig tank, minimizing fluid and cuttings discharge in the vicinity during this phase. Monitoring consisted of three surveys: the first done before the start of drilling, the second done 19 days after the end of drilling, and the third done one year later. The studied variables (calcium carbonate and organic matter content, sediment size, mineralogy and geochemistry) were compared using their mean, median and coefficient-of-variation values to understand the changes after the drilling activity. For technical reasons of the operating company, the well location was changed after the first survey (C1), resulting in a shift of the sampled area in the two later surveys (C2 and C3). Nevertheless, the acquired data presented a good correlation, with no loss to the main goal of the study. The sedimentological, mineralogical and geochemical analyses were done at the Federal University of Rio Grande do Norte (UFRN). The results indicated a predominantly sandy environment throughout the three surveys. The first survey (C1) presented different values for all the studied variables compared with the second (C2) and third (C3) surveys, which had similar values. Siliciclastic sediments prevailed in all surveys, with quartz as the main component (more than 80%). Heavy minerals (garnet, tourmaline, zircon and ilmenite), rock fragments and mud aggregates were also described. Bioclastic sediments were dominated by coralline algae (more than 45%) and mollusks (more than 30%), followed by benthic foraminifera, bryozoans and worm tubes. Ostracods and calcareous sponge spicules were observed more rarely. Because of the small changes in the sediments of the studied area and the use of the RMR method in the drilling, it was possible to conclude that the drilling activity did not promote significant alteration of the local sediment cover. Changes in the studied variables before and after the drilling activity could have been influenced by the change in the sampling area after survey 1 (C1).
Abstract:
In the contemporary context, a change is observed in the institutional structure of the state, culminating in several policies for the tourism sector that promote a new management format. From this perspective, the Tourism Regionalization Macro Program (TRP), considered a significant program of the Ministry of Tourism, arose as an answer to this new reality, having as its strategy a joint effort of structuring and promotion aimed at the decentralization of actions, valuing residents' participation in the search for permanent dialogue between peers and the revaluation of places and territories, based on the regionalization process. From this standpoint, this study aims to examine the role of the Tourism State Council of Rio Grande do Norte with regard to tourism planning, seeking to understand and analyze it as a governance instance, through the interventions of the Tourism Regionalization Program, given the participation context of its actors and agents. For the purposes of this study, the time frame is delimited to the years 2007 to 2014, a period in which there was greater adherence of council members, as well as representation of different sectors of civil society, as a result of a tourism public policy based on principles of innovation and participation. With regard to the research problem, this study is characterized as qualitative, and the chosen method is the materialist dialectic. Regarding the methodological options, Content Analysis is also used. The results show that the institutionalization of a governance instance such as Conetur does not contribute, as it ideally should, to a participatory and integrated planning and management process for tourist activity oriented toward a fair direction of the production of its space. The research indicates that there are debates, discussions and guidelines (still in a timely and targeted form), but without practical effects, because the council acts in a conjuncture strategically designed for a game of political and economic power, which shapes the performance of hegemonic actors who use this arena to instill personal desires and wishes that are decided in the council's absence.
Abstract:
This thesis begins by studying the thickness of evaporative spin-coated colloidal crystals and demonstrates the variation of the thickness as a function of suspension concentration and spin rate. In particular, the films are thicker with higher suspension concentration and lower spin rate. This study also provides evidence for the reproducibility of spin coating in terms of the thickness of the resulting colloidal films. These colloidal films, as well as the ones obtained from various other methods such as convective assembly and dip coating, usually possess a crystalline structure. Due to the lack of a comprehensive method for characterization of order in colloidal structures, a procedure is developed for such a characterization in terms of local and longer-range translational and orientational order. Translational measures turn out to be adequate for characterizing small deviations from perfect order, while orientational measures are more informative for polycrystalline and highly disordered crystals. Finally, to obtain an understanding of the relationship between dynamics and structure, the dynamics of colloids in a quasi-2D suspension is studied as a function of packing fraction. The tools used are the mean square displacement (MSD) and the self part of the van Hove function. A slowdown of the dynamics is observed as the packing fraction increases, accompanied by the emergence of 6-fold symmetry within the system. The dynamics turns out to be non-Gaussian at early times and Gaussian at later times for packing fractions below 0.6; above this packing fraction, the dynamics is non-Gaussian at all times. The diffusion coefficient is also calculated from the MSD and the van Hove function; it decreases as the packing fraction is increased.
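For reference, the two dynamical measures named above can be computed from particle trajectories as in the sketch below; the random-walk trajectories stand in for tracked colloid positions and are not data from the thesis.

```python
# Illustrative computation of the MSD and the self part of the van Hove
# function for a set of synthetic 2D trajectories.
import numpy as np

rng = np.random.default_rng(0)
n_particles, n_frames = 200, 500
steps = rng.normal(0, 0.1, size=(n_frames, n_particles, 2))
traj = np.cumsum(steps, axis=0)            # positions, shape (frames, particles, 2)

def msd(traj, lag):
    # Mean square displacement averaged over particles and time origins.
    disp = traj[lag:] - traj[:-lag]
    return np.mean(np.sum(disp ** 2, axis=-1))

def van_hove_self(traj, lag, bins=50):
    # Distribution of 1D displacements after `lag` frames (self part).
    dx = (traj[lag:] - traj[:-lag])[..., 0].ravel()
    hist, edges = np.histogram(dx, bins=bins, density=True)
    return hist, 0.5 * (edges[:-1] + edges[1:])

for lag in (1, 10, 100):
    print(f"lag {lag:4d}: MSD = {msd(traj, lag):.3f}")

g_s, dx_centers = van_hove_self(traj, lag=10)
print(f"van Hove self part at lag 10 peaks near dx = {dx_centers[np.argmax(g_s)]:.2f}")

# For free diffusion in 2D, MSD = 4 D t, so the diffusion coefficient
# follows from the long-lag MSD.
D = msd(traj, 100) / (4 * 100)
print(f"estimated diffusion coefficient: {D:.4f} (per-frame units)")
```

For Brownian particles the displacement distribution is Gaussian, so deviations of the self part of the van Hove function from a Gaussian shape are what signal the non-Gaussian dynamics described above.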
Abstract:
Improvements in genomic technology, both in the increased speed and reduced cost of sequencing, have expanded the appreciation of the abundance of human genetic variation. However, the sheer amount of variation, as well as the varying type and genomic content of variation, poses a challenge in understanding the clinical consequence of a single mutation. This work uses several methodologies to interpret the observed variation in the human genome, and presents novel strategies for the prediction of allele pathogenicity.
Using the zebrafish model system as an in vivo assay of allele function, we identified a novel driver of Bardet-Biedl Syndrome (BBS) in CEP76. A combination of targeted sequencing of 785 cilia-associated genes in a cohort of BBS patients and subsequent in vivo functional assays recapitulating the human phenotype gave strong evidence for the role of CEP76 mutations in the pathology of an affected family. This portion of the work demonstrated the necessity of functional testing in validating disease-associated mutations, and added to the catalogue of known BBS disease genes.
Further study into the role of copy-number variations (CNVs) in a cohort of BBS patients showed the significant contribution of CNVs to disease pathology. Using high-density array comparative genomic hybridization (aCGH), we were able to identify pathogenic CNVs as small as several hundred bp. Dissection of constituent genes and in vivo experiments investigating epistatic interactions between affected genes allowed for an appreciation of several paradigms by which CNVs can contribute to disease. This study revealed that the contribution of CNVs to disease in BBS patients is much higher than previously expected, and demonstrated the necessity of considering CNV contribution in future (and retrospective) investigations of human genetic disease.
Finally, we used a combination of comparative genomics and in vivo complementation assays to identify second-site compensatory modification of pathogenic alleles. These pathogenic alleles, which are found compensated in other species (termed compensated pathogenic deviations [CPDs]), represent a significant fraction (from 3 to 10%) of human disease-associated alleles. In silico pathogenicity prediction algorithms, a valuable method of allele prioritization, often misrepresent these alleles as benign, leading to omission of possibly informative variants in studies of human genetic disease. We created a mathematical model that was able to predict CPDs and putative compensatory sites, and functionally showed in vivo that second-site mutation can mitigate the pathogenicity of disease alleles. Additionally, we made publicly available an in silico module for the prediction of CPDs and modifier sites.
These studies have advanced the ability to interpret the pathogenicity of multiple types of human variation and have made tools available for others to do the same.