969 results for lab assignment


Relevance: 20.00%

Publisher:

Abstract:

As traffic congestion continues to worsen in large urban areas, solutions are urgently sought. However, transportation planning models, which estimate traffic volumes on transportation network links, are often unable to realistically consider travel-time delays at intersections. Introducing signal controls into models often results in significant and unstable changes in network attributes, which, in turn, leads to model instability. Ignoring the effect of delays at intersections makes the model output inaccurate and unable to predict travel time. To represent traffic conditions in a network more accurately, planning models should be capable of arriving at a network solution based on travel costs that are consistent with the intersection delays caused by signal controls. This research attempts to achieve this goal by optimizing signal controls and estimating intersection delays accordingly, which are then used in traffic assignment. Simultaneous optimization of traffic routing and signal controls has not been accomplished in real-world applications of traffic assignment. To this end, a delay model covering five major types of intersections has been developed using artificial neural networks (ANNs). An ANN architecture consists of interconnected artificial neurons and may be used either to gain an understanding of biological neural networks or to solve artificial intelligence problems without necessarily modeling a real biological system. The ANN delay model has been trained using extensive simulations based on TRANSYT-7F signal optimizations. The delay estimates produced by the ANN delay model have percentage root-mean-squared errors (%RMSE) below 25.6%, which is satisfactory for planning purposes. Larger prediction errors are typically associated with severely oversaturated conditions.
A combined system has also been developed that couples the ANN delay-estimating model with a user-equilibrium (UE) traffic assignment model. The combined system employs the Frank-Wolfe method to reach a convergent solution. Because the ANN delay model provides no derivatives of the delay function, a Mesh Adaptive Direct Search (MADS) method is applied to assist and expedite the iterative Frank-Wolfe process. The performance of the combined system confirms that the solution converges, although a global optimum cannot be guaranteed.
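The assignment loop described above can be sketched as follows. This is a minimal illustration, not the study's implementation: a BPR-style function stands in for the ANN delay model (which, as noted, is a black box without derivatives), the network is reduced to parallel links between one origin-destination pair, and a derivative-free bisection on the step size plays the role that MADS plays in the combined system.

```python
def link_cost(flow, free_time, capacity, alpha=0.15, beta=4.0):
    """BPR-style volume-delay function standing in for the ANN delay model."""
    return free_time * (1.0 + alpha * (flow / capacity) ** beta)

def frank_wolfe(demand, free_times, capacities, iters=20):
    """User-equilibrium assignment on parallel links via the Frank-Wolfe method."""
    n = len(free_times)
    flows = [demand] + [0.0] * (n - 1)  # initial all-or-nothing loading
    for _ in range(iters):
        costs = [link_cost(flows[i], free_times[i], capacities[i]) for i in range(n)]
        best = min(range(n), key=costs.__getitem__)          # all-or-nothing target
        target = [demand if i == best else 0.0 for i in range(n)]
        direction = [target[i] - flows[i] for i in range(n)]

        # The directional derivative of the Beckmann objective needs only cost
        # evaluations, so a bisection works even when the delay model itself
        # provides no derivatives -- the situation MADS addresses in the study.
        def deriv(step):
            return sum(direction[i] * link_cost(flows[i] + step * direction[i],
                                                free_times[i], capacities[i])
                       for i in range(n))

        if deriv(1.0) <= 0:
            step = 1.0               # full move still decreases the objective
        else:
            lo, hi = 0.0, 1.0        # bisect for the zero of the derivative
            for _ in range(40):
                mid = (lo + hi) / 2.0
                if deriv(mid) > 0:
                    hi = mid
                else:
                    lo = mid
            step = lo
        flows = [flows[i] + step * direction[i] for i in range(n)]
    return flows
```

For two parallel links the line search equalizes the link costs, which is exactly the user-equilibrium condition.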

Relevance: 20.00%

Publisher:

Abstract:

This study examined the assignment of withdrawal codes by school administrators in two disciplinary alternative schools. Findings revealed that: (a) codes were intentionally misassigned to keep students from returning to a regular school without notification, and (b) administrators improperly tracked students and failed to ascertain students' reasons for dropping out.

Relevance: 20.00%

Publisher:

Abstract:

The standard highway assignment model in the Florida Standard Urban Transportation Modeling Structure (FSUTMS) is based on the equilibrium traffic assignment method. This method involves running several iterations of all-or-nothing capacity-restraint assignment with an adjustment of travel time to reflect delays encountered in the associated iteration. The iterative link-time adjustment is accomplished through the Bureau of Public Roads (BPR) volume-delay equation. Since FSUTMS' traffic assignment procedure outputs daily volumes while the input capacities are given in hourly volumes, it is necessary to convert the hourly capacities to their daily equivalents when computing the volume-to-capacity ratios used in the BPR function. The conversion is accomplished by dividing the hourly capacity by a factor called the peak-to-daily ratio, referred to as CONFAC in FSUTMS. The ratio is computed as the highest hourly volume of a day divided by the corresponding total daily volume. While several studies have indicated that CONFAC is a decreasing function of the level of congestion, a constant value is used for each facility type in the current version of FSUTMS. This ignores the different congestion levels associated with individual roadways and is believed to be one of the culprits behind traffic assignment errors. Traffic count data from across the state of Florida were used to calibrate CONFACs as a function of a congestion measure using the weighted least squares method. The calibrated functions were then implemented in FSUTMS through a procedure that takes advantage of the iterative nature of FSUTMS' equilibrium assignment method. The assignment results based on constant and variable CONFACs were then compared against ground counts for three selected networks. It was found that the accuracy of the two assignments was not significantly different, and that the hypothesized improvement in assignment results from the variable CONFAC model was not empirically evident.
It was recognized that many other factors beyond the scope and control of this study could contribute to this finding. It was recommended that further studies focus on the use of the variable CONFAC model with recalibrated parameters for the BPR function and/or with other forms of volume-delay functions.
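The conversion and BPR update described above amount to the following arithmetic. This is an illustrative sketch: the alpha, beta, and CONFAC values are common textbook defaults, not the FSUTMS defaults or the recalibrated Florida values.

```python
def bpr_time(free_time, daily_volume, hourly_capacity, confac=0.10,
             alpha=0.15, beta=4.0):
    """BPR volume-delay function with an hourly-to-daily capacity conversion.

    CONFAC is the peak-to-daily ratio (highest hourly volume / daily volume),
    so dividing the hourly capacity by CONFAC yields the daily-equivalent
    capacity used in the volume-to-capacity ratio.
    """
    daily_capacity = hourly_capacity / confac
    vc_ratio = daily_volume / daily_capacity
    return free_time * (1.0 + alpha * vc_ratio ** beta)
```

For example, a link with a 10-minute free-flow time, 45,000 daily vehicles, and a 5,000 veh/h capacity has a daily-equivalent capacity of 50,000 and a congested time of about 11 minutes.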

Relevance: 20.00%

Publisher:

Abstract:

Managed lane strategies are innovative road operation schemes for addressing congestion problems. These strategies operate one or more lanes adjacent to a freeway that provide congestion-free trips to eligible users, such as transit vehicles or toll-payers. To ensure the successful implementation of managed lanes, the demand on these lanes needs to be accurately estimated. Among the different approaches for predicting this demand, the four-step demand forecasting process is the most common, with managed lane demand usually estimated at the assignment step. Therefore, the key to reliably estimating the demand is the use of effective assignment modeling processes. Managed lanes are particularly effective when the road is functioning at near-capacity. Therefore, capturing variations in demand and in network attributes and performance is crucial for their modeling, monitoring, and operation. As a result, traditional modeling approaches, such as those used in the static traffic assignment of demand forecasting models, fail to correctly predict managed lane demand and the associated system performance. The present study demonstrates the power of the more advanced modeling approach of dynamic traffic assignment (DTA), as well as the shortcomings of conventional approaches, when used to model managed lanes in congested environments. In addition, the study develops processes to support effective use of DTA to model managed lane operations. Static and dynamic traffic assignments consist of demand, network, and route choice model components that need to be calibrated. These components interact with each other, and an iterative method for calibrating them is needed. In this study, an effective standalone framework that combines static demand estimation and dynamic traffic assignment has been developed to replicate real-world traffic conditions.
With advances in traffic surveillance technologies, collecting, archiving, and analyzing traffic data is becoming more accessible and affordable. The present study shows how data from multiple sources can be integrated, validated, and best used in the different stages of modeling and calibrating managed lanes. Extensive and careful processing of demand, traffic, and toll data, together with proper definition of performance measures, results in a calibrated and stable model that closely replicates real-world congestion patterns and can reasonably respond to perturbations in network and demand properties.
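The iterative calibration of interacting components can be pictured as a fixed-point loop. The sketch below is schematic and uses invented names: `assign` is a hypothetical stand-in for a full assignment run, the "network" is a single counted link, and the adjustment rule is simple proportional scaling rather than the study's framework.

```python
def calibrate_demand(observed_count, assign, demand0, tol=1e-6, max_iter=100):
    """Fixed-point loop: rescale demand until the assigned volume on a
    counted link matches the observed count.

    `assign` maps a demand level to a simulated link volume; in a real
    framework it would be a complete traffic assignment run.
    """
    demand = demand0
    for _ in range(max_iter):
        simulated = assign(demand)
        if abs(simulated - observed_count) < tol:
            break
        # proportional adjustment toward the observed count
        demand *= observed_count / max(simulated, 1e-9)
    return demand

# Toy assignment: 90% of the total demand shows up on the counted link.
calibrated = calibrate_demand(900.0, lambda d: 0.9 * d, demand0=500.0)
```

With this linear toy assignment the loop converges in one adjustment; with a real assignment model, each iteration re-runs the assignment against the updated demand.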

Relevance: 20.00%

Publisher:

Abstract:

Paper: Higher education, student affairs and lifelong learning


Relevance: 20.00%

Publisher:

Abstract:

Increasing use of nanomaterials in consumer products and biomedical applications creates the possibility of intentional or unintentional exposure to humans and the environment. Beyond the physiological limit, nanomaterial exposure can induce toxicity in humans. It is difficult to define the toxicity of nanoparticles to humans, as it varies with nanomaterial composition, size, surface properties, and the target organ or cell line. Traditional tests for nanomaterial toxicity assessment are mostly based on bulk colorimetric assays. In many studies, nanomaterials have been found to interfere with the assay dye and produce false results, and these assays usually require several hours or days to yield results. Therefore, there is a clear need for alternative tools that can provide an accurate, rapid, and sensitive measure for initial nanomaterial screening. Recent advances in single-cell studies have revealed cell properties not found earlier in traditional bulk assays. A complex phenomenon like nanotoxicity may become clearer when studied at the single-cell level, including in small colonies of cells. Advances in lab-on-a-chip techniques have played a significant role in drug discovery and biosensor applications but have rarely been explored for nanomaterial toxicity assessment. We present such a cell-integrated, chip-based approach that provides a quantitative and rapid indication of cell health through electrochemical measurements. Moreover, the novel design of the device presented in this study is capable of capturing and analyzing cells at the single-cell and small-cell-population level. We examined the change in exocytosis (i.e., neurotransmitter release) properties of a single PC12 cell exposed to CuO and TiO2 nanoparticles. We found that both nanomaterials interfere with the cell's exocytosis function. We also studied the whole-cell response of a single cell and a small cell population simultaneously in real time for the first time.
The presented study can serve as a reference for future research toward nanotoxicity assessment, aiming to develop miniature, simple, and cost-effective tools for fast, quantitative, high-throughput measurements. The lab-on-a-chip device and measurement techniques used in the present work can also be applied to assess the toxicity of other nanoparticles.

Relevance: 20.00%

Publisher:

Abstract:

Integer programming, simulation, and rules of thumb have been integrated to develop a simulation-based heuristic for short-term fleet assignment in the car rental industry. It generates a plan for car movements and a set of booking limits designed to produce high revenue over a given planning horizon. Three different scenarios were used to validate the heuristic. The heuristic's mean revenue was significantly higher than the historical revenue in all three scenarios. Even though the heuristic is not fully automated, its running time for each experiment stayed within the three-hour limit set for the decision-making process. These findings demonstrate that the heuristic provides better plans (plans that yield higher profit) for the dynamic allocation of the fleet than the historical decision processes. Another contribution of this effort is the integration of integer programming and rules of thumb to search for better performance under stochastic conditions.
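The interplay of simulation and rules of thumb can be sketched as below. This is a hypothetical miniature, not the paper's heuristic: a single rental location is considered, demand follows an invented uniform model, and candidate booking limits are scored by their simulated mean daily revenue.

```python
import random

def simulate_mean_revenue(booking_limit, fleet_size, price, max_daily_demand,
                          n_days=30, seed=0):
    """Monte Carlo estimate of mean daily revenue under one booking limit."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_days):
        requests = rng.randint(0, max_daily_demand)   # stand-in demand model
        accepted = min(requests, booking_limit, fleet_size)
        total += accepted * price
    return total / n_days

def best_booking_limit(fleet_size, price, max_daily_demand, candidates):
    """Rule-of-thumb search: score each candidate limit by simulation."""
    return max(candidates,
               key=lambda b: simulate_mean_revenue(b, fleet_size, price,
                                                   max_daily_demand))
```

A full heuristic of the kind described would embed such simulated scores inside an integer program that also decides car movements between locations.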

Relevance: 20.00%

Publisher:

Abstract:

Predicting the impacts of environmental change on marine organisms, food webs, and biogeochemical cycles presently relies almost exclusively on short-term physiological studies, while the possibility of adaptive evolution is often ignored. Here, we assess adaptive evolution in the coccolithophore Emiliania huxleyi, a well-established model species in biological oceanography, in response to ocean acidification. We previously demonstrated that this globally important marine phytoplankton species adapts within 500 generations to elevated CO2. After 750 and 1,000 generations, no further fitness increase occurred, and we observed phenotypic convergence between replicate populations. We then exposed adapted populations to two novel environments to investigate whether the underlying basis for adaptation to high CO2 involves functional genetic divergence, assuming that different novel mutations become apparent via divergent pleiotropic effects. The novel "high light" environment did not reveal such genetic divergence, whereas growth in a low-salinity environment revealed strong pleiotropic effects in the high-CO2-adapted populations, indicating divergent genetic bases for adaptation to high CO2. This suggests that pleiotropy plays an important role in the adaptation of natural E. huxleyi populations to ocean acidification. Our study highlights the potential mutual benefits for oceanography and evolutionary biology of using ecologically important marine phytoplankton in microbial evolution experiments.

Relevance: 20.00%

Publisher:

Abstract:

Reliable dating of glaciomarine sediments deposited on the Antarctic shelf since the Last Glacial Maximum (LGM) is very challenging because of the general absence of calcareous (micro-)fossils and the recycling of fossil organic matter. As a consequence, radiocarbon (14C) ages of the acid-insoluble organic fraction (AIO) of the sediments bear uncertainties that are very difficult to quantify. In this paper we present the results of three different chronostratigraphic methods used to date a sedimentary unit consisting of diatomaceous ooze and diatomaceous mud that was deposited following the last deglaciation at five core sites on the inner shelf in the western Amundsen Sea (West Antarctica). In three cores, conventional 14C dating of the AIO in bulk sediment samples yielded down-core age reversals, but at all sites the AIO 14C ages obtained from diatomaceous ooze within the diatom-rich unit yielded similar uncorrected 14C ages, ranging from 13,517±56 to 11,543±47 years before present (yr BP). Correction of these ages by subtracting the core-top ages, which are assumed to reflect present-day deposition (as indicated by 210Pb dating of the sediment surface at one core site), yielded ages between ca. 10,500 and 8,400 calibrated years before present (cal yr BP). Correction of the AIO ages of the diatomaceous ooze by subtracting only the marine reservoir effect (MRE) of 1,300 years indicated deposition of the diatom-rich sediments between 14,100 and 11,900 cal yr BP. Most of these ages are consistent with age constraints of 13.0 to 8.0 ka BP for the diatom-rich unit, which we obtained by correlating the relative palaeomagnetic intensity (RPI) records of three of the sediment cores with global and regional reference curves for palaeomagnetic intensity. As a third dating technique, we applied conventional radiocarbon dating to the AIO included in acid-cleaned diatom hard parts extracted from the diatomaceous ooze.
This method yielded uncorrected 14C ages of only 5,111±38 and 5,106±38 yr BP, respectively. We reject these young ages because they are likely overprinted by the adsorption of modern atmospheric carbon dioxide onto the surfaces of the extracted diatom hard parts prior to sample graphitisation and combustion for 14C dating. The deposition of the diatom-rich unit in the western Amundsen Sea suggests deglaciation of the inner shelf before ca. 13 ka BP. The deposition of diatomaceous oozes on other parts of the Antarctic shelf around the same time, however, seems to be coincidental rather than directly related.
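The two correction schemes compared above reduce to simple subtractions, sketched here with the uncorrected ages quoted in the text; calibration from corrected 14C years to cal yr BP is a separate step that is omitted, and the example core-top age is hypothetical.

```python
MRE = 1300  # marine reservoir effect in 14C years, as assumed in the text

def correct_by_core_top(raw_age, core_top_age):
    """Subtract the core-top age, taken to reflect present-day deposition."""
    return raw_age - core_top_age

def correct_by_mre(raw_age, mre=MRE):
    """Subtract only the marine reservoir effect."""
    return raw_age - mre
```

For instance, the oldest uncorrected ooze age of 13,517 yr BP minus the 1,300-year MRE gives 12,217 14C yr BP, which after calibration lies at the upper end of the reported 14,100 to 11,900 cal yr BP range.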

Relevance: 20.00%

Publisher:

Abstract:

Acknowledgements We thank Brian Roberts and Mike Harris for responding to our questions regarding their paper; Zoltan Dienes for advice on Bayes factors; Denise Fischer, Melanie Römer, Ioana Stanciu, Aleksandra Romanczuk, Stefano Uccelli, Nuria Martos Sánchez, and Rosa María Beño Ruiz de la Sierra for help collecting data; Eva Viviani for managing data collection in Parma. We thank Maurizio Gentilucci for letting us use his lab, and the Centro Intradipartimentale Mente e Cervello (CIMeC), University of Trento, and especially Francesco Pavani for lending us his motion tracking equipment. We thank Rachel Foster for proofreading. KKK was supported by a Ph.D. scholarship as part of a grant to VHF within the International Graduate Research Training Group on Cross-Modal Interaction in Natural and Artificial Cognitive Systems (CINACS; DFG IKG-1247) and TS by a grant (DFG – SCHE 735/3-1); both from the German Research Council.

Relevance: 20.00%

Publisher:

Abstract:

General note: Title and date provided by Bettye Lane.

Relevance: 20.00%

Publisher:

Abstract:

This presentation showcases the application of a university-based education research lab (ERL) model to the evaluation of a community sailing program for individuals with disabilities. Presenters conceptualize the ERL model as a mutually beneficial relationship between universities and community education agencies.