856 results for Time-to-collision
Abstract:
We consider the problems of finding two optimal triangulations of a convex polygon: MaxMin area and MinMax area. These are the triangulations that, over all possible triangulations, respectively maximize the area of the smallest triangle and minimize the area of the largest triangle. Both problems were originally solved by Klincsek using dynamic programming in cubic time [2]. Later, Keil and Vassilev devised an algorithm that runs in O(n^2 log n) time [1]. In this paper we describe new geometric findings on the structure of MaxMin and MinMax area triangulations of convex polygons in two dimensions, together with their algorithmic implications. We improve the algorithm's running time to quadratic for large classes of convex polygons. We also present experimental results on MaxMin area triangulation.
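Since the cubic-time dynamic program of Klincsek [2] underlies both results, a minimal sketch of it may help fix ideas. The version below, written here for illustration, computes only the optimal MaxMin value for a convex polygon given as a list of (x, y) vertices in order; swapping max and min (with a base value of 0 instead of infinity) gives the MinMax variant. The O(n^2 log n) algorithm of [1] and the quadratic-time improvements described in the paper are not reproduced here.

```python
from math import inf

def tri_area(a, b, c):
    """Area of triangle abc via the cross product."""
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])) / 2.0

def maxmin_value(pts):
    """O(n^3) interval DP in the spirit of Klincsek [2]: best[i][j] is the
    largest achievable minimum-triangle area over all triangulations of the
    convex sub-polygon pts[i..j]."""
    n = len(pts)
    best = [[inf] * n for _ in range(n)]   # sub-polygons with < 3 vertices contain no triangle
    for length in range(2, n):             # j - i == length
        for i in range(n - length):
            j = i + length
            best[i][j] = max(              # choose the apex k of edge (i, j)
                min(tri_area(pts[i], pts[k], pts[j]), best[i][k], best[k][j])
                for k in range(i + 1, j)
            )
    return best[0][n - 1]
```

Recovering the triangulation itself only requires additionally storing, for each pair (i, j), the apex k that attains the optimum.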
Abstract:
As an indicator of global change and shifting balances of power, the World Economic Forum meets every September in Dalian, China. The subject in 2011 was Mastering Quality Growth. On the agenda was pursuing new frontiers of growth linked to embracing disruptive innovation. With growth coming from emerging markets, and European and North American economies treading water, many firms in the West are facing the reality of having not just to downsize but actually to close manufacturing operations and re-open them elsewhere, where costs are lower, in order to remain competitive. There are thousands of books on "change management", yet very few of them devote much time to downsizing, preferring to talk about re-engineering or restructuring. What lessons are available from the past to achieve a positive outcome from what will inevitably be something of a human, as well as an economic, tragedy? The authors reached three fundamental conclusions from their experience and research in facility closure management within Vauxhall, UK: put your people first, make sure you keep running the business, and manage your legacy. They develop these ideas into a new business model linked to the emotions of change.
Abstract:
A heuristic for batching orders in a manual order-picking warehouse has been developed. It prioritizes orders based on due time to prevent mixing of orders of different priority levels, and uses the order density of aisles criterion to form batches. It also determines the number of pickers required and assigns batches to pickers such that there is a uniform workload per unit of time. The effectiveness of the heuristic was studied by observing computational time and aisle congestion for various numbers of total orders and numbers of orders per batch. An initial heuristic performed well for small numbers of orders; for larger numbers of orders, a partitioning technique is computationally more efficient, needing only minutes to solve for thousands of orders while preserving 90% of the batch quality obtained with the original heuristic. Comparative studies between this heuristic and other published heuristics are needed.
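The abstract gives only the criteria, not the procedure, so the following sketch is a hedged reading of it: batches are formed within one due-time priority level at a time, grown by aisle overlap as a stand-in for the order-density-of-aisles criterion, and then assigned to the least-loaded picker. All class and function names are illustrative assumptions, not the authors' code.

```python
from dataclasses import dataclass
from itertools import groupby

@dataclass
class Order:
    oid: int
    priority: int                    # derived from due time; lower = more urgent
    aisles: frozenset = frozenset()  # aisles the order's items touch

def form_batches(orders, batch_size):
    """Batch within one priority level at a time so levels never mix; inside
    a level, greedily add the order sharing the most aisles with the batch."""
    batches = []
    for _, group in groupby(sorted(orders, key=lambda o: o.priority),
                            key=lambda o: o.priority):
        pending = list(group)
        while pending:
            batch = [pending.pop(0)]                  # seed with the next order
            while pending and len(batch) < batch_size:
                covered = frozenset().union(*(o.aisles for o in batch))
                nxt = max(pending, key=lambda o: len(o.aisles & covered))
                pending.remove(nxt)
                batch.append(nxt)
            batches.append(batch)
    return batches

def assign_to_pickers(batches, n_pickers):
    """Give each batch to the currently least-loaded picker, using the number
    of distinct aisles visited as a crude proxy for workload per batch."""
    loads = [0.0] * n_pickers
    assignment = [[] for _ in range(n_pickers)]
    for batch in batches:
        p = loads.index(min(loads))
        assignment[p].append(batch)
        loads[p] += len(frozenset().union(*(o.aisles for o in batch)))
    return assignment
```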
Abstract:
In recent years, the internet has grown exponentially and become more complex, and this increased complexity potentially introduces more network-level instability. For any end-to-end internet connection, however, maintaining throughput and reliability at a certain level is very important, because both directly affect the connection's normal operation. A challenging research task, therefore, is to improve a connection's performance by optimizing its throughput and reliability. This dissertation proposes an efficient and reliable transport-layer protocol, concurrent TCP (cTCP), an extension of the current TCP protocol that optimizes end-to-end connection throughput and enhances end-to-end fault tolerance. cTCP can aggregate the bandwidth of multiple paths by supporting concurrent data transfer (CDT) on a single connection, where concurrent data transfer is defined as the concurrent transfer of data from local hosts to foreign hosts via two or more end-to-end paths. An RTT-based CDT mechanism, which uses each path's round-trip time (RTT) to optimize CDT performance, was developed for cTCP. This mechanism primarily comprises an RTT-based load distribution and path management scheme, used to optimize connection throughput and reliability, together with an RTT-based congestion control and retransmission policy. Experimental results show that the RTT-based CDT mechanism achieves good CDT performance under different network conditions. Finally, a CWND-based CDT mechanism, which uses each path's congestion window (CWND) to optimize CDT performance, is introduced. It primarily comprises: a CWND-based load allocation scheme, which assigns data to paths based on their CWND to achieve aggregate bandwidth; CWND-based path management, used to optimize connection fault tolerance; and a congestion control and retransmission management policy that, like regular TCP, handles each path separately. The corresponding experimental results show that this mechanism achieves near-optimal CDT performance under different network conditions.
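The abstract does not spell out the distribution formulas, but a natural reading of the two mechanisms is sketched below: the RTT-based scheme weights paths inversely to their round-trip time, while the CWND-based scheme allocates data in proportion to each path's congestion window. Both functions are illustrative assumptions, not the protocol's actual code.

```python
def rtt_load_shares(rtts):
    """RTT-based load distribution (sketch): each path's share of the data is
    inversely proportional to its smoothed RTT, so faster paths carry
    proportionally more traffic."""
    inv = [1.0 / r for r in rtts]
    total = sum(inv)
    return [w / total for w in inv]

def cwnd_load_shares(cwnds):
    """CWND-based load allocation (sketch): shares proportional to each path's
    congestion window, which already reflects both the RTT and the loss
    behaviour of the path."""
    total = sum(cwnds)
    return [c / total for c in cwnds]

# Example: three paths with smoothed RTTs of 20 ms, 50 ms, and 100 ms
print(rtt_load_shares([0.020, 0.050, 0.100]))   # approx. [0.625, 0.25, 0.125]
```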
Abstract:
An iterative travel time forecasting scheme, named the Advanced Multilane Prediction based Real-time Fastest Path (AMPRFP) algorithm, is presented in this dissertation. The scheme is derived from the conventional kernel-estimator-based prediction model by associating the real-time nonlinear impacts caused by neighboring arcs' traffic patterns with the historical traffic behaviors. The AMPRFP algorithm is evaluated by predicting the travel time of congested arcs in the urban area of Jacksonville. Experimental results illustrate that the proposed scheme significantly reduces both the relative mean error (RME) and the root-mean-squared error (RMSE) of the predicted travel time. To obtain high-quality real-time traffic information, which is essential to the performance of the AMPRFP algorithm, a data clean scheme enhanced empirical learning (DCSEEL) algorithm is also introduced. This novel method investigates the correlation between distance and direction in the geometric map, which is not considered in existing fingerprint localization methods. Specifically, empirical learning methods are applied to minimize the error in the estimated distance, and a direction filter is developed to remove joints that negatively influence localization accuracy. Synthetic experiments in urban, suburban, and rural environments are designed to evaluate the performance of the DCSEEL algorithm in determining a cellular probe's position. The results show that the probe's localization accuracy is notably improved by the DCSEEL algorithm. Additionally, a new fast correlation technique is developed to overcome the time-efficiency problem of the existing correlation-algorithm-based floating car data (FCD) technique. The matching process is transformed into a one-dimensional (1-D) curve-matching problem, and the Fast Normalized Cross-Correlation (FNCC) algorithm is introduced to supersede the Pearson product-moment correlation coefficient (PMCC) algorithm in order to meet the real-time requirement of the FCD method. The fast correlation technique shows a significant improvement in reducing computational cost without affecting the accuracy of the matching process.
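Of the pieces above, the FNCC step is the easiest to make concrete. The sketch below follows the classic fast normalized cross-correlation construction (Lewis-style running sums; the dissertation's exact variant is not given in the abstract): the numerator is an ordinary correlation with the zero-mean template, while each window's mean and energy in the denominator come from O(n) cumulative sums instead of being recomputed at every lag.

```python
import numpy as np

def fast_ncc(signal, template):
    """Fast normalized cross-correlation for 1-D curve matching (sketch).
    Returns the NCC coefficient at every valid lag."""
    n, m = len(signal), len(template)
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    # Numerator: correlation of the signal with the zero-mean template.
    num = np.correlate(signal, t, mode="valid")
    # Running sums give every window's sum and energy in O(n) total.
    c1 = np.concatenate(([0.0], np.cumsum(signal)))
    c2 = np.concatenate(([0.0], np.cumsum(signal ** 2)))
    win_sum = c1[m:] - c1[:-m]
    win_energy = c2[m:] - c2[:-m]
    win_var = win_energy - win_sum ** 2 / m        # sum of squared deviations
    denom = np.sqrt(np.clip(win_var, 1e-12, None)) * t_norm
    return num / denom

# The best match position is the lag with the highest coefficient:
# lag = int(np.argmax(fast_ncc(observed_curve, reference_curve)))
```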
Abstract:
World War II profoundly impacted Florida. The military geography of the State is essential to an understanding of the war. The geostrategic concerns of place and space determined that Florida would become a statewide military base. Florida's attributes of place, such as climate and topography, determined its use as a military academy hosting over two million soldiers, nearly 15 percent of the GI Army, the largest force the US ever raised. One in eight Floridians went into uniform. Equally, Florida's space on the planet made it central to both defensive and offensive strategies. The Second World War was a war of movement, and Florida was a major jump-off point for US force projection world-wide, especially of air power. Florida's demography facilitated its use as a base camp for the assembly and engagement of this military power. In 1940, less than two percent of the US population lived in Florida, a quiet, barely populated backwater of the United States. But owing to its critical place and space, over the next few years it became a 65,000-square-mile training ground, supply dump, and embarkation site vital to the US war effort. Because of its place astride some of the most important sea lanes in the Atlantic World, Florida was the scene of one of the few Western Hemisphere battles of the war. The militarization of Florida began long before Pearl Harbor. The pre-war buildup conformed to the US strategy of the war. The strategy of the US was then (and remains today) one of forward defense: harden the frontier, then take the battle to the enemy, rather than fight them in North America. The policy of "Europe First" focused the main US war effort on the defeat of Hitler's Germany, evaluated to be the most dangerous enemy. In Florida were established the military forces requiring the longest time to develop and most needed to defeat the Axis: a naval aviation force for sea-borne hostilities, a heavy bombing force for reducing enemy industrial states, and an aerial logistics train for overseas supply of expeditionary campaigns. The unique Florida coastline made possible the seaborne invasion training demanded for US victory. The civilian population was employed assembling mass-produced first-generation container ships, while Florida hosted casualties, prisoners of war, and transient personnel moving between the Atlantic and Pacific. By the end of hostilities and the lifting of the Unlimited Emergency, officially on December 31, 1946, Florida had become a transportation nexus. It accommodated a return of demobilized soldiers and a migration of displaced persons, and evolved into a modern veterans' colonia. It was instrumental in fashioning the modern US military, while remaining a center of the active National Defense establishment. Those are the themes of this work.
Abstract:
Many certifications are available in many professions. They represent a level of achievement and provide a dimension of professionalism to a resume. This article reveals the results of research covering the extent of certification among members of the Hospitality Financial & Technology Professionals. Further, obstacles and assists in taking the examination to become a Certified Hospitality Accountant Executive (CHAE) were determined. Thirty-seven percent of the respondents have earned their CHAE. The biggest obstacle to taking the exam, according to 60% of the respondents who have not earned the CHAE, was lack of time to prepare. The biggest assist, according to this same group, would be an online CHAE preparation course.
Abstract:
During the past decade, there has been a dramatic increase by postsecondary institutions in providing academic programs and course offerings in a multitude of formats and venues (Biemiller, 2009; Kucsera & Zimmaro, 2010; Lang, 2009; Mangan, 2008). Strategies pertaining to reapportionment of course-delivery seat time have been a major facet of these institutional initiatives, most notably within many open-door 2-year colleges. Often, these enrollment-management decisions are driven by the desire to increase market share, optimize the usage of finite facility capacity, and contain costs, especially during these economically turbulent times. So, while enrollments have surged to the point where nearly one in three 18-to-24-year-old U.S. undergraduates are community college students (Pew Research Center, 2009), graduation rates, on average, still remain distressingly low (Complete College America, 2011). Among the learning-theory constructs related to seat-time reapportionment efforts is the cognitive phenomenon commonly referred to as the spacing effect: the degree to which learning is enhanced by a series of shorter, separated sessions as opposed to fewer, more massed episodes. This ex post facto study explored whether seat time in a postsecondary developmental-level algebra course is significantly related to course success; course-enrollment persistence; and, longitudinally, the time to successfully complete a general-education-level mathematics course. Hierarchical logistic regression and discrete-time survival analysis were used to perform a multi-level, multivariable analysis of a student cohort (N = 3,284) enrolled at a large, multi-campus, urban community college. The subjects were retrospectively tracked over a 2-year longitudinal period. The study found that students in long seat-time classes tended to withdraw earlier and more often than did their peers in short seat-time classes (p < .05). Additionally, a model comprising nine statistically significant covariates (all with p-values less than .01) was constructed. However, no longitudinal seat-time group differences were detected, nor was there sufficient statistical evidence to conclude that seat time was predictive of developmental-level course success. A principal aim of this study was to demonstrate, to educational leaders, researchers, and institutional-research/business-intelligence professionals, the advantages and computational practicability of survival analysis, an underused but more powerful way to investigate changes in students over time.
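For readers unfamiliar with the technique the study advocates, discrete-time survival analysis reduces to logistic regression on a person-period data set: one row per student per term, with a binary indicator marking the term in which the event (here, withdrawal) occurs. The sketch below uses made-up column names and toy data; the study's actual model included nine significant covariates and a multi-level structure.

```python
import pandas as pd
import statsmodels.formula.api as smf

# One row per student per term; event = 1 in the term the student withdrew,
# and the student contributes no rows after the event (toy data).
pp = pd.DataFrame({
    "student":   [1, 1, 1, 2, 2, 3],
    "term":      [1, 2, 3, 1, 2, 1],
    "long_seat": [1, 1, 1, 0, 0, 1],   # 1 = long seat-time class
    "event":     [0, 0, 0, 0, 1, 1],   # 1 = withdrew this term
})

# 'term' enters as a simple linear baseline-hazard trend here; term dummies
# are the more common choice when each term has enough observations.
model = smf.logit("event ~ term + long_seat", data=pp).fit()
print(model.summary())   # exp(coef) on long_seat ~ per-term odds of withdrawal
```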
Abstract:
This study investigated the time-use of elementary music teachers and elementary classroom teachers to determine: (1) whether there was a relationship between grade level, time of day, and day of the week and teachers' time-use in teaching, monitoring, and non-curricular activities, and (2) whether ethnicity, training, and years of experience affect teacher time-use. Sixty-nine music teachers and 55 classroom teachers participated. A MANOVA was used to examine the hypothesized relationship. ANOVA results were significant for time spent teaching, monitoring, and non-curricular activities. Independent t tests revealed significant differences between the two groups of teachers. For teaching (t(302) = 5.20, p < .001), music teachers spent more time actively teaching than did classroom teachers. For monitoring (t(302) = 13.62, p < .001), classroom teachers allocated more time than did music teachers. For non-curricular activities (t(302) = 7.03, p < .001), music teachers spent more time in this category than did classroom teachers. Analyses of the activities subsumed under the major categories indicated significant differences between elementary music teachers and elementary classroom teachers, overall, in subject matter (p < .001), discussion (p < .05), school-wide activities (p < .001), seatwork (p < .001), giving directions (p < .001), changing activities (p < .001), lunch (p < .05), planning (p < .001), and interruption (p < .001). Analyses of the effects of ethnicity, training, degree, and experience indicated a significant main effect for ethnicity (F(2, 116) = 4.22, p < .017): time-use for black non-Hispanic teachers was higher than for Hispanic and white non-Hispanic teachers. Analyses of time-use by grade showed no increase for either group as grade level increased. A statistically significant Wilks's Lambda (F(1, 294) = .917, p < .013) was found for the independent variable day of the week. ANOVA indicated that elementary classroom teachers monitored more on Thursdays and Fridays, while music teachers allocated more time to non-curricular activities on Fridays.
Abstract:
The standard highway assignment model in the Florida Standard Urban Transportation Modeling Structure (FSUTMS) is based on the equilibrium traffic assignment method. This method involves running several iterations of all-or-nothing capacity-restraint assignment with an adjustment of travel time to reflect delays encountered in the associated iteration. The iterative link time adjustment is accomplished through the Bureau of Public Roads (BPR) volume-delay equation. Since FSUTMS' traffic assignment procedure outputs daily volumes while the input capacities are given in hourly volumes, it is necessary to convert the hourly capacities to their daily equivalents when computing the volume-to-capacity ratios used in the BPR function. The conversion is accomplished by dividing the hourly capacity by a factor called the peak-to-daily ratio, referred to as CONFAC in FSUTMS. The ratio is computed as the highest hourly volume of a day divided by the corresponding total daily volume. While several studies have indicated that CONFAC is a decreasing function of the level of congestion, a constant value is used for each facility type in the current version of FSUTMS. This ignores the different congestion levels associated with each roadway and is believed to be one of the culprits of traffic assignment errors. Traffic count data from across the state of Florida were used to calibrate CONFACs as a function of a congestion measure using the weighted least squares method. The calibrated functions were then implemented in FSUTMS through a procedure that takes advantage of the iterative nature of FSUTMS' equilibrium assignment method. The assignment results based on constant and variable CONFACs were then compared against ground counts for three selected networks. It was found that the accuracy of the two assignments was not significantly different, and that the hypothesized improvement in assignment results from the variable CONFAC model was not empirically evident. It was recognized that many other factors beyond the scope and control of this study could contribute to this finding. It was recommended that further studies focus on the use of the variable CONFAC model with recalibrated parameters for the BPR function and/or with other forms of volume-delay functions.
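To make the mechanics concrete, here is a small sketch of the BPR computation with the CONFAC conversion described above. The alpha = 0.15 and beta = 4 defaults are the customary BPR values, and the linear variable-CONFAC form with its parameter values is purely illustrative; the study calibrated CONFAC functions from Florida traffic counts by weighted least squares.

```python
def bpr_travel_time(free_flow_time, daily_volume, hourly_capacity, confac,
                    alpha=0.15, beta=4.0):
    """BPR volume-delay function with the hourly-to-daily capacity
    conversion: daily capacity = hourly capacity / CONFAC."""
    daily_capacity = hourly_capacity / confac
    vc = daily_volume / daily_capacity
    return free_flow_time * (1.0 + alpha * vc ** beta)

def variable_confac(vc, confac_base=0.10, slope=0.02, confac_min=0.07):
    """Illustrative variable CONFAC: a decreasing function of congestion,
    re-evaluated on each equilibrium iteration (hypothetical form and
    parameter values, not the study's calibrated function)."""
    return max(confac_min, confac_base - slope * vc)

# One equilibrium iteration might then update a link's time as:
# t = bpr_travel_time(t0, volume, cap_hourly, variable_confac(prev_vc))
```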
Abstract:
More information is now readily available to computer users than at any time in human history; however, much of it is inaccessible to people with blindness or low vision, for whom information must be presented non-visually. Currently, screen readers can verbalize on-screen text using text-to-speech (TTS) synthesis; however, much of this vocalization is inadequate for browsing the Internet. An auditory interface that incorporates auditory-spatial orientation was created and tested. For information that can be structured as a two-dimensional table, links can be semantically grouped as cells in a row within an auditory table, which provides a consistent structure for auditory navigation. An auditory display prototype was tested. Sixteen legally blind subjects participated in this research study. Results demonstrated that stereo panning was an effective technique for audio-spatially orienting non-visual navigation in a five-row, six-column HTML table, as compared to a centered, stationary synthesized voice. These results were based on measuring the time-to-target (TTT), the amount of time elapsed from the first prompting to the selection of each tabular link. Preliminary analysis of the TTT values recorded during the experiment showed that the populations did not conform to the ANOVA requirements of normality and equality of variances; therefore, the data were transformed using the natural logarithm. The repeated-measures two-factor ANOVA results show that the logarithmically transformed TTTs were significantly affected by the tonal variation method, F(1, 15) = 6.194, p = 0.025. Similarly, the logarithmically transformed TTTs were marginally affected by the stereo spatialization method, F(1, 15) = 4.240, p = 0.057, and were not significantly affected by the interaction of both methods, F(1, 15) = 1.381, p = 0.258. These results suggest that employing both methods simultaneously may confuse the subject. The significant effect of tonal variation actually increased the average TTT; in other words, the presence of preceding tones increased task completion time on average. The marginally significant effect of stereo spatialization decreased the average log(TTT) from 2.405 to 2.264.
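As a sketch of the analysis pipeline described above, with illustrative column names and simulated data standing in for the experiment's measurements: log-transform TTT so it better meets the ANOVA assumptions, then fit the two-factor repeated-measures ANOVA.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# One row per subject x tone x panning condition (16 subjects, 2 x 2 design).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "subject": np.repeat(np.arange(16), 4),
    "tone":    np.tile(["on", "off"], 32),
    "panning": np.tile(np.repeat(["stereo", "mono"], 2), 16),
    "ttt":     rng.lognormal(mean=2.3, sigma=0.4, size=64),  # simulated TTTs
})
df["log_ttt"] = np.log(df["ttt"])   # natural-log transform, as in the study

res = AnovaRM(df, depvar="log_ttt", subject="subject",
              within=["tone", "panning"]).fit()
print(res)
```

Back-transforming the reported means gives a sense of scale: exp(2.405) is about 11.1 and exp(2.264) about 9.6 in the original time units.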
Abstract:
The population of naive T cells in the periphery is best described by determining both its T cell receptor diversity, or number of clonotypes, and the sizes of its clonal subsets. In this paper, we make use of a previously introduced mathematical model of naive T cell homeostasis to study the fate and potential of naive T cell clonotypes in the periphery. This is achieved by introducing several new stochastic descriptors for a given naive T cell clonotype, such as its maximum clonal size, the time to reach this maximum, the number of proliferation events required to reach this maximum, the rate of contraction of the clonotype on its way to extinction, and the time to a given number of proliferation events. Our results show that two fates can be identified for the dynamics of the clonotype: extinction in the short term if the clonotype experiences too hostile a peripheral environment, or establishment in the periphery in the long term. In the second case the probability mass function for the maximum clonal size is bimodal, with one mode near one and the other far away from it. Our model also indicates that the fate of a recent thymic emigrant (RTE) during its journey in the periphery has a clear stochastic component, where the probability of extinction cannot be neglected, even in a friendly but competitive environment. On the other hand, more deterministic behaviour can be expected in the potential long-term size of the clonotype seeded by the RTE, once it escapes extinction.
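As a hedged illustration of the stochastic setting (not the paper's actual model, whose per-cell proliferation rate depends on competition for peptide-MHC stimuli), even a constant-rate birth-death simulation reproduces the two fates and yields empirical versions of the descriptors:

```python
import random

def simulate_clonotype(birth=1.05, death=1.0, n0=1, t_max=1_000.0, n_cap=1_000):
    """Gillespie-style sketch of one clonotype as a birth-death process with
    constant per-cell rates (a deliberate simplification of the model)."""
    n, t = n0, 0.0
    max_n, t_of_max, divisions, div_at_max = n0, 0.0, 0, 0
    while 0 < n < n_cap and t < t_max:
        t += random.expovariate(n * (birth + death))   # time to the next event
        if random.random() < birth / (birth + death):
            n += 1
            divisions += 1                             # proliferation event
            if n > max_n:
                max_n, t_of_max, div_at_max = n, t, divisions
        else:
            n -= 1                                     # death event
    return {"extinct": n == 0, "max_size": max_n,
            "time_to_max": t_of_max, "divisions_to_max": div_at_max}

# Monte Carlo estimates of the descriptors; for n0 = 1 and birth > death the
# extinction probability is roughly death/birth, and the distribution of the
# maximum clonal size is bimodal (extinct clones vs. established ones).
runs = [simulate_clonotype() for _ in range(2_000)]
p_extinct = sum(r["extinct"] for r in runs) / len(runs)
print(p_extinct, max(r["max_size"] for r in runs))
```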
Abstract:
Foundational cellular immunology research of the 1960s and 1970s, together with the advent of monoclonal antibodies and flow cytometry, provided the knowledge base and the technological capability that enabled the elucidation of the role of CD4 T cells in HIV infection. Research identifying the sources and magnitude of variation in CD4 measurements, standardized reagents and protocols, and the development of clinical flow cytometers all contributed to the feasibility of widespread CD4 testing. Cohort studies and clinical trials provided the context for establishing the utility of CD4 for prognosis in HIV-infected persons, initial assessment of in vivo antiretroviral drug activity, and as a surrogate marker for clinical outcome in antiretroviral therapeutic trials. Even with sensitive HIV viral load measurement, CD4 cell counting is still utilized in determining antiretroviral therapy eligibility and the time to initiate therapy. New point-of-care technologies are helping both to lower the cost of CD4 testing and to enable its use in HIV test-and-treat programs around the world.