149 results for Computer systems organization: general-emerging technologies
Abstract:
This paper reviews innovation activity in a key service industry: road and bridge construction. Based on a large-scale Australian survey and descriptive statistics, the paper finds that there is little difference in innovation levels between different types of industry participants and that innovation is difficult to implement. The survey gathered responses from suppliers, consultants, contractors and clients, and compared results across these four industry sectors. The absorptive and relationship capacities of respondents were also investigated. One in five respondents had poor absorptive capacity. Suppliers were found to be the most effective learners and the best adopters of ideas from outside their organisations, while consultants were the least effective. Australian construction organisations have relatively good relationship skills because relationship-based contracts are common compared to other countries. Indeed, the survey found that nearly 60% of respondents had experience with such contracts, with clients having more experience than the other three sectors. The results have implications for the measurement of innovation in project-based industries and for the relative roles of clients and suppliers in driving innovation in the construction industry. Further research will examine the extent to which particular governance mechanisms within relationship contracts lead to improved innovation and project performance.
Abstract:
The world’s increasing complexity, competitiveness, interconnectivity, and dependence on technology generate new challenges for nations and individuals that cannot be met by continuing education as usual. With the proliferation of complex systems have come new technologies for communication, collaboration, and conceptualisation. These technologies have led to significant changes in the forms of mathematical and scientific thinking required beyond the classroom. Modelling, in its various forms, can develop and broaden students’ mathematical and scientific thinking beyond the standard curriculum. This chapter first considers future competencies in the mathematical sciences within an increasingly complex world. Consideration is then given to interdisciplinary problem solving and models and modelling, as one means of addressing these competencies. Illustrative case studies involving complex, interdisciplinary modelling activities in Years 1 and 7 are presented.
Abstract:
To enhance the performance of the k-nearest neighbors approach in forecasting short-term traffic volume, this paper proposes and tests a two-step approach capable of forecasting multiple steps ahead. In selecting the k nearest neighbors, a time-constraint window is introduced, and local minima of the distances between state vectors are then ranked to avoid overlaps among candidates. Moreover, to control the undesirable impact of extreme values, a novel algorithm with attractive analytical features is developed based on the principal component. The enhanced KNN method has been evaluated using field data, and our comparison analysis shows that it outperformed the competing algorithms in most cases.
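A minimal sketch of the neighbour-selection idea described in this abstract, assuming a fixed number of observations per day; the time-constraint window and the ranking of local distance minima follow the text, but all names and parameter values are illustrative rather than the authors' implementation:

```python
import numpy as np

def knn_forecast(history, state, lag=4, k=5, window=12, per_day=96):
    """history: 1-D array of past volumes; state: the last `lag` observations."""
    n = len(history)
    dists = np.full(n, np.inf)
    for t in range(lag, n):
        # State vector ending just before time t; history[t] is its successor.
        dists[t] = np.linalg.norm(history[t - lag:t] - state)
    # Time-constraint window: keep only candidates whose time-of-day index
    # lies close to "now" (assuming per_day observations per day).
    now = n % per_day
    tod = np.arange(n) % per_day
    dists[np.abs(tod - now) > window] = np.inf
    # Rank local minima of the distance sequence so that selected neighbours
    # do not overlap (near-duplicate shifted windows are skipped).
    is_min = np.r_[False,
                   (dists[1:-1] <= dists[:-2]) & (dists[1:-1] <= dists[2:]),
                   False]
    candidates = np.where(is_min & np.isfinite(dists))[0]
    nearest = candidates[np.argsort(dists[candidates])][:k]
    # One-step forecast: mean of the successors of the k nearest neighbours.
    return history[nearest].mean()
```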
Abstract:
The Remote Sensing Core Curriculum (RSCC) was initiated in 1993 to meet the demands for a college-level set of resources to enhance the quality of education across national and international campuses. The American Society of Photogrammetry and Remote Sensing adopted the RSCC in 1996 to sustain support of this educational initiative for its membership and collegiate community. A series of volumes, containing lectures, exercises, and data, is being created by expert contributors to address the different technical fields of remote sensing. The RSCC program is designed to operate on the Internet, taking full advantage of World Wide Web (WWW) technology for distance learning. The issues of curriculum development related to the educational setting, with demands on faculty, students, and facilities, are considered to understand the new paradigms for WWW-influenced computer-aided learning. The WWW is shown to be especially appropriate for facilitating remote sensing education, with requirements for addressing image data sets and multimedia learning tools. The RSCC is located at http://www.umbc.edu/rscc.
Abstract:
Loop detectors are the oldest and most widely used traffic data source. On urban arterials, they are mainly installed for signal control. Recently, state-of-the-art Bluetooth MAC Scanners (BMS) have captured significant interest from stakeholders for area-wide traffic monitoring. Loop detectors provide flow, a fundamental traffic parameter, whereas BMS provide individual vehicle travel times between BMS stations. Hence, these two data sources complement each other and, if integrated, should increase the accuracy and reliability of traffic state estimation. This paper proposes a model that integrates loop and BMS data for seamless travel time and density estimation on urban signalised networks. The proposed model is validated using both real and simulated data, and the results indicate that its accuracy is over 90%.
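The paper's integration model is not specified in the abstract; as a hedged illustration of why the two sources complement each other, the fundamental relation flow = density × speed lets loop-measured flow and BMS-derived space-mean speed yield a density estimate. The function and values below are purely illustrative:

```python
def estimate_state(loop_flow_veh_per_h, bms_travel_time_s, link_length_m):
    # Space-mean speed from the BMS travel time over the link, in km/h.
    v = (link_length_m / bms_travel_time_s) * 3.6
    # Density from the fundamental relation q = k * v, in veh/km.
    k = loop_flow_veh_per_h / v
    return v, k

v, k = estimate_state(loop_flow_veh_per_h=900, bms_travel_time_s=120,
                      link_length_m=1000)
print(f"speed = {v:.1f} km/h, density = {k:.1f} veh/km")
```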
Abstract:
Whole System Design is increasingly being seen as one of the most cost-effective ways to both increase the productivity and reduce the negative environmental impacts of an engineered system. A focus on design is critical, as the output from this stage of a project locks in most of the economic and environmental performance of the designed system throughout its life, which can span from a few years to many decades. Indeed, it is now widely acknowledged that all designers – particularly engineers, architects and industrial designers – need to be able to understand and implement a whole system design approach. This book provides a clear design methodology, based on leading efforts in the field, and is supported by worked examples that demonstrate how advances in energy, materials and water productivity can be achieved through applying an integrated approach to sustainable engineering. Chapters 1–5 outline the approach and explain how it can be implemented to enhance the established Systems Engineering framework. Chapters 6–10 demonstrate, through detailed worked examples, the application of the approach to industrial pumping systems, passenger vehicles, electronics and computer systems, temperature control of buildings, and domestic water systems.
Abstract:
We introduce a new mechanism for the propulsion and separation by chirality of small ferromagnetic particles suspended in a liquid. Under the action of a uniform dc magnetic field H and an ac electric field E, isomers with opposite chirality move in opposite directions. Such a mechanism could have a significant impact on a wide range of emerging technologies. The component of the chiral velocity that is odd in H is found to be proportional to the intrinsic orbital and spin angular momentum of the magnetized electrons. This effect arises because a ferromagnetic particle responds to the applied torque as a small gyroscope. © 2012 American Physical Society.
Abstract:
Over the past decades there has been considerable development in the modeling of car-following (CF) behavior as a result of research undertaken by both traffic engineers and traffic psychologists. While traffic engineers seek to understand the behavior of a traffic stream, traffic psychologists seek to describe the human abilities and errors involved in the driving process. This paper provides a comprehensive review of these two research streams. It is necessary to consider human factors in CF modeling for a more realistic representation of CF behavior in complex driving situations (for example, in traffic breakdowns, crash-prone situations, and adverse weather conditions), to improve traffic safety, and to better understand widely reported puzzling traffic flow phenomena such as capacity drop, stop-and-go oscillations, and traffic hysteresis. While there are some excellent reviews of CF models available in the literature, none of these specifically focuses on the human factors in these models. This paper addresses this gap by reviewing the available literature with a specific focus on the latest advances in car-following models from both the engineering and human behavior points of view. In so doing, it analyses the benefits and limitations of various models and highlights future research needs in the area.
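As a concrete example of the engineering car-following models surveyed here, the widely used Intelligent Driver Model computes a follower's acceleration from its speed, the gap to the leader, and the approach rate; human-factor extensions of the kind reviewed typically perturb these perceived inputs. Parameter values below are illustrative defaults, not taken from the paper:

```python
import math

def idm_acceleration(v, s, dv, v0=33.3, T=1.5, a=1.0, b=2.0, s0=2.0, delta=4):
    """Intelligent Driver Model. v: speed (m/s); s: gap to leader (m);
    dv: approach rate (m/s); v0: desired speed; T: desired time headway (s);
    a: max acceleration; b: comfortable deceleration; s0: minimum gap (m)."""
    s_star = s0 + v * T + v * dv / (2 * math.sqrt(a * b))  # desired gap
    return a * (1 - (v / v0) ** delta - (s_star / s) ** 2)

# A follower at 20 m/s, 30 m behind a leader, closing at 2 m/s: the model
# returns a negative value, i.e. the follower brakes.
print(idm_acceleration(v=20.0, s=30.0, dv=2.0))
```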
Abstract:
A sound understanding of travellers’ behavioural changes and adaptation when facing a natural disaster is a key factor in efficiently and effectively managing transport networks at such times. This study specifically investigates the importance of travel/traffic information and its impact on travel behaviour during natural disasters. Using the 2011 Brisbane flood as a case study, survey respondents’ perceptions of the importance of travel/traffic information before, during, and after the flood were modelled using a random-effects ordered logit model. A hysteresis phenomenon was observed: respondents’ perceptions of the importance of travel/traffic information increased during the flood, and although the perceived importance decreased after the flood, it did not return to the pre-flood level. Results also reveal that socio-demographic features (such as gender and age) have a significant impact on respondents’ perceptions of the importance of travel/traffic information. The roles of travel time and safety in a respondent’s trip planning are also significantly correlated with their perception of the importance of this information. The analysis further shows that during the flood, respondents generally thought that travel/traffic information was important, and adjusted their travel plans according to the information received. When controlling for other factors, the estimated odds of changing routes and cancelling trips for a respondent who thought that travel/traffic information was important are respectively about three times and seven times the estimated odds for a respondent who thought that travel/traffic information was not important. In contrast, after the flood, the influence of travel/traffic information on respondents’ travel behaviour diminishes. Finally, the analysis shows no evidence of an influence of travel/traffic information on respondents’ travel mode; this indicates that inducing travel mode change is a challenging task.
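A small sketch of how the reported "three times" and "seven times" odds relate to ordered-logit coefficients: for a binary indicator, the odds ratio is exp(beta). The coefficients below are invented for illustration, not estimates from the study:

```python
import math

# Hypothetical coefficients on an "information perceived as important" dummy.
beta_change_route = 1.10
beta_cancel_trip = 1.95
print(math.exp(beta_change_route))  # ~3: odds multiplier for changing routes
print(math.exp(beta_cancel_trip))   # ~7: odds multiplier for cancelling trips
```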
Abstract:
Smart Card Automated Fare Collection (AFC) data has been extensively exploited to understand passenger behavior, passenger segmentation, and trip purpose, and to improve transit planning through spatial travel pattern analysis. The literature has been evolving from simple to more sophisticated methods, such as from aggregated to individual travel pattern analysis, and from stop-to-stop to flexible stop aggregation. However, high computing complexity has limited these methods in practical applications. This paper proposes a new algorithm named Weighted Stop Density Based Scanning Algorithm with Noise (WS-DBSCAN), based on the classical Density Based Scanning Algorithm with Noise (DBSCAN) algorithm, to detect and update daily changes in travel patterns. WS-DBSCAN converts the classical quadratic-complexity DBSCAN computation into a problem of sub-quadratic complexity. A numerical experiment using real AFC data from South East Queensland, Australia shows that the algorithm requires only 0.45% of the computation time of classical DBSCAN while providing the same clustering results.
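WS-DBSCAN itself is the authors' contribution and is not reproduced here; as a rough approximation of the weighted-stop idea, scikit-learn's classical DBSCAN accepts per-sample weights, so repeated boardings at a stop can enter as one weighted point instead of many duplicates. Coordinates, eps, and weights below are illustrative:

```python
import numpy as np
from sklearn.cluster import DBSCAN

stops = np.array([[153.02, -27.47],   # stop coordinates (lon, lat)
                  [153.03, -27.47],
                  [153.10, -27.50]])
boardings = np.array([42, 17, 3])     # boardings per stop used as weights

# A sample whose total weighted neighbourhood reaches min_samples is a core
# point, so heavily used stops form clusters without duplicating rows.
db = DBSCAN(eps=0.02, min_samples=20)
labels = db.fit(stops, sample_weight=boardings).labels_
print(labels)   # -1 marks noise, i.e. stops outside any dense travel pattern
```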
Abstract:
Map-matching algorithms that utilise road segment connectivity along with other data (i.e. position, speed and heading) in the process of map-matching are normally suitable for high-frequency (1 Hz or higher) positioning data from GPS. When such map-matching algorithms are applied to low-frequency data (such as data from a fleet of private cars, buses or light-duty vehicles, or from smartphones), their performance drops to around 70% in terms of correct link identification, especially in urban and suburban road networks. This level of performance may be insufficient for some real-time Intelligent Transport System (ITS) applications and services, such as estimating link travel time and speed from low-frequency GPS data. Therefore, this paper develops a new weight-based shortest-path and vehicle-trajectory-aided map-matching (stMM) algorithm that enhances the map-matching of low-frequency positioning data on a road map. The well-known A* search algorithm is employed to derive the shortest path between two points while taking into account both link connectivity and turn restrictions at junctions. In the developed stMM algorithm, two additional weights related to the shortest path and the vehicle trajectory are considered: one shortest-path-based weight relates the distance along the shortest path to the distance along the vehicle trajectory, while the other is associated with the heading difference of the vehicle trajectory. The developed stMM algorithm is tested using a series of real-world datasets of varying frequencies (i.e. 1 s, 5 s, 30 s, and 60 s sampling intervals). A high-accuracy integrated navigation system (a high-grade inertial navigation system and a carrier-phase GPS receiver) is used to measure the accuracy of the developed algorithm. The results suggest that the algorithm identifies 98.9% of the links correctly for 30 s GPS data. Omitting the information from the shortest path and vehicle trajectory, the accuracy of the algorithm reduces to about 73% in terms of correct link identification. The algorithm can process on average 50 positioning fixes per second, making it suitable for real-time ITS applications and services.
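A minimal sketch, under assumed names and illustrative scoring functions, of the two extra weights described above: the shortest path between consecutive fixes is derived with A* (here via networkx), its length is compared with the distance along the vehicle trajectory, and the heading difference is scored separately:

```python
import math
import networkx as nx

def shortest_path_weight(G, node_a, node_b, trajectory_length_m):
    """Score in (0, 1]: highest when the shortest-path length between two
    candidate points matches the distance along the vehicle trajectory."""
    path = nx.astar_path(G, node_a, node_b, weight="length")
    sp_len = sum(G[u][v]["length"] for u, v in zip(path, path[1:]))
    return 1.0 - abs(sp_len - trajectory_length_m) / max(sp_len, trajectory_length_m)

def heading_weight(vehicle_heading_deg, link_heading_deg):
    """Score in [-1, 1]: 1 when the vehicle heading matches the link heading."""
    diff = abs(vehicle_heading_deg - link_heading_deg) % 360.0
    diff = min(diff, 360.0 - diff)        # wrap to [0, 180]
    return math.cos(math.radians(diff))

# Toy network: two nodes joined by a 500 m link.
G = nx.Graph()
G.add_edge("A", "B", length=500.0)
print(shortest_path_weight(G, "A", "B", trajectory_length_m=520.0))
print(heading_weight(87.0, 90.0))
```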
Abstract:
Patent law is a regime of intellectual property that provides exclusive rights over scientific inventions that are novel, inventive, and useful. There has been much debate over the limits of patentable subject matter relating to emerging technologies. The Supreme Court of the US has sought to rein in the expansive interpretation of patentability by lower courts in a series of cases dealing with medical information (Prometheus), finance (Bilski), and gene patents (Myriad). This has led to a reinvigoration of the debate over the boundaries of patentable subject matter. There has also been controversy about the rise in patenting of geoengineering, particularly by firms such as Intellectual Ventures.
Abstract:
Monitoring pedestrian and cyclist movement is an important area of research in transport, crowd safety, urban design and human behaviour assessment. Media Access Control (MAC) address data has recently been used as a potential source of information for extracting features of people’s movement. MAC addresses are unique identifiers of the WiFi and Bluetooth wireless technologies in smart electronic devices such as mobile phones, laptops and tablets. The unique number of each WiFi and Bluetooth MAC address can be captured and stored by MAC address scanners. MAC address data in fact allows for unannounced, non-participatory tracking of people. The use of MAC data for tracking people has recently been the focus of applications in mass events, shopping centres, airports, train stations, etc. For travel time estimation, setting up a scanner with a high antenna gain is usually recommended on highways and main roads to track vehicle movements, whereas high gains can have drawbacks in the case of pedestrians and cyclists. Pedestrians and cyclists mainly move in built environments and city pathways where there is significant noise from other fixed WiFi and Bluetooth devices. High antenna gains cover wide areas, which results in scanning more samples from pedestrians’ and cyclists’ MAC devices. However, anomalies (such as fixed devices) may be captured, which increases the complexity and processing time of the data analysis. On the other hand, low-gain antennas produce fewer anomalies in the data, but at the cost of a lower overall sample size of pedestrian and cyclist data. This paper studies the effect of antenna characteristics on MAC address data for the travel-time estimation of pedestrians and cyclists. The results of the empirical case study compare the effects of low and high antenna gains in order to suggest an optimal setup for increasing the accuracy of pedestrian and cyclist travel-time estimation.
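As a hedged back-of-the-envelope illustration of why antenna gain changes the scan radius, the free-space path-loss model gives a maximum detection distance that grows with receiver gain; real urban propagation around pedestrians and cyclists is far messier than free space, and all power levels below are assumptions:

```python
import math

def max_range_m(tx_power_dbm, rx_gain_dbi, sensitivity_dbm, freq_hz=2.4e9):
    # Free-space path loss: FSPL(dB) = 20*log10(d) + 20*log10(f) - 147.55.
    # Solve FSPL = link budget for the distance d at which a device is
    # still detectable.
    budget_db = tx_power_dbm + rx_gain_dbi - sensitivity_dbm
    return 10 ** ((budget_db - 20 * math.log10(freq_hz) + 147.55) / 20)

# Detection range (m) for three assumed receiver antenna gains at 2.4 GHz.
for gain_dbi in (3, 9, 15):
    print(gain_dbi, round(max_range_m(tx_power_dbm=4, rx_gain_dbi=gain_dbi,
                                      sensitivity_dbm=-90)))
```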
Abstract:
Traffic incidents are recognised as one of the key sources of non-recurrent congestion, which often leads to reductions in travel time reliability (TTR), a key metric of roadway performance. A method is proposed here to quantify the impacts of traffic incidents on TTR on freeways. The method uses historical data to establish recurrent speed profiles and identifies non-recurrent congestion based on its negative impact on speeds. The locations and times of incidents are used to identify incidents among non-recurrent congestion events. Buffer time is employed to measure TTR. Extra buffer time is defined as the extra delay caused by traffic incidents. This reliability measure indicates how much extra travel time travellers require to arrive at their destination on time with 95% certainty in the case of an incident, over and above the travel time that would have been required under recurrent conditions. An extra buffer time index (EBTI) is defined as the ratio of extra buffer time to recurrent travel time, with zero being the best case (no delay). A Tobit model is used to identify and quantify the factors that affect EBTI, using a selected freeway segment in the South East Queensland, Australia network. Both fixed- and random-parameter Tobit specifications are tested. The estimation results reveal that models with random parameters offer a superior statistical fit for all types of incidents, suggesting the presence of unobserved heterogeneity across segments. The factors that influence EBTI depend on the type of incident. In addition, changes in TTR as a result of traffic incidents are related to the characteristics of the incidents (multiple vehicles involved, incident duration, major incidents, etc.) and to traffic characteristics.
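A minimal sketch of the EBTI computation, assuming buffer time is the gap between the 95th-percentile and the recurrent (here, median) travel time; the travel times below are invented for illustration:

```python
import numpy as np

recurrent_tt = np.array([300, 310, 305, 295, 315])   # seconds, no incident
incident_tt = np.array([320, 410, 520, 380, 600])    # seconds, incident days

recurrent = np.median(recurrent_tt)                  # recurrent travel time
buffer_recurrent = np.percentile(recurrent_tt, 95) - recurrent
buffer_incident = np.percentile(incident_tt, 95) - recurrent
# Extra buffer time relative to recurrent travel time; 0 means no extra delay.
ebti = (buffer_incident - buffer_recurrent) / recurrent
print(round(ebti, 2))
```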
Abstract:
Recent advancements in the area of organic polymer applications demand novel and advanced materials with desirable surface, optical and electrical properties for use in emerging technologies. This study examines the fabrication and characterization of polymer thin films from the non-synthetic Terpinen-4-ol monomer using radio frequency plasma polymerization. The optical properties, thickness and roughness of the thin films were studied in the wavelength range 200–1000 nm using ellipsometry. Polymer thin films with thicknesses from 100 nm to 1000 nm were fabricated, and the films exhibited smooth and defect-free surfaces. At 500 nm wavelength, the refractive index and extinction coefficient were found to be 1.55 and 0.0007, respectively. The energy gap was estimated to be 2.67 eV, a value falling within the semiconducting Eg region. The obtained optical and surface properties of Terpinen-4-ol based films substantiate their candidacy as promising low-cost materials with potential applications in the electronics, optics, and biomedical industries.
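A back-of-the-envelope sketch relating the quantities reported above: the absorption coefficient follows from the extinction coefficient as alpha = 4*pi*k/lambda, and an optical gap such as the quoted 2.67 eV is typically read off a Tauc plot of (alpha*h*nu)^n versus photon energy, which requires the full spectrum not given here. Only the single-wavelength alpha step is reproduced:

```python
import math

wavelength_nm = 500.0
k = 0.0007                                   # extinction coefficient (from text)
# Absorption coefficient alpha = 4*pi*k / lambda, in 1/m.
alpha = 4 * math.pi * k / (wavelength_nm * 1e-9)
# Photon energy E = hc / lambda, with hc = 1239.84 eV*nm.
photon_energy_ev = 1239.84 / wavelength_nm
print(f"alpha = {alpha:.0f} 1/m at {photon_energy_ev:.2f} eV")
```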