959 results for Emerging Technologies Roundtable


Relevance:

80.00%

Publisher:

Abstract:

Map-matching algorithms that utilise road segment connectivity along with other data (i.e. position, speed and heading) in the process of map-matching are normally suitable for high frequency (1 Hz or higher) positioning data from GPS. When such map-matching algorithms are applied to low frequency data (such as data from a fleet of private cars, buses, light duty vehicles or smartphones), their performance drops to around 70% correct link identification, especially in urban and suburban road networks. This level of performance may be insufficient for some real-time Intelligent Transport System (ITS) applications and services, such as estimating link travel time and speed from low frequency GPS data. Therefore, this paper develops a new weight-based shortest path and vehicle trajectory aided map-matching (stMM) algorithm that enhances the map-matching of low frequency positioning data on a road map. The well-known A* search algorithm is employed to derive the shortest path between two points while taking into account both link connectivity and turn restrictions at junctions. In the developed stMM algorithm, two additional weights related to the shortest path and vehicle trajectory are considered: one weight relates the distance along the shortest path to the distance along the vehicle trajectory, while the other is associated with the heading difference of the vehicle trajectory. The developed stMM algorithm is tested using a series of real-world datasets of varying frequencies (i.e. 1 s, 5 s, 30 s and 60 s sampling intervals). A high-accuracy integrated navigation system (a high-grade inertial navigation system and a carrier-phase GPS receiver) is used to measure the accuracy of the developed algorithm. The results suggest that the algorithm identifies 98.9% of the links correctly for GPS data sampled every 30 s. Omitting the information from the shortest path and vehicle trajectory reduces the accuracy of the algorithm to about 73% correct link identification. The algorithm can process on average 50 positioning fixes per second, making it suitable for real-time ITS applications and services.
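
As an illustration of the two additional weights described above, here is a minimal, hypothetical sketch (not the authors' implementation) of how a candidate link might be scored: one term compares the distance along the shortest path with the distance along the vehicle trajectory, and the other penalises the heading difference. All function names, weightings and parameter values are assumptions.

```python
import math

def stmm_weights(d_shortest_path, d_trajectory, link_bearing_deg,
                 vehicle_heading_deg, w_sp=0.5, w_h=0.5):
    """Score a candidate link with two stMM-style weights:
    (1) agreement between distance along the shortest path and distance
        along the vehicle trajectory, and
    (2) heading difference between the vehicle trajectory and the link.
    Both terms are mapped to [0, 1]; higher means a better match.
    """
    # Shortest-path weight: penalise disagreement between the two distances.
    dist_term = 1.0 / (1.0 + abs(d_shortest_path - d_trajectory))

    # Heading weight: cosine of the smallest angle between the two bearings.
    diff = abs(link_bearing_deg - vehicle_heading_deg) % 360.0
    diff = min(diff, 360.0 - diff)
    heading_term = max(0.0, math.cos(math.radians(diff)))

    return w_sp * dist_term + w_h * heading_term

# Example: a candidate link whose shortest-path distance closely matches the
# trajectory length and whose bearing is 10 degrees off the vehicle heading.
print(stmm_weights(d_shortest_path=152.0, d_trajectory=148.5,
                   link_bearing_deg=85.0, vehicle_heading_deg=95.0))
```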

Relevance:

80.00%

Publisher:

Abstract:

Purpose: The purpose of this research is to explore the idea of the participatory library in higher education settings. This research aims to address the question: what is a participatory university library? Design/methodology/approach: A grounded theory approach was adopted. In-depth individual interviews were conducted with two diverse groups of participants: ten library staff members and six library users. Data collection and analysis were carried out simultaneously and complied with Straussian grounded theory principles and techniques. Findings: Three core categories representing the participatory library were found: "community", "empowerment", and "experience". Each category was thoroughly delineated via sub-categories, properties, and dimensions that together create a foundation for the participatory library. A participatory library model was also developed, together with an explanation of its building blocks, providing a deeper understanding of the participatory library phenomenon. Research limitations: The research focuses on a specific library system, i.e., academic libraries. Therefore, the results may not be readily applicable to public, special, and school library contexts. Originality/value: This is the first empirical study to develop a participatory library model. It provides librarians, library managers, researchers, library students, and the wider library community with a holistic picture of the contemporary library.

Relevance:

80.00%

Publisher:

Abstract:

In 2009, the National Research Council of the National Academies released a report on A New Biology for the 21st Century. The council preferred the term ‘New Biology’ to capture the convergence and integration of the various disciplines of biology. The National Research Council stressed: ‘The essence of the New Biology, as defined by the committee, is integration—re-integration of the many sub-disciplines of biology, and the integration into biology of physicists, chemists, computer scientists, engineers, and mathematicians to create a research community with the capacity to tackle a broad range of scientific and societal problems.’ They define the ‘New Biology’ as ‘integrating life science research with physical science, engineering, computational science, and mathematics’. The National Research Council reflected: 'Biology is at a point of inflection. Years of research have generated detailed information about the components of the complex systems that characterize life – genes, cells, organisms, ecosystems – and this knowledge has begun to fuse into greater understanding of how all those components work together as systems. Powerful tools are allowing biologists to probe complex systems in ever greater detail, from molecular events in individual cells to global biogeochemical cycles. Integration within biology and increasingly fruitful collaboration with physical, earth, and computational scientists, mathematicians, and engineers are making it possible to predict and control the activities of biological systems in ever greater detail.' The National Research Council contended that the New Biology could address a number of pressing challenges. First, it stressed that the New Biology could ‘generate food plants to adapt and grow sustainably in changing environments’. Second, the New Biology could ‘understand and sustain ecosystem function and biodiversity in the face of rapid change’. Third, the New Biology could ‘expand sustainable alternatives to fossil fuels’. Moreover, it was hoped that the New Biology could lead to a better understanding of individual health: ‘The New Biology can accelerate fundamental understanding of the systems that underlie health and the development of the tools and technologies that will in turn lead to more efficient approaches to developing therapeutics and enabling individualized, predictive medicine.’ Biological research has certainly been changing direction in response to changing societal problems. Over the last decade, increasing awareness of the impacts of climate change and of dwindling fossil fuel supplies has generated investment in fields such as biofuels, climate-ready crops and the storage of agricultural genetic resources. In considering biotechnology’s role in the twenty-first century, the biotechnology futurist Carlson’s firm Biodesic states: ‘The problems the world faces today – ecosystem responses to global warming, geriatric care in the developed world or infectious diseases in the developing world, the efficient production of more goods using less energy and fewer raw materials – all depend on understanding and then applying biology as a technology.’ This collection considers the roles of intellectual property law in regulating emerging technologies in the biological sciences.
Stephen Hilgartner comments that patent law plays a significant part in social negotiations about the shape of emerging technological systems or artefacts: 'Emerging technology – especially in such hotbeds of change as the life sciences, information technology, biomedicine, and nanotechnology – became a site of contention where competing groups pursued incompatible normative visions. Indeed, as people recognized that questions about the shape of technological systems were nothing less than questions about the future shape of societies, science and technology achieved central significance in contemporary democracies. In this context, states face ongoing difficulties trying to mediate these tensions and establish mechanisms for addressing problems of representation and participation in the sociopolitical process that shapes emerging technology.' The introduction to the collection will provide a thumbnail comparative overview of recent developments in intellectual property and biotechnology – as a foundation for the collection. Section I of this introduction considers recent developments in United States patent law, policy and practice with respect to biotechnology – in particular, highlighting the Myriad Genetics dispute and the decision of the Supreme Court of the United States in Bilski v. Kappos. Section II considers the cross-currents in Canadian jurisprudence in intellectual property and biotechnology. Section III surveys developments in the European Union – and the interpretation of the European Biotechnology Directive. Section IV focuses upon Australia and New Zealand, and considers the policy responses to the controversy over Genetic Technologies Limited’s patents in respect of non-coding DNA and genomic mapping. Section V outlines the parts of the collection and the contents of the chapters.

Relevance:

80.00%

Publisher:

Abstract:

Patent law is a regime of intellectual property that provides exclusive rights over scientific inventions which are novel, inventive, and useful. There has been much debate over the limits of patentable subject matter relating to emerging technologies. The Supreme Court of the United States has sought to rein in the expansive interpretation of patentability by lower courts in a series of cases dealing with medical information (Prometheus), finance (Bilski), and gene patents (Myriad). This has led to a reinvigoration of the debate over the boundaries of patentable subject matter. There has also been controversy about the rise in the patenting of geoengineering - particularly by firms such as Intellectual Ventures.

Relevance:

80.00%

Publisher:

Abstract:

In recent years, rapid advances in information technology have led to a variety of data collection systems that are enriching the sources of empirical data for use in transport systems. Currently, traffic data are collected through various sensors, including loop detectors, probe vehicles, cell phones, Bluetooth, video cameras, remote sensing and public transport smart cards. It has been argued that combining the complementary information from multiple sources generally results in better accuracy, increased robustness and reduced ambiguity. Although there have been substantial advances in data assimilation techniques to reconstruct and predict the traffic state from multiple data sources, such methods are generally data-driven and do not fully utilize the power of traffic models. Furthermore, the existing methods are still limited to freeway networks and are not yet applicable in the urban context due to the increased complexity of the flow behavior. The main traffic phenomena on urban links are generally caused by the boundary conditions at intersections, signalized or unsignalized, at which the switching of the traffic lights and the turning maneuvers of road users lead to shock-wave phenomena that propagate upstream of the intersections. This paper develops a new model-based methodology to build a real-time traffic prediction model for arterial corridors using data from multiple sources, particularly loop detectors and partial observations from Bluetooth and GPS devices.
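
To make the multi-source idea concrete, below is a minimal, hypothetical sketch of fusing two measurement streams (loop-detector speeds and Bluetooth-derived speeds) into a single link-speed estimate with a scalar Kalman filter. This illustrates generic data fusion only, not the paper's model-based method, which additionally embeds a traffic-flow model for arterial corridors; all names and noise values are assumptions.

```python
# A scalar Kalman filter fusing two noisy speed measurement streams.
def fuse_link_speed(z_loop, z_bt, r_loop=4.0, r_bt=9.0, q=1.0):
    """State = link speed; z_* are per-interval readings (None if missing)."""
    x, p = z_loop[0], 10.0           # initialise from the first loop reading
    estimates = []
    for zl, zb in zip(z_loop, z_bt):
        p += q                       # predict: random-walk speed dynamics
        for z, r in ((zl, r_loop), (zb, r_bt)):  # sequential measurement updates
            if z is not None:        # either source may miss an interval
                k = p / (p + r)      # Kalman gain
                x += k * (z - x)
                p *= (1 - k)
        estimates.append(x)
    return estimates

# Loop detector reports every interval; Bluetooth misses the second one.
print(fuse_link_speed([42.0, 40.5, 25.0, 23.5],
                      [39.0, None, 27.5, 24.0]))
```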

Relevance:

80.00%

Publisher:

Abstract:

Monitoring pedestrian and cyclist movements is an important area of research in transport, crowd safety, urban design and human behaviour assessment. Media Access Control (MAC) address data have recently been used as a potential source of information for extracting features of people's movements. MAC addresses are unique identifiers of the WiFi and Bluetooth wireless interfaces in smart electronic devices such as mobile phones, laptops and tablets. The unique number of each WiFi and Bluetooth MAC address can be captured and stored by MAC address scanners. MAC address data thus allow for unannounced, non-participatory tracking of people. Such tracking has recently been applied in mass events, shopping centres, airports, train stations and similar settings. For travel time estimation on highways and main roads, setting up a scanner with a high antenna gain is usually recommended to track vehicle movements, whereas high gains can have drawbacks for pedestrians and cyclists. Pedestrians and cyclists mainly move in built environments and city pathways where there is significant noise from other, fixed WiFi and Bluetooth devices. High antenna gains cover wide areas, which yields more samples from pedestrians' and cyclists' MAC devices; however, anomalies (such as fixed devices) may also be captured, increasing the complexity and processing time of the data analysis. On the other hand, low gain antennas admit fewer anomalies into the data, but at the cost of a lower overall sample size of pedestrian and cyclist data. This paper studies the effect of antenna characteristics on MAC address data in terms of travel time estimation for pedestrians and cyclists. The results of the empirical case study compare the effects of low and high antenna gains in order to suggest an optimal setup for increasing the accuracy of pedestrian and cyclist travel time estimation.
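
The core travel-time idea can be sketched as follows: match MAC detections between two scanner sites and discard likely fixed devices, whose dwell time at one site is far longer than a passing person's would be. This is an illustrative sketch, not the paper's pipeline; the site names, dwell threshold and sample data are assumptions.

```python
# Estimate travel times between hypothetical scanner sites A and B.
def travel_times(detections_a, detections_b, max_dwell_s=600):
    """detections_*: lists of (mac, unix_timestamp) tuples from each scanner."""
    first_seen, last_seen = {}, {}
    for mac, t in detections_a:
        first_seen[mac] = min(first_seen.get(mac, t), t)
        last_seen[mac] = max(last_seen.get(mac, t), t)
    # Fixed WiFi/Bluetooth devices near scanner A linger far longer than a
    # passing pedestrian or cyclist; drop them as anomalies.
    moving = {m for m in first_seen
              if last_seen[m] - first_seen[m] <= max_dwell_s}
    arrivals_b = {}
    for mac, t in detections_b:
        if mac in moving:
            arrivals_b[mac] = min(arrivals_b.get(mac, t), t)
    # Travel time = first detection at B minus last detection at A.
    return {m: arrivals_b[m] - last_seen[m]
            for m in arrivals_b if arrivals_b[m] > last_seen[m]}

# "ff:00" dwells ~65 minutes at site A, so it is treated as a fixed device.
print(travel_times(
    [("aa:bb", 100), ("aa:bb", 130), ("ff:00", 90), ("ff:00", 4000)],
    [("aa:bb", 400), ("ff:00", 4100)]))
```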

Relevance:

80.00%

Publisher:

Abstract:

The deployment of new emerging technologies, such as cooperative systems, allows the traffic community to foresee relevant improvements in terms of traffic safety and efficiency. Autonomous vehicles are able to share information about the local traffic state in real time, which could result in a better reaction to the mechanism of traffic jam formation. An upstream single-hop radio broadcast network can improve the perception of each cooperative driver within a specific radio range and hence improve traffic stability. The impact of vehicle-to-vehicle cooperation on the onset of traffic congestion is investigated analytically and through simulation. A Next Generation Simulation (NGSIM) field dataset is used to calibrate the full velocity difference car-following model, and the MOBIL lane-changing model is implemented. The robustness of the calibration as well as the heterogeneity of the drivers is discussed. Assuming that congestion can be triggered either by the heterogeneity of drivers' behaviours or by abnormal lane-changing behaviours, the calibrated car-following model is used to assess the impact of a microscopic cooperative law on egoistic lane-changing behaviours. The cooperative law can help reduce and delay traffic congestion and can have a positive effect on safety indicators.
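
For reference, here is a minimal sketch of the full velocity difference (FVD) car-following law named above, a = κ(V(s) − v) + λΔv. The optimal-velocity function and all parameter values are illustrative assumptions, not the calibrated values obtained from the NGSIM dataset.

```python
import math

def optimal_velocity(gap_m, v_max=15.0, s_c=20.0):
    """Illustrative optimal-velocity function V(s), in m/s."""
    return 0.5 * v_max * (math.tanh(2.0 * gap_m / s_c - 2.0) + math.tanh(2.0))

def fvd_acceleration(gap_m, v_follower, v_leader, kappa=0.4, lam=0.5):
    """FVD model: a = kappa * (V(s) - v) + lambda * (v_leader - v_follower)."""
    return (kappa * (optimal_velocity(gap_m) - v_follower)
            + lam * (v_leader - v_follower))

# Example: follower at 10 m/s, 25 m behind a leader travelling at 8 m/s;
# the negative result means the follower decelerates.
print(fvd_acceleration(gap_m=25.0, v_follower=10.0, v_leader=8.0))
```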

Relevance:

80.00%

Publisher:

Abstract:

Stability analyses have been widely used to better understand the mechanism of traffic jam formation. In this paper, we consider the impact of cooperative systems (a.k.a. connected vehicles) on traffic dynamics and, more precisely, on flow stability. Cooperative systems are emerging technologies enabling communication between vehicles and/or with the infrastructure. In a distributed communication framework, equipped vehicles are able to send information to and receive information from other equipped vehicles. Here, the effects of cooperative traffic are modeled through a general bilateral multianticipative car-following law that improves cooperative drivers' perception of their surrounding traffic conditions within a given communication range. Linear stability analyses are performed for a broad class of car-following models. They point out different stability conditions in both multianticipative and nonmultianticipative situations. To better understand what happens under unstable conditions, the shock wave structure is studied in the weakly nonlinear regime by means of the reductive perturbation method. The shock wave equation is obtained for generic car-following models by deriving the Korteweg-de Vries (KdV) equations. We then derive traffic-state-dependent conditions for the sign of the solitary wave (soliton) amplitude. This analytical result is verified through simulations, which confirm the validity of the speed estimate. The variation of the soliton amplitude as a function of the communication range is also provided. The performed linear and weakly nonlinear analyses help justify the potential benefits of vehicle-integrated communication systems and provide new insights supporting the future implementation of cooperative systems.
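
As a pointer to the kind of derivation involved, the following is the standard linear (string) stability sketch for a generic, non-multianticipative car-following law; it illustrates the classical criterion, not the paper's bilateral multianticipative result.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Consider a generic car-following law $\dot{v}_n = f(s_n, \Delta v_n, v_n)$,
with gap $s_n = x_{n-1} - x_n$ and relative speed
$\Delta v_n = v_{n-1} - v_n$. Linearising about a uniform equilibrium
$(s^\ast, v^\ast)$ with $f(s^\ast, 0, v^\ast) = 0$, small position
perturbations $y_n$ obey
\[
  \ddot{y}_n = f_s\,(y_{n-1} - y_n)
             + f_{\Delta v}\,(\dot{y}_{n-1} - \dot{y}_n)
             + f_v\,\dot{y}_n ,
\]
where $f_s$, $f_{\Delta v}$, $f_v$ are partial derivatives evaluated at
equilibrium. A long-wavelength analysis yields the classical string
stability criterion
\[
  \tfrac{1}{2} f_v^2 - f_{\Delta v}\, f_v - f_s > 0 ,
\]
which for the optimal velocity model $f = a\,(V(s) - v)$ reduces to the
well-known condition $a > 2\,V'(s^\ast)$.
\end{document}
```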

Relevance:

80.00%

Publisher:

Abstract:

Major advances in power electronics in recent years have prompted considerable interest within the traction community. The capability of new technologies to reduce AC railway networks' effects on power quality and to improve their supply efficiency is expected to significantly decrease the cost of electric rail supply systems. Of particular interest are Static Frequency Converter (SFC), Rail Power Conditioner (RPC), High Voltage Direct Current (HVDC) and Energy Storage System (ESS) solutions. Substantial impacts on the future feasibility of railway electrification are anticipated. Aurizon, Australia's largest heavy haul railway operator, has recently commissioned the world's first 50 Hz/50 Hz SFC installation and is currently investigating SFC, RPC, HVDC and ESS solutions. This paper presents a summary of current and emerging technologies, with a particular focus on the potential techno-economic benefits.

Relevance:

80.00%

Publisher:

Abstract:

Big Data and predictive analytics have received significant attention from the media and the academic literature over the past few years, and it is likely that these emerging technologies will materially impact the mining sector. This short communication argues, however, that these technological forces will probably unfold differently in the mining industry than they have in many other sectors because of significant differences in the marginal cost of data capture and storage. To this end, we offer a brief overview of what Big Data and predictive analytics are, and explain how they are bringing about changes in a broad range of sectors. We discuss the "N=all" approach to data collection being promoted by many consultants and technology vendors in the marketplace but, by considering the economic and technical realities of data acquisition and storage, we then explain why an "n ≪ all" data collection strategy probably makes more sense for the mining sector. Finally, towards shaping the industry's policies with regard to technology-related investments in this area, we conclude by putting forward a conceptual model for leveraging Big Data tools and analytical techniques that is a more appropriate fit for the mining sector.

Relevance:

80.00%

Publisher:

Abstract:

Bread staling is a very complex phenomenon that is not yet completely understood. The present work explains how the electrical impedance spectroscopy technique can be utilized to investigate the effect of staling on the physicochemical properties of wheat bread during storage. An instrument based on the electrical impedance spectroscopy technique is developed to study the electrical properties of wheat bread at both its crumb and crust with the help of purpose-designed multi-channel ring electrodes. The electrical impedance behavior, mainly capacitance and resistance, of wheat bread at the crust and crumb during storage (up to 120 h) is investigated. The variation in capacitance revealed the glass transition phenomenon at room temperature in the bread crust after 96 h of storage, at 18% moisture content. The resistance changes in the bread crumb revealed starch recrystallization during staling.
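
For a sense of how capacitance and resistance might be extracted from such spectra, here is a hypothetical sketch that fits a parallel RC equivalent circuit, Z(ω) = R / (1 + jωRC), to synthetic impedance readings. The circuit model, values and fitting approach are assumptions for illustration, not the instrument described above.

```python
import numpy as np

def parallel_rc_impedance(freq_hz, r_ohm, c_farad):
    """Complex impedance of a parallel RC circuit."""
    w = 2 * np.pi * freq_hz
    return r_ohm / (1 + 1j * w * r_ohm * c_farad)

# Synthetic "measured" spectrum for a crumb-like sample (R = 50 kOhm, C = 2 nF).
freqs = np.logspace(2, 6, 50)
z_meas = parallel_rc_impedance(freqs, 5e4, 2e-9)

# Coarse grid-search fit (a real analysis would use nonlinear least squares).
best = min(((r, c) for r in np.linspace(1e4, 1e5, 40)
                   for c in np.linspace(5e-10, 5e-9, 40)),
           key=lambda p: np.sum(np.abs(z_meas
                                       - parallel_rc_impedance(freqs, *p))**2))
print("fitted R = %.3g ohm, C = %.3g F" % best)
```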

Relevance:

80.00%

Publisher:

Abstract:

Traffic incidents are recognised as one of the key sources of non-recurrent congestion, which often leads to a reduction in travel time reliability (TTR), a key metric of roadway performance. A method is proposed here to quantify the impacts of traffic incidents on TTR on freeways. The method uses historical data to establish recurrent speed profiles and identifies non-recurrent congestion based on its negative impact on speeds. The locations and times of incidents are used to identify incidents among non-recurrent congestion events. Buffer time is employed to measure TTR. Extra buffer time is defined as the extra delay caused by traffic incidents. This reliability measure indicates how much extra travel time travellers require to arrive at their destination on time with 95% certainty in the case of an incident, over and above the travel time that would have been required under recurrent conditions. An extra buffer time index (EBTI) is defined as the ratio of extra buffer time to recurrent travel time, with zero being the best case (no delay). A Tobit model is used to identify and quantify factors that affect EBTI, using a selected freeway segment in the Southeast Queensland network in Australia. Both fixed and random parameter Tobit specifications are tested. The estimation results reveal that models with random parameters offer a superior statistical fit for all types of incidents, suggesting the presence of unobserved heterogeneity across segments. Which factors influence EBTI depends on the type of incident. In addition, changes in TTR as a result of traffic incidents are related to the characteristics of the incidents (multiple vehicles involved, incident duration, major incidents, etc.) and to traffic characteristics.
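
One plausible operationalization of these measures, sketched below under the common definition of buffer time as the gap between the 95th-percentile and the recurrent travel time: the paper's exact computation may differ, and the sample values are hypothetical.

```python
import numpy as np

def ebti(recurrent_tt, incident_tt):
    """Extra buffer time index = extra buffer time / recurrent travel time."""
    recurrent_mean = np.mean(recurrent_tt)
    # Buffer time: 95th-percentile travel time minus recurrent travel time.
    buffer_recurrent = np.percentile(recurrent_tt, 95) - recurrent_mean
    buffer_incident = np.percentile(incident_tt, 95) - recurrent_mean
    # Extra buffer time: additional buffer attributable to the incident.
    extra_buffer = max(0.0, buffer_incident - buffer_recurrent)
    return extra_buffer / recurrent_mean   # zero is the best case (no delay)

recurrent = [300, 310, 305, 320, 315, 330]        # travel times in seconds
with_incident = [360, 420, 390, 450, 400, 520]
print(round(ebti(recurrent, with_incident), 2))
```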

Relevance:

80.00%

Publisher:

Abstract:

Recent advances in the area of organic polymer applications demand novel materials with desirable surface, optical and electrical properties for use in emerging technologies. This study examines the fabrication and characterization of polymer thin films from the non-synthetic Terpinen-4-ol monomer using radio frequency plasma polymerization. The optical properties, thickness and roughness of the thin films were studied over the wavelength range 200–1000 nm using ellipsometry. Polymer thin films with thicknesses from 100 nm to 1000 nm were fabricated, and the films exhibited smooth, defect-free surfaces. At a wavelength of 500 nm, the refractive index and extinction coefficient were found to be 1.55 and 0.0007, respectively. The energy gap was estimated to be 2.67 eV, a value falling within the semiconducting Eg range. The obtained optical and surface properties of the Terpinen-4-ol based films substantiate their candidacy as a promising low-cost material with potential applications in the electronics, optics, and biomedical industries.
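
One common way to estimate an optical band gap from ellipsometric extinction coefficients is a Tauc plot, sketched below under the assumption of an indirect-allowed transition, i.e. (αhν)^(1/2) linear in hν near the absorption edge, with α = 4πk/λ. Whether the study used this method is not stated, and the synthetic data are hypothetical, not the measured Terpinen-4-ol spectra.

```python
import numpy as np

H_EV_S = 4.1357e-15           # Planck constant, eV*s
C_M_S = 2.9979e8              # speed of light, m/s

def tauc_gap(wavelength_nm, k):
    """Band gap from the x-intercept of a linear Tauc-plot edge fit."""
    lam_m = np.asarray(wavelength_nm) * 1e-9
    alpha = 4 * np.pi * np.asarray(k) / lam_m        # absorption coefficient
    e_ev = H_EV_S * C_M_S / lam_m                    # photon energy in eV
    y = np.sqrt(alpha * e_ev)                        # (alpha*h*nu)^(1/2)
    slope, intercept = np.polyfit(e_ev, y, 1)        # fit the linear edge
    return -intercept / slope                        # x-intercept = Eg

wl = np.array([400.0, 380.0, 360.0, 340.0])          # nm, near the edge
k = np.array([0.002, 0.004, 0.006, 0.008])           # extinction coefficients
print("estimated Eg = %.2f eV" % tauc_gap(wl, k))
```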

Relevance:

80.00%

Publisher:

Abstract:

It is widely considered that high pressure processing (HPP) results in better retention of micronutrients and phytochemicals compared to thermal pasteurization (TP), although some studies indicate that this may not be true in all cases. The aims of this study were (1) to objectively compare the effects of HPP under commercial processing conditions with those of TP on the stability of phenolic antioxidants in strawberries following processing and during storage, and (2) to evaluate the influence of varietal differences, and hence differences in the biochemical composition of strawberries, on the stability of phenolic antioxidants. Strawberry puree samples from the cultivars Camarosa, Rubygem, and Festival were subjected to HPP (600 MPa/20 °C/5 min) and TP (88 °C/2 min). The activities of oxidative enzymes were evaluated before and after processing. Furthermore, the antioxidant capacity (total phenolic content (TPC), oxygen radical absorbance capacity (ORAC), and ferric reducing antioxidant power (FRAP)) and individual anthocyanins (by HPLC) were determined prior to and following processing and after three months of refrigerated storage (4 °C). Depending on the cultivar, HPP caused 15–38% and 20–33% inactivation of polyphenol oxidase and peroxidase, respectively, compared to almost complete inactivation of these enzymes by TP. Significant decreases (p < 0.05) in ORAC, FRAP, TPC and anthocyanin contents were observed during processing and storage of both HPP and TP samples. Anthocyanins were the most affected, with only 19–25% retention after three months of refrigerated storage (4 °C). Slightly higher (p < 0.05) losses of TPC and antioxidant capacity were observed during storage of HPP samples compared to TP. Industrial relevance: The results of the study demonstrated that both high pressure processing and thermal pasteurization result in high retention of phenolic phytochemicals in strawberry products. Under the conditions investigated, high pressure processing did not result in better retention of phenolic phytochemicals compared to thermal pasteurization. In fact, slightly higher losses of total polyphenol content and antioxidant capacity were observed during refrigerated storage of HPP processed samples. Our results showed that high pressure processing may not always be a better alternative to thermal processing for strawberry puree if the main objective is better retention of phenolic antioxidants. However, it should be noted that other quality attributes, such as sensory properties, where distinct advantages of HPP are expected, were outside the scope of this study.

Relevance:

80.00%

Publisher:

Abstract:

The global importance of grasslands is indicated by their extent; they comprise some 26% of total land area and 80% of agriculturally productive land. The majority of grasslands are located in tropical developing countries, where they are particularly important to the livelihoods of some one billion poor people. Grasslands clearly provide the feed base for grazing livestock and thus numerous high-quality foods, but such livestock also provide products and services such as fertilizer, transport, traction, fibre and leather. In addition, grasslands provide important services and roles, including as water catchments, biodiversity reserves, sites for cultural and recreational needs, and potentially a carbon sink to alleviate greenhouse gas emissions. Inevitably, such functions may conflict with management for the production of livestock products. Much of the increasing global demand for meat and milk, particularly from developing countries, will have to be supplied from grassland ecosystems, and this will pose difficult challenges. Increased production of meat and milk generally requires increased intake of metabolizable energy, and thus increased voluntary intake and/or digestibility of the diets selected by grazing animals. These will require more widespread and effective application of improved management. Strategies to improve productivity include fertilizer application, grazing management, greater use of crop by-products, legumes and supplements, and manipulation of stocking rate and herbage allowance. However, it is often difficult to predict the efficiency and cost-effectiveness of such strategies, particularly in tropical developing-country production systems. Evaluation and ongoing adjustment of grazing systems require appropriate and reliable assessment criteria, but these are often lacking. A number of emerging technologies may contribute to the timely, low-cost acquisition of quantitative information to better understand soil-pasture-animal interactions and animal management in grassland systems. Developments in remote imaging of vegetation, global positioning technology, improved diet markers, near-IR spectroscopy and modelling provide improved tools for knowledge-based decisions on the productivity constraints of grazing animals. Individual electronic identification of animals offers opportunities for precision management on an individual animal basis for improved productivity. Improved outcomes in the form of livestock products, services and/or other benefits from grasslands should be possible, but clearly a diversity of solutions is needed for the vast range of environments and social circumstances of global grasslands.