953 results for emerging technology


Relevance: 70.00%

Abstract:

Nanoscience and technology (NST) are widely cited as the defining technology of the 21st century. In recent years, the debate surrounding NST has become increasingly public, with much of this interest stemming from two radically opposing long-term visions of an NST-enabled future: ‘nano-optimism’ and ‘nano-pessimism’. This paper demonstrates that NST is a complex and wide-ranging discipline, the future of which is characterised by uncertainty. It argues that consideration of the present-day issues surrounding NST is essential if the public debate is to move forward. In particular, the social constitution of an emerging technology is crucial if any meaningful discussion of costs and benefits is to be realised. An exploration of the social constitution of NST raises a number of issues, of which unintended consequences and the interests of those who own and control new technologies are highlighted.

Relevance: 70.00%

Abstract:

PowerPoint presentation giving an overview of the ebook readers trial based at Deakin University.

Relevance: 70.00%

Abstract:

This book focuses on network management and traffic engineering for Internet and distributed computing technologies, as well as presenting emerging technology trends and advanced platforms.

Relevance: 70.00%

Abstract:

What's known on the subject? And what does the study add? We have previously shown that percutaneous radiofrequency ablation guided by image-fusion technology allows for precise needle placement with real-time ultrasound superimposed on pre-loaded imaging, removing the need for real-time CT or MR guidance. Emerging technology also allows real-time tracking of a treatment needle within an organ in a virtually created 3D format. To our knowledge, this is the first study utilising a sophisticated ultrasound-based navigation system that uses both image-fusion and real-time probe-tracking technologies for in vivo renal ablative intervention.

Relevance: 70.00%

Abstract:

Previous research suggests that changing consumer and producer knowledge structures play a role in market evolution and that the sociocognitive processes of product markets are revealed in the sensemaking stories of market actors that are rebroadcast in commercial publications. In this article, the authors lend further support to the story-based nature of market sensemaking and the use of the sociocognitive approach in explaining the evolution of high-technology markets. They examine the content (i.e., subject matter or topic) and volume (i.e., the number) of market stories and the extent to which the content and volume of market stories evolve as a technology emerges. Data were obtained from a content analysis of 10,412 article abstracts, published in key trade journals, pertaining to Local Area Network (LAN) technologies and spanning the period 1981 to 2000. Hypotheses concerning the evolving nature (content and volume) of market stories in technology evolution are tested. The analysis identified four categories of market stories: technical, product availability, product adoption, and product discontinuation. The findings show that the emerging technology passes initially through a ‘technical-intensive’ phase, in which technology-related stories dominate; then through a ‘supply-push’ phase, in which stories presenting products embracing the technology tend to exceed technical stories while the number of product adoption stories rises; and finally to a ‘product-focus’ phase, with stories predominantly focusing on product availability. Overall story volume declines as a technology matures and the need for sensemaking reduces. When stories about product discontinuation surface, they signal the decline of the current technology. New technologies that fail to maintain the ‘product-focus’ stage also reflect limited market acceptance. The article also discusses the theoretical and managerial implications of the study's findings. © 2002 Elsevier Science Inc. All rights reserved.
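The authors' coding scheme is only summarised above; a minimal sketch of keyword-based story categorisation in that spirit follows, using the four categories named in the abstract (the keyword lists are purely illustrative assumptions, not the authors' actual scheme):

```python
# The four story categories identified in the study; the keyword lists
# below are illustrative assumptions, not the authors' coding scheme.
CATEGORIES = {
    "technical": ["standard", "protocol", "specification"],
    "product availability": ["launch", "release", "ships"],
    "product adoption": ["deploy", "install", "adopt"],
    "product discontinuation": ["discontinue", "withdraw", "end-of-life"],
}

def categorise(abstract):
    """Assign a story to the first category whose keywords appear."""
    text = abstract.lower()
    for category, keywords in CATEGORIES.items():
        if any(kw in text for kw in keywords):
            return category
    return "uncategorised"

label = categorise("Vendor ships new LAN adapter line")  # "product availability"
```

A real content analysis would of course use trained human coders or a validated classifier rather than a keyword match; the sketch only illustrates the category structure.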

Relevance: 70.00%

Abstract:

By law, Title I schools employ teachers who are both competent in their subject knowledge and State certified. In addition, Title I teachers receive ongoing professional development in technology integration and are equipped with the latest innovative resources for integrating technology in the classroom. The aim is higher academic achievement and the effective use of technology in the classroom. The investment in implementing technology in this large urban school district to improve student achievement has continued to increase. In order to infuse current and emerging technology throughout the curriculum, the district needs to know where teachers have, and have not, integrated technology; yet the level of technology integration in Title I schools is unknown. This study used the Digital-Age Survey Levels of Teaching Innovation (LoTi) to assess 508 Title I teachers’ technology integration levels across three major initiatives purchased by Title I: the iPads program, the Chromebook initiative, and the interactive whiteboards program. The study used a quantitative approach. Descriptive statistics, regression analysis, and statistical correlations were used to examine the relationship between the level of technology integration and the following dependent variables: personal computer use (PCU), current instructional practices (CIP), and levels of teaching innovation (LoTi). With this information, budgetary decisions and professional development can be tailored to meet the district's technology implementation needs. The study found a significant relationship between the level of teaching innovation, personal computer use, and current instructional practices among teachers who teach with an iPad, Chromebook, and/or interactive whiteboard. LoTi, PCU, and CIP scores increased with Title I teachers' years of experience, and there was also a significant relationship between teachers with 20 or more years of teaching experience and their LoTi score.
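The study's statistics are specific to its survey data, which is not reproduced here; the following is a minimal sketch of the kind of correlation test described, run on made-up rows (the data and variable names are illustrative, not the study's):

```python
import statistics as st

# Hypothetical survey rows: (years of teaching experience, LoTi score).
# These values are invented for illustration only.
years = [2, 5, 8, 12, 15, 20, 22, 25]
loti = [1.5, 2.0, 2.4, 3.1, 3.0, 3.8, 4.0, 4.2]

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two sequences."""
    mx, my = st.mean(x), st.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (st.pstdev(x) * st.pstdev(y) * len(x))

# A positive r consistent with "scores increased with years of experience"
r = pearson_r(years, loti)
```

In practice one would also report a p-value and run the regression models the abstract mentions; the sketch only shows the correlation step.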

Relevance: 60.00%

Abstract:

Actions Towards Sustainable Outcomes

Environmental Issues/Principal Impacts
The increasing urbanisation of cities brings with it several detrimental consequences:
• Significant energy use for heating and cooling many more buildings, leading to urban heat islands and increased greenhouse gas emissions.
• An increased amount of hard surfaces, which contributes not only to higher temperatures in cities but also to increased stormwater runoff.
• Degraded air quality and noise.
• Health and general well-being of people frequently compromised by inadequate indoor air quality.
• Reduced urban biodiversity.

Basic Strategies
In many design situations, boundaries and constraints limit the application of cutting EDGe actions. In these circumstances, designers should at least consider the following:
• Living walls are an emerging technology, and many Australian examples function more as internal feature walls. However, as understanding of the benefits and construction of living walls develops, this technology could form part of an exterior facade that enhances a building's thermal performance.
• Living walls should be designed to function with an irrigation system using non-potable water.

Cutting EDGe Strategies
• Living walls can be part of a design strategy that effectively improves the thermal performance of a building, thereby contributing to lower energy use and greenhouse gas emissions.
• Including living walls in the initial stages of design would provide greater flexibility to the design, especially of the facade, structural supports, mechanical ventilation and watering systems, thus lowering costs.
• Designing a building with an early understanding of living walls can greatly reduce maintenance costs.
• Including plant species and planting media able to remove air impurities could contribute to improved indoor air quality, workplace productivity and well-being.

Synergies and References
• Living walls are a key research topic at the Centre for Subtropical Design, Queensland University of Technology: http://www.subtropicaldesign.bee.qut.edu.au
• BEDP Environment Design Guide: DES 53: Roof and Facade Gardens
• BEDP Environment Design Guide: GEN 4: Positive Development – Designing for Net Positive Impacts (see green scaffolding and green space frame walls)
• Green Roofs Australia: www.greenroofs.wordpress.com
• Green Roofs for Healthy Cities USA: www.greenroofs.org

Relevance: 60.00%

Abstract:

This book focuses on practical applications for using adult and embryonic stem cells in the pharmaceutical development process. It emphasizes new technologies to help overcome the bottlenecks in developing stem cells as therapeutic agents. A key reference for professionals working in stem cell science, it presents the general principles and methodologies in stem cell research and covers topics such as derivatization and characterization of stem cells, stem cell culture and maintenance, stem cell engineering, applications of high-throughput screening, and stem cell genetic modification with their use for drug delivery.

Relevance: 60.00%

Abstract:

In 2005, Stephen Abram, vice president of Innovation at SirsiDynix, challenged library and information science (LIS) professionals to start becoming “librarian 2.0.” In the last few years, discussion and debate about the “core competencies” needed by librarian 2.0 have appeared in the “biblioblogosphere” (blogs written by LIS professionals). However, beyond these informal blog discussions few systematic and empirically based studies have taken place. This article will discuss a research project that fills this gap. Funded by the Australian Learning and Teaching Council, the project identifies the key skills, knowledge, and attributes required by “librarian 2.0.” Eighty-one members of the Australian LIS profession participated in a series of focus groups. Eight themes emerged as being critical to “librarian 2.0”: technology, communication, teamwork, user focus, business savvy, evidence based practice, learning and education, and personal traits. This article will provide a detailed discussion on each of these themes. The study’s findings also suggest that “librarian 2.0” is a state of mind, and that the Australian LIS profession is undergoing a significant shift in “attitude.”

Relevance: 60.00%

Abstract:

Cell-based therapies for bone regeneration are an exciting emerging technology, but the availability of osteogenic cells is limited and an ideal cell source has not been identified. Amniotic fluid-derived stem (AFS) cells and bone marrow-derived mesenchymal stem cells (MSCs) were compared to determine their osteogenic differentiation capacity in both 2D and 3D environments. In 2D culture, the AFS cells produced more mineralized matrix but delayed peaks in osteogenic markers. Cells were also cultured on 3D scaffolds constructed of poly-ε-caprolactone for 15 weeks. MSCs differentiated more quickly than AFS cells on 3D scaffolds, but mineralized matrix production slowed considerably after 5 weeks. In contrast, the rate of AFS cell mineralization continued to increase out to 15 weeks, at which time AFS constructs contained 5-fold more mineralized matrix than MSC constructs. Therefore, cell source should be taken into consideration when used for cell therapy, as the MSCs would be a good choice for immediate matrix production, but the AFS cells would continue robust mineralization for an extended period of time. This study demonstrates that stem cell source can dramatically influence the magnitude and rate of osteogenic differentiation in vitro.

Relevance: 60.00%

Abstract:

EMR (Electronic Medical Record) is an emerging technology that blends the IT and non-IT domains; one methodology for linking the two is to construct databases. Nowadays, an EMR supports patients before and after treatment and should satisfy all stakeholders, such as practitioners, nurses, researchers, administrators, financial departments and so on. For database maintenance, the DAS (Data as Service) model is one solution for outsourcing; however, there are scalability and strategy issues to address when planning to use the DAS model properly. We constructed three kinds of databases, scaling from 5K to 2560K records: plain-text, MS built-in encryption (an in-house model) and custom AES (Advanced Encryption Standard) under the DAS model. To make the custom AES-DAS perform better, we also devised a Bucket Index using a Bloom filter. The simulation showed that response times increased arithmetically at first but, beyond a certain threshold, increased exponentially. In conclusion, if the database model is close to the in-house model, vendor technology is a good way to obtain query response times in a consistent manner. If the model is the DAS model, the database is easy to outsource, and techniques such as the Bucket Index enhance its utilisation. Designing the database with consideration of field types is also important for faster query response times. This study suggests that cloud computing would be the next DAS model to satisfy the scalability and security issues.
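The abstract names a Bucket Index built with a Bloom filter but gives no implementation details; the following is a minimal sketch of the general idea, under the assumption that each bucket of records is tagged with a Bloom filter over its keys so that non-matching buckets can be skipped without scanning (or decrypting) them. All class and parameter names are illustrative:

```python
import hashlib

class BloomFilter:
    """Fixed-size bit array with k hash positions derived from SHA-256."""
    def __init__(self, size_bits=1024, num_hashes=4):
        self.size = size_bits
        self.k = num_hashes
        self.bits = 0  # an int used as a bit array

    def _positions(self, key):
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{key}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, key):
        for pos in self._positions(key):
            self.bits |= 1 << pos

    def might_contain(self, key):
        # False positives are possible; false negatives are not.
        return all(self.bits & (1 << pos) for pos in self._positions(key))

class BucketIndex:
    """Group records into buckets and tag each bucket with a Bloom filter,
    so a lookup only scans buckets that might hold the key."""
    def __init__(self, bucket_size=100):
        self.bucket_size = bucket_size
        self.buckets = []  # list of (BloomFilter, [(key, record), ...])

    def insert(self, key, record):
        if not self.buckets or len(self.buckets[-1][1]) >= self.bucket_size:
            self.buckets.append((BloomFilter(), []))
        bf, records = self.buckets[-1]
        bf.add(key)
        records.append((key, record))

    def lookup(self, key):
        for bf, records in self.buckets:
            if bf.might_contain(key):  # skip buckets that cannot match
                for k, rec in records:
                    if k == key:
                        return rec
        return None
```

In the paper's setting the per-bucket scan would involve decrypting AES-encrypted records, so pruning buckets via the filter is where the saving comes from; the sketch keeps records in plain Python objects for brevity.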

Relevance: 60.00%

Abstract:

Among various thermo-chemical conversion processes, pyrolysis is considered an emerging technology for liquid oil production. This study considers the conversion of biomass waste, in the form of plum seed, into pyrolysis oil in a fixed bed pyrolysis reactor. A fixed bed pyrolysis system has been designed and fabricated for obtaining liquid fuel from plum seeds. The major components of the system are the fixed bed pyrolysis reactor, a liquid condenser and liquid collectors. The plum seed in particle form is pyrolysed in an externally heated fixed bed reactor, 7.6 cm in diameter and 46 cm high, with nitrogen as the carrier gas. The reactor is heated by a cylindrical heater fuelled by a biomass source, from 400 °C to 600 °C. The products are oil, char and gas. The reactor bed temperature, running time and feed particle size are considered as process parameters, and all are found to influence the product yield significantly. A maximum liquid yield of 39 wt% of biomass feed is obtained with a particle size of 2.36-4.75 mm at a reactor bed temperature of 520 °C with a running time of 120 minutes. The pyrolysis oil obtained at these optimum process conditions is analysed for fuel properties and compared with other biomass-derived pyrolysis oils and conventional fuels. The oil is found to possess a favourable flash point and reasonable density and viscosity. Its higher calorific value is found to be 22.39 MJ/kg, which is higher than that of other biomass-derived pyrolysis oils.

Relevance: 60.00%

Abstract:

Cognitive radio is an emerging technology proposing the concept of dynamic spectrum access as a solution to the looming problem of spectrum scarcity caused by the growth in wireless communication systems. Under the proposed concept, non-licensed, secondary users (SU) can access spectrum owned by licensed, primary users (PU) so long as interference to the PU is kept minimal. Spectrum sensing is a crucial task in cognitive radio whereby the SU senses the spectrum to detect the presence or absence of any PU signal. Conventional spectrum sensing assumes the PU signal is ‘stationary’, remaining in the same activity state during the sensing cycle, while an emerging trend models the PU as ‘non-stationary’, undergoing state changes. Existing studies have focused on the non-stationary PU during the transmission period; very little research has considered the impact on spectrum sensing when the PU is non-stationary during the sensing period. The concept of PU duty cycle is developed as a tool to analyse the performance of spectrum sensing detectors when detecting non-stationary PU signals, and new detectors are proposed to optimise detection with respect to the duty cycle exhibited by the PU. This research consists of two major investigations. The first investigates the impact of duty cycle on the performance of existing detectors and the extent of the problem in existing studies. The second develops new detection models and frameworks to ensure the integrity of spectrum sensing when detecting non-stationary PU signals. The first investigation demonstrates that the conventional signal model formulated for a stationary PU does not accurately reflect the behaviour of a non-stationary PU; consequently, the performance assumed to be achievable by the conventional detector does not reflect the performance actually achieved. Through analysing the statistical properties of duty cycle, performance degradation is shown to be a problem that cannot be neglected in existing sensing studies when the PU is modelled as non-stationary.

The second investigation presents detectors that are aware of the duty cycle exhibited by a non-stationary PU. A two-stage detection model is proposed to improve detection performance and robustness to changes in duty cycle; this detector is most suitable for applications that require long sensing periods. A second detector, the duty cycle based energy detector, is formulated by integrating the distribution of duty cycle into the test statistic of the energy detector, and is suitable for short sensing periods. The decision threshold is optimised with respect to the traffic model of the PU, so the proposed detector can calculate average detection performance that reflects realistic results. A detection framework for spectrum sensing optimisation is proposed to provide clear guidance on the constraints on the sensing and detection model. Following this framework ensures that the signal model accurately reflects practical behaviour while the detection model implemented suits the desired detection assumption. Based on this framework, a spectrum sensing optimisation algorithm is further developed to maximise sensing efficiency for a non-stationary PU. New optimisation constraints are derived to account for any PU state changes within the sensing cycle while implementing the proposed duty cycle based detector.
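The duty cycle based detector itself is not specified in the abstract; as background, here is a minimal sketch of the conventional energy detector it extends: the average sample energy, normalised by the noise power, is compared against a threshold. The signal model, noise power and threshold below are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def energy_detect(samples, noise_power, threshold):
    """Classical energy detector: declare the primary user (PU)
    present when the normalised average sample energy exceeds a threshold."""
    test_statistic = np.mean(np.abs(samples) ** 2) / noise_power
    return bool(test_statistic > threshold)

rng = np.random.default_rng(0)
n = 1000
# Unit-power complex Gaussian noise
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
# Illustrative PU signal: a complex tone with power 0.64
pu_signal = 0.8 * np.exp(2j * np.pi * 0.1 * np.arange(n))

# The threshold here is an arbitrary illustrative choice; in practice it is
# set from the noise statistics for a target false-alarm probability.
occupied = energy_detect(noise + pu_signal, noise_power=1.0, threshold=1.2)
idle = energy_detect(noise, noise_power=1.0, threshold=1.2)
```

A non-stationary PU that switches off partway through the sensing window would contribute energy for only a fraction of the samples, shrinking the test statistic; this is the duty-cycle effect that motivates the thesis's modified detectors.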

Relevance: 60.00%

Abstract:

Building information modeling (BIM) is an emerging technology and process that provides rich and intelligent design information models of a facility, enabling enhanced communication, coordination, analysis, and quality control throughout all phases of a building project. Although there are many documented benefits of BIM for construction, identifying essential construction-specific information out of a BIM in an efficient and meaningful way is still a challenging task. This paper presents a framework that combines feature-based modeling and query processing to leverage BIM for construction. The feature-based modeling representation implemented enriches a BIM by representing construction-specific design features relevant to different construction management (CM) functions. The query processing implemented allows for increased flexibility to specify queries and rapidly generate the desired view from a given BIM according to the varied requirements of a specific practitioner or domain. Central to the framework is the formalization of construction domain knowledge in the form of a feature ontology and query specifications. The implementation of our framework enables the automatic extraction and querying of a wide range of design conditions that are relevant to construction practitioners. The validation studies conducted demonstrate that our approach is significantly more effective than existing solutions. The research described in this paper has the potential to improve the efficiency and effectiveness of decision-making processes in different CM functions.
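The paper's feature ontology and query specifications are not reproduced in the abstract; the following is a minimal sketch of the general pattern it describes: elements of a model are enriched with construction-specific features, and a query generates the view one CM function needs. All element and feature names below are hypothetical:

```python
# Hypothetical, simplified stand-in for a feature-enriched BIM: each
# element carries construction-specific features alongside its identity.
elements = [
    {"id": "W1", "type": "Wall",   "features": {"pour_sequence": 1, "formwork": "modular"}},
    {"id": "W2", "type": "Wall",   "features": {"pour_sequence": 2, "formwork": "custom"}},
    {"id": "C1", "type": "Column", "features": {"pour_sequence": 1, "formwork": "modular"}},
]

def query(model, **conditions):
    """Return elements whose features satisfy every condition,
    mimicking a view generated for one CM function."""
    return [
        e for e in model
        if all(e["features"].get(k) == v for k, v in conditions.items())
    ]

# View for a concrete-pour planner: everything scheduled in the first pour.
first_pour = query(elements, pour_sequence=1)
```

A real implementation would query IFC or a proprietary BIM schema through a formal ontology rather than dictionaries, but the extract-by-feature pattern is the same.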