852 results for technology-enhanced assessment
Abstract:
Technology is increasingly infiltrating all aspects of our lives, and the rapid uptake of devices that live near, on or in our bodies is facilitating radical new ways of working, relating and socialising. This distribution of technology into the very fabric of our everyday life creates new possibilities, but also raises questions regarding our future relationship with data and the quantified self. By embedding technology into the fabric of our clothes and accessories, it becomes ‘wearable’. Such ‘wearables’ enable the acquisition of, and connection to, vast amounts of data about people and environments in order to provide life-augmenting levels of interactivity. Wearable sensors, for example, offer the potential for significant benefits in the future management of our wellbeing. Fitness trackers such as ‘Fitbit’ and ‘Garmin’ give wearers the ability to monitor their personal fitness indicators, while other wearables provide healthcare professionals with information that improves diagnosis. While the rapid uptake of wearables may offer unique and innovative opportunities, there are also concerns surrounding the high levels of data sharing that come as a consequence of these technologies. As more ‘smart’ devices connect to the Internet, and as connectivity becomes increasingly available (e.g. via Wi-Fi, Bluetooth), more products, artefacts and things are becoming interconnected. This digital connection of devices is called the ‘Internet of Things’ (IoT). The IoT is spreading rapidly, with many traditionally non-online devices becoming increasingly connected: products such as mobile phones, fridges, pedometers, coffee machines, video cameras, cars and clothing. It is growing at a rapid rate, with estimates indicating that by 2020 there will be over 25 billion connected things globally. As the number of devices connected to the Internet increases, so too does the amount of data collected and the type of information that is stored and potentially shared. The ability to collect massive amounts of data - known as ‘big data’ - can be used to better understand and predict behaviours across all areas of research, from the societal and economic to the environmental and biological. With this kind of information at our disposal, we have a more powerful lens with which to perceive the world, and the resulting insights can be used to design more appropriate products, services and systems. It can, however, also be used as a method of surveillance, suppression and coercion by governments or large organisations. This is becoming particularly apparent in advertising that targets audiences based on the individual preferences revealed by the data collected from social media and online devices such as GPS systems or pedometers. This type of technology also provides fertile ground for public debate around future fashion, identity and broader social issues such as culture, politics and the environment. The potential implications of these types of technological interaction via wearables, through and with the IoT, have never been more real or more accessible. But, as highlighted, this interconnectedness also brings with it complex technical, ethical and moral challenges. Data security and the protection of privacy and personal information will become ever more present in the ethical and moral debates of the 21st century. This type of technology is also a stepping-stone to a future that includes implantable technology, biotechnologies, interspecies communication and augmented humans (cyborgs).
Technologies that live symbiotically and perpetually in our bodies, the built environment and the natural environment are no longer the stuff of science fiction; they are in fact a reality. So, where next?... The works exhibited in Wear Next_ provide a snapshot of the broad spectrum of wearables in design and in development internationally. This exhibition has been curated to serve as a platform for broader debate around future technology, our mediated future-selves and the evolution of human interactions. As you explore the exhibition, may we ask that you pause and think to yourself: what might we... Wear Next_?

WEARNEXT ONLINE LISTINGS AND MEDIA COVERAGE:
http://indulgemagazine.net/wear-next/
http://www.weekendnotes.com/wear-next-exhibition-gallery-artisan/
http://concreteplayground.com/brisbane/event/wear-next_/
http://www.nationalcraftinitiative.com.au/news_and_events/event/48/wear-next
http://bneart.com/whats-on/wear-next_/
http://creativelysould.tumblr.com/post/124899079611/creative-weekend-art-edition
http://www.abc.net.au/radionational/programs/breakfast/smartly-dressed-the-future-of-wearable-technology/6744374
http://couriermail.newspaperdirect.com/epaper/viewer.aspx

RADIO COVERAGE:
http://www.abc.net.au/radionational/programs/breakfast/wear-next-exhibition-whats-next-for-wearable-technology/6745986

TELEVISION COVERAGE:
http://www.abc.net.au/radionational/programs/breakfast/wear-next-exhibition-whats-next-for-wearable-technology/6745986
https://au.news.yahoo.com/video/watch/29439742/how-you-could-soon-be-wearing-smart-clothes/#page1
Abstract:
A compact model for the noise margin (NM) of single-electron transistor (SET) logic is developed, which is a function of device capacitances and background charge (ζ). Noise margin is then used as a metric to evaluate the robustness of SET logic against background charge, temperature, and variation of the SET gate and tunnel junction capacitances (C_G and C_T). It is shown that choosing α = C_T/C_G = 1/3 maximizes the NM. An estimate of the maximum tolerable ζ is shown to be equal to ±0.03 e. Finally, the effect of mismatch in device parameters on the NM is studied through exhaustive simulations, which indicates that α ∈ [0.3, 0.4] provides maximum robustness. It is also observed that mismatch can have a significant impact on static power dissipation.
Abstract:
Pricing is an effective tool to control congestion and achieve quality of service (QoS) provisioning for multiple differentiated levels of service. In this paper, we consider the problem of pricing for congestion control in a network of nodes with a single service class and multiple queues, and present a multi-layered pricing scheme. We propose an algorithm for finding the optimal state-dependent price levels for the individual queues at each node. The pricing policy depends on a weighted average queue length at each node; this helps to reduce frequent price variations and is in the spirit of the random early detection (RED) mechanism used in TCP/IP networks. Our numerical results show a considerable improvement over a recently proposed related scheme in terms of both throughput and delay. In particular, our approach exhibits a throughput improvement in the range of 34 to 69 percent over that scheme in all cases studied (over all routes).
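The abstract describes a RED-style smoothing of the queue length that drives state-dependent, multi-level prices, but does not give the algorithm itself. The sketch below is a minimal, assumed illustration of that idea only; the class, parameter names, weight and thresholds are hypothetical, not the authors' scheme.

```python
# Minimal sketch (assumed, not the authors' algorithm): an exponentially weighted
# average of the queue length, as in RED, selects one of several price levels at a node.
class NodePricer:
    def __init__(self, price_levels, thresholds, weight=0.02):
        # price_levels[i] applies when the averaged queue length falls below thresholds[i];
        # the last price level applies above the highest threshold.
        assert len(price_levels) == len(thresholds) + 1
        self.price_levels = price_levels
        self.thresholds = thresholds      # ascending queue-length breakpoints
        self.weight = weight              # EWMA weight, as in RED
        self.avg_qlen = 0.0

    def update(self, instantaneous_qlen):
        # Smooth the instantaneous queue length to avoid frequent price changes.
        self.avg_qlen = (1 - self.weight) * self.avg_qlen + self.weight * instantaneous_qlen
        return self.current_price()

    def current_price(self):
        for i, threshold in enumerate(self.thresholds):
            if self.avg_qlen < threshold:
                return self.price_levels[i]
        return self.price_levels[-1]


# Example: three price levels per queue, switching on the smoothed queue length.
pricer = NodePricer(price_levels=[1.0, 2.0, 4.0], thresholds=[10.0, 30.0])
for qlen in [5, 12, 40, 35, 8]:
    price = pricer.update(qlen)
    print(f"avg={pricer.avg_qlen:.2f} price={price}")
```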
Abstract:
Enhanced scan design can significantly improve the fault coverage of two-pattern delay tests, but at the cost of exorbitantly high area overhead. The redundant flip-flops introduced in the scan chains have traditionally been used only to launch the two-pattern delay test inputs, not to capture test results. This paper presents a new, much lower-cost partial enhanced scan methodology with both improved controllability and observability. Facilitating observation of some hard-to-observe internal nodes, by capturing their responses in the already available and underutilized redundant flip-flops, improves delay fault coverage at minimal or almost negligible cost. Experimental results on ISCAS'89 benchmark circuits show significant improvement in transition delay fault (TDF) coverage for this new partial enhanced scan methodology.
Abstract:
This research constructed a readability measure for French speakers learning English as a second language. It used true cognates - words that have similar form and meaning in both languages - as an indicator of the difficulty of an English text for French readers. A multilingual lexical resource is used to detect true cognates in text, and statistical language modelling is used to predict the readability level. The proposed enhanced statistical language model improves the accuracy of readability predictions for French speakers by up to 10% compared with state-of-the-art approaches. The outcome of this study could accelerate the learning process for French speakers who are studying English. More importantly, the study also benefits the readability-estimation research community by presenting an approach and evaluation at sentence level and by innovating with the use of cognates as a new text feature.
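The abstract does not detail how cognates are detected or turned into a text feature. The sketch below is a hypothetical illustration only: it stands in a toy French word list and a simple orthographic-similarity heuristic for the multilingual lexical resource, and uses cognate density per sentence as the readability feature; all names and the threshold are assumptions.

```python
# Illustrative sketch (not the study's pipeline): flag likely French-English cognates
# with an orthographic-similarity heuristic and compute their density per sentence.
from difflib import SequenceMatcher

FRENCH_LEXICON = {"important", "nation", "table", "animal", "train"}  # toy word list

def is_likely_cognate(english_word, threshold=0.85):
    # Treat a word as a cognate candidate if it is orthographically close to a French entry.
    word = english_word.lower()
    return any(SequenceMatcher(None, word, fr).ratio() >= threshold for fr in FRENCH_LEXICON)

def cognate_density(sentence):
    # Fraction of tokens in the sentence that look like true cognates.
    tokens = [t.strip(".,;!?").lower() for t in sentence.split()]
    tokens = [t for t in tokens if t]
    if not tokens:
        return 0.0
    return sum(is_likely_cognate(t) for t in tokens) / len(tokens)

print(cognate_density("The nation built an important train station."))
```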
Abstract:
The Social Water Assessment Protocol (SWAP) is a tool consisting of a series of questions on fourteen themes designed to capture the social context of water around a mine site. A pilot study of the SWAP, conducted in Prestea-Huni Valley, Ghana, showed that some communities were concerned about whether the groundwater was potable. The mining company's concern, in turn, was that a cycle of dependency had developed among communities that received treated water from it. The pilot identified potential data sources and stakeholder groups for each theme, identified gaps in the themes, and suggested refinements to questions to improve the SWAP.
Abstract:
Agricultural pests are responsible for millions of dollars in crop losses and management costs every year. In order to implement optimal site-specific treatments and reduce control costs, new methods to accurately monitor and assess pest damage need to be investigated. In this paper we explore the combination of unmanned aerial vehicles (UAV), remote sensing and machine learning techniques as a promising technology to address this challenge. The deployment of UAVs as a sensor platform is a rapidly growing field of study for biosecurity and precision agriculture applications. In this experiment, a data collection campaign is performed over a sorghum crop severely damaged by white grubs (Coleoptera: Scarabaeidae). The larvae of these scarab beetles feed on the roots of plants, which in turn impairs root exploration of the soil profile. In the field, crop health status could be classified into three levels: bare soil where plants were decimated, transition zones of reduced plant density, and healthy canopy areas. In this study, we describe the UAV platform deployed to collect high-resolution RGB imagery as well as the image processing pipeline implemented to create an orthoimage. An unsupervised machine learning approach is formulated in order to create a meaningful partition of the image into each of the crop health levels. The aim of the approach is to simplify the image analysis step by minimizing user input requirements and avoiding the manual data labeling necessary in supervised learning approaches. The implemented algorithm is based on K-means clustering. In order to control high-frequency components present in the feature space, a neighbourhood-oriented parameter is introduced by applying Gaussian convolution kernels prior to K-means. The outcome of this approach is a soft K-means algorithm similar to the EM algorithm for Gaussian mixture models. The results show the algorithm delivers decision boundaries that consistently classify the field into three clusters, one for each crop health level. The methodology presented in this paper represents an avenue for further research towards automated crop damage assessments and biosecurity surveillance.
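The core of the described pipeline — Gaussian smoothing of the orthoimage bands followed by K-means with three clusters — can be sketched in a few lines. The snippet below is a minimal illustration under assumed settings: the filename, the smoothing sigma and the use of plain hard K-means (rather than the paper's soft variant) are assumptions, not the authors' exact implementation.

```python
# Minimal sketch of the unsupervised pipeline described above: Gaussian-smooth the RGB
# bands of the orthoimage to suppress high-frequency detail, then cluster pixels with
# K-means into three crop-health classes (bare soil / transition / healthy canopy).
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.cluster import KMeans
from imageio.v2 import imread

ortho = imread("orthoimage.png").astype(float)       # hypothetical H x W x 3 orthoimage

# Apply a Gaussian convolution kernel to each band before clustering (sigma assumed).
smoothed = np.stack(
    [gaussian_filter(ortho[..., band], sigma=5) for band in range(3)], axis=-1
)

pixels = smoothed.reshape(-1, 3)                     # one RGB feature vector per pixel
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pixels)
cluster_map = labels.reshape(ortho.shape[:2])        # per-pixel crop-health cluster index
```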
Abstract:
There is on-going international interest in the relationships between assessment instruments, students’ understanding of science concepts and context-based curriculum approaches. This study extends earlier research showing that students can develop connections between contexts and concepts – called fluid transitions – when studying context-based courses. We provide an in-depth investigation of one student’s experiences with multiple contextual assessment instruments that were associated with a context-based course. We analyzed the student’s responses to context-based assessment instruments to determine the extent to which contextual tests, reports of field investigations, and extended experimental investigations afforded her opportunities to make connections between contexts and concepts. A system of categorizing student responses was developed that can inform other educators when analyzing student responses to contextual assessment. We also refine the theoretical construct of fluid transitions that informed the study initially. Implications for curriculum and assessment design are provided in light of the findings.
Abstract:
Nanotechnology is a new technology that is generating a great deal of interest among academics, practitioners and scientists, and critical research is being carried out in this area all over the world. Governments are creating policy initiatives to promote developments in nanoscale science and technology, and private investment is also on a rising trend. A large number of academic institutions and national laboratories have set up research centres working on the multiple applications of nanotechnology. A wide range of applications is claimed for nanotechnology, from materials, chemicals, textiles and semiconductors to wonder-drug delivery systems and diagnostics. Nanotechnology is considered to be the next big wave of technology after information technology and biotechnology; in fact, it holds the promise of advances that exceed those achieved in recent decades in computers and biotechnology. Much of the interest in nanotechnology also stems from the enormous monetary benefits expected from nanotechnology-based products. According to the NSF, revenues from nanotechnology could touch $1 trillion by 2015. However, much of these benefits are projections. Realizing the claimed benefits requires the successful development of nanoscience and nanotechnology research efforts; that is, the journey from invention to innovation has to be completed. For this to happen, the technology has to flow from laboratory to market, and nanoscience and nanotechnology research efforts have to result in new products, new processes and new platforms. India has also started its Nanoscience and Nanotechnology development programme under its 10th Five Year Plan, and funds worth Rs 1 billion have been allocated for Nanoscience and Nanotechnology research and development. The aim of this paper is to assess Nanoscience and Nanotechnology initiatives in India. We propose a conceptual model derived from the resource-based view of innovation and have developed a structured questionnaire to measure the constructs in the conceptual model. Responses were collected from 115 scientists and engineers working in the field of Nanoscience and Nanotechnology and were analyzed using Principal Component Analysis, Cluster Analysis and Regression Analysis.
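The analysis sequence named at the end of the abstract (Principal Component Analysis, then cluster analysis, then regression) can be illustrated generically. The sketch below runs that sequence on a synthetic Likert-style response matrix; the data, the number of components and clusters, and the choice of outcome column are all placeholders, not the study's constructs or results.

```python
# Illustrative sketch only: PCA -> cluster analysis -> regression on a synthetic
# 115-respondent survey matrix, mirroring the analysis steps named in the abstract.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(115, 20)).astype(float)   # 115 respondents x 20 items

components = PCA(n_components=4).fit_transform(responses)      # extract latent components
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(components)

outcome = responses[:, -1]                                      # placeholder outcome item
model = LinearRegression().fit(components, outcome)             # regress outcome on components
print(clusters[:10], model.coef_, model.score(components, outcome))
```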
Abstract:
One of the foremost design considerations in microelectronics miniaturization is the use of embedded passives, which provide a practical solution. In a typical circuit, over 80 percent of the electronic components are passives such as resistors, inductors and capacitors, which can take up almost 50 percent of the entire printed circuit board area. By integrating passive components within the substrate instead of on the surface, embedded passives reduce the system real estate, eliminate the need for discrete components and their assembly, enhance electrical performance and reliability, and potentially reduce the overall cost. Moreover, the technology is lead-free. Even with these advantages, embedded passive technology is at a relatively immature stage, and more characterization and optimization are needed for the practical applications that will lead to its commercialization. This paper presents the entire process, from design and fabrication to electrical characterization and reliability testing, of embedded passives on a multilayered microvia organic substrate. Two test vehicles, focusing on resistors and capacitors respectively, have been designed and fabricated. The embedded capacitors in this study are made with a polymer/ceramic nanocomposite (BaTiO3) material to take advantage of the low processing temperature of polymers and the relatively high dielectric constant of ceramics; the values of these capacitors range from 50 pF to 1.5 nF, with a capacitance per area of approximately 1.5 nF/cm². Limited high-frequency measurement of these capacitors was performed. Furthermore, reliability assessments comprising thermal shock and temperature-humidity tests based on JEDEC standards were carried out. The resistors used in this work were of three types: 1) carbon-ink-based polymer thick film (PTF), 2) resistor foils with known sheet resistivities, laminated to the printed wiring board (PWB) during a sequential build-up (SBU) process, and 3) thin-film resistors plated by an electroless method. The realization of embedded resistors on conventional board-level high-loss epoxy (loss tangent of ~0.015 at 1 GHz) and on the proposed low-loss BCB dielectric (~0.0008 at >40 GHz) has been explored in this study. Ni-P and Ni-W-P alloys were plated using conventional electroless plating, and NiCr and NiCrAlSi foils were used for the foil transfer process. For the first time, benzocyclobutene (BCB) has been proposed as a board-level dielectric for an advanced System-on-Package (SOP) module, primarily because of its attractive low-loss (for RF applications) and thin-film (for high-density wiring) properties. Although embedded passives are more reliable because they eliminate solder-joint interconnects, they also introduce other concerns such as cracks, delamination and component instability. More layers may be needed to accommodate the embedded passives, and the various materials within the substrate may cause significant thermo-mechanical stress due to coefficient of thermal expansion (CTE) mismatch. In this work, numerical models of embedded capacitors have been developed to qualitatively examine the effects of process conditions and the changes in electrical performance due to thermo-mechanical deformations. A prototype working product with a board-level design including embedded resistors and capacitors is also underway, and preliminary results are presented.
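The reported areal capacitance of roughly 1.5 nF/cm² can be sanity-checked with the standard parallel-plate relation C/A = ε₀εᵣ/d. The snippet below does this back-of-the-envelope check with assumed values for the nanocomposite permittivity and dielectric thickness; these are illustrative numbers, not the paper's measured material parameters.

```python
# Back-of-the-envelope check (assumed values, not the paper's measurements):
# for a parallel-plate embedded capacitor, C/A = eps0 * eps_r / d.
EPS0 = 8.854e-12          # vacuum permittivity, F/m
eps_r = 20.0              # assumed BaTiO3/polymer nanocomposite relative permittivity
thickness = 12e-6         # assumed dielectric thickness, metres

c_per_m2 = EPS0 * eps_r / thickness          # areal capacitance, F per square metre
c_per_cm2 = c_per_m2 * 1e-4                  # convert to F per square centimetre
print(f"{c_per_cm2 * 1e9:.2f} nF/cm^2")      # ~1.48 nF/cm^2, near the reported ~1.5 nF/cm^2
```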
Abstract:
Human activities extract and displace different substances and materials from the Earth's crust, thus causing various environmental problems, such as climate change, acidification and eutrophication. As problems have become more complicated, more holistic measures that consider the origins and sources of pollutants have been called for. Industrial ecology is a field of science that forms a comprehensive framework for studying the interactions between the modern technological society and the environment. Industrial ecology considers humans and their technologies to be part of the natural environment, not separate from it. Industrial operations form natural systems that must also function as such within the constraints set by the biosphere. Industrial symbiosis (IS) is a central concept of industrial ecology. Industrial symbiosis studies look at the physical flows of materials and energy in local industrial systems. In an ideal IS, waste material and energy are exchanged by the actors of the system, thereby reducing the consumption of virgin material and energy inputs and the generation of waste and emissions. Companies are seen as part of the chains of suppliers and consumers that resemble those of natural ecosystems. The aim of this study was to analyse the environmental performance of an industrial symbiosis based on pulp and paper production, taking into account life cycle impacts as well. Life Cycle Assessment (LCA) is a tool for quantitatively and systematically evaluating the environmental aspects of a product, technology or service throughout its whole life cycle. Moreover, the Natural Step Sustainability Principles formed a conceptual framework for assessing the environmental performance of the case study symbiosis (Paper I). The environmental performance of the case study symbiosis was compared to four counterfactual reference scenarios in which the actors of the symbiosis operated on their own. The research methods used were process-based life cycle assessment (LCA) (Papers II and III) and hybrid LCA, which combines both process and input-output LCA (Paper IV). The results showed that the environmental impacts caused by the extraction and processing of the materials and the energy used by the symbiosis were considerable. If only the direct emissions and resource use of the symbiosis had been considered, less than half of the total environmental impacts of the system would have been taken into account. When the results were compared with the counterfactual reference scenarios, the net environmental impacts of the symbiosis were smaller than those of the reference scenarios. The reduction in environmental impacts was mainly due to changes in the way energy was produced. However, the results are sensitive to the way the reference scenarios are defined. LCA is a useful tool for assessing the overall environmental performance of industrial symbioses. It is recommended that in addition to the direct effects, the upstream impacts should be taken into account as well when assessing the environmental performance of industrial symbioses. Industrial symbiosis should be seen as part of the process of improving the environmental performance of a system. In some cases, it may be more efficient, from an environmental point of view, to focus on supply chain management instead.
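Process-based LCA of the kind used here is commonly expressed in matrix form: a scaling vector s solves A s = f for the functional unit f, and the life-cycle inventory is g = B s, where A is the technology matrix and B the environmental intervention matrix. The sketch below illustrates only that generic calculation with toy numbers; the two-process system and its coefficients are assumptions, not the case-study data.

```python
# Minimal sketch of the matrix form of process-based LCA (toy numbers, not the case study).
import numpy as np

# Technology matrix A: columns = processes (pulp mill, energy plant),
# rows = products (pulp, energy). Negative entries are inputs consumed.
A = np.array([[ 1.0, 0.0],     # pulp output per unit operation of each process
              [-0.5, 1.0]])    # pulp mill consumes 0.5 energy; energy plant makes 1 energy

# Intervention matrix B: CO2 emitted per unit operation of each process (toy values).
B = np.array([[0.2, 1.0]])

f = np.array([1.0, 0.0])       # functional unit: 1 unit of pulp, no net external energy demand
s = np.linalg.solve(A, f)      # required operation level of each process
g = B @ s                      # life-cycle CO2 inventory for the functional unit
print(s, g)                    # [1.0, 0.5] and [0.7]
```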
Abstract:
In the context of health care, information technology (IT) has an important role in the operational infrastructure, ranging from business management to patient care. An essential part of the system is medication management in inpatient and outpatient care. Community pharmacists' strategy has been to extend practice responsibilities beyond dispensing towards patient care services. Few studies have evaluated the strategic development of IT systems to support this vision. The objectives of this study were to assess and compare independent Finnish community pharmacy owners' and staff pharmacists' priorities concerning the content and structure of the next generation of community pharmacy IT systems; to explore international experts' visions and strategic views on IT development needs in relation to services provided in community pharmacies; to identify IT innovations facilitating patient care services and evaluate their development and implementation processes; and to assess community pharmacists' readiness to adopt innovations. The study applied both qualitative and quantitative methods. A qualitative personal interview of 14 experts in community pharmacy services and related IT from eight countries was conducted, together with a national survey of Finnish community pharmacy owners (mail survey, response rate 53%, n=308) and of a representative sample of staff pharmacists (online survey, response rate 22%, n=373). Finnish independent community pharmacy owners gave priority to logistical functions but also to those related to medication information and patient care. Managers and staff pharmacists have different views of the importance of IT features, reflecting their different professional duties in the community pharmacy; this indicates the need to involve different occupational groups in planning new IT systems for community pharmacies. A majority of the international experts shared the vision of community pharmacy adopting a patient care orientation, supported by IT-based documentation, new technological solutions, access to information, and shared patient data. Community pharmacy IT innovations were rare, which is paradoxical because owners' and staff pharmacists' perception of their own innovativeness was high. Community pharmacy IT system development processes usually had not undergone systematic needs-assessment research beforehand or evaluation after implementation, and they were most often coordinated by national governments without subsequent commercialization. In particular, community pharmacy IT developments lack research, organization, leadership and user involvement. Those responsible for IT development in the community pharmacy sector should create long-term IT development strategies that are in line with community pharmacy service development strategies. This could provide systematic guidance for future projects to ensure that potential innovations are based on a sufficient understanding of the pharmacy practice problems they are intended to solve, to encourage strong leadership in research and the development of innovations so that community pharmacists' potential innovativeness is used, and to ensure that professional needs and strategic priorities are considered even if the development process is led by those outside the profession.
Abstract:
This paper describes the implementation of wireless mesh nodes based on the IEEE 802.11s draft, where the motivation is to build a real-life mesh network. The mesh nodes developed have mesh, mesh access point and mesh portal functionalities simultaneously. The nodes use different radios for the mesh and access point functionalities, thus giving better service to client stations. Both the reactive and proactive modes of the Hybrid Wireless Mesh Protocol (HWMP) are supported. The paper also suggests some measures to enhance the performance of the overall network by reducing the number of path request (PREQ) frames.
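The abstract does not specify which PREQ-reduction measures are proposed. One generic way to curb PREQ flooding in an HWMP-style mesh is to suppress re-broadcast of path requests already seen with an equal or newer sequence number and no better path metric; the sketch below illustrates only that standard duplicate-suppression idea, with assumed field and class names, and is not the authors' mechanism.

```python
# Illustrative sketch (assumed, not the paper's measures): suppress re-forwarding of
# duplicate or stale PREQ frames to reduce flooding in an HWMP-style mesh.
class PreqSuppressor:
    def __init__(self):
        # (originator, target) -> (best sequence number seen, best path metric seen)
        self.seen = {}

    def should_forward(self, originator, target, seq_num, metric):
        key = (originator, target)
        best = self.seen.get(key)
        if best is not None:
            best_seq, best_metric = best
            # Drop stale or non-improving duplicates instead of re-flooding them.
            if seq_num < best_seq or (seq_num == best_seq and metric >= best_metric):
                return False
        self.seen[key] = (seq_num, metric)
        return True


suppressor = PreqSuppressor()
print(suppressor.should_forward("A", "D", seq_num=7, metric=30))   # True  (first copy)
print(suppressor.should_forward("A", "D", seq_num=7, metric=45))   # False (worse duplicate)
print(suppressor.should_forward("A", "D", seq_num=8, metric=50))   # True  (fresher request)
```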
Abstract:
Design, analysis and technology for the integrity enhancement of damaged or under-designed structures continue to be an engineering challenge. Bonded composite patch repair of metallic structures has received increased attention in recent years; it offers various advantages over riveted doublers, particularly for airframe repairs. This paper presents an experimental investigation of the residual strength and fatigue crack-growth life of an edge-cracked aluminium specimen repaired with a glass-epoxy composite patch. The investigation begins with an evaluation of three different surface treatments from a bond-strength viewpoint. A simple rule-of-thumb formula is employed to estimate the patch size. Cracked and repaired specimens are tested under static and fatigue loading. The patch appears to restore the original strength of the undamaged specimen and to enhance the fatigue crack-growth life by an order of magnitude. (C) 1999 Elsevier Science Ltd. All rights reserved.
Abstract:
A systematic approach is developed for the scaling analysis of the momentum, heat and species conservation equations pertaining to the solidification of a binary mixture. The problem formulation and the description of the boundary conditions are kept fairly general, so that a large class of problems can be addressed. Analysis of the momentum equations, coupled with phase-change considerations, leads to the establishment of an advection velocity scale. Analysis of the energy equation leads to an estimate of the solid layer thickness. Different regimes corresponding to different dominant modes of transport are identified simultaneously. A comparative study involving several possible thermal boundary conditions is also performed. Finally, a scaling analysis of the species conservation equation is carried out, revealing the effect of a non-equilibrium solidification model on solute segregation and species distribution. It is shown that non-equilibrium effects result in enhanced macrosegregation compared with the equilibrium model. To assess the scaling analysis, the predictions are validated against corresponding computational results.
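As a point of reference for the solid-layer-thickness estimate mentioned above (and not the paper's full, convection-coupled result), the classical conduction-dominated Stefan-type balance gives the familiar square-root-of-time growth scale:

```latex
% Classical conduction-dominated reference scale: balancing latent-heat release at the
% front against conduction through a solidified layer of thickness \delta gives
\[
  \rho\,L\,\frac{d\delta}{dt} \;\sim\; k_s\,\frac{\Delta T}{\delta}
  \quad\Longrightarrow\quad
  \delta(t) \;\sim\; \sqrt{\frac{k_s\,\Delta T}{\rho\,L}\,t},
\]
% where $k_s$ is the solid thermal conductivity, $\rho$ the density, $L$ the latent heat
% and $\Delta T$ the imposed temperature difference across the solid layer.
```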