953 results for emerging technology


Relevance: 60.00%

Abstract:

The literature offers limited insight into the Bluetooth-protocol-based data acquisition process and into the accuracy and reliability of analyses performed using such data. This paper extends the body of knowledge surrounding the use of data from the Bluetooth Media Access Control Scanner (BMS) as a complementary traffic data source. A multi-layer simulation model named Traffic and Communication Simulation (TCS) is developed. TCS is utilised to model the theoretical properties of the BMS data and to analyse the accuracy and reliability of travel time estimation using the BMS data.
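The core of BMS-based travel time estimation can be illustrated with a minimal sketch (this is not the paper's TCS model; all names and numbers are hypothetical): detections of the same anonymised MAC address at two scanner stations are matched, and the timestamp difference yields a travel time sample.

```python
from datetime import datetime, timedelta

def match_travel_times(upstream, downstream, max_tt=timedelta(minutes=30)):
    """Match MAC detections between two BMS stations.

    upstream/downstream: dicts mapping an (anonymised) MAC address to its
    detection time at that station. Only devices seen at both stations,
    with a plausible positive travel time, yield a sample.
    """
    travel_times = {}
    for mac, t_up in upstream.items():
        t_down = downstream.get(mac)
        if t_down is not None and timedelta(0) < (t_down - t_up) <= max_tt:
            travel_times[mac] = t_down - t_up
    return travel_times

# Hypothetical detections from two scanners along a corridor
up = {"aa:01": datetime(2024, 1, 1, 8, 0, 0),
      "aa:02": datetime(2024, 1, 1, 8, 1, 0)}
down = {"aa:01": datetime(2024, 1, 1, 8, 5, 30),
        "aa:03": datetime(2024, 1, 1, 8, 6, 0)}
tts = match_travel_times(up, down)
# Only "aa:01" was detected at both stations, giving one travel time sample.
```

Real deployments must also handle repeated detections of the same device within a scanner's detection zone, which is one source of the accuracy questions the paper studies.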

Relevance: 60.00%

Abstract:

In vivo confocal microscopy (IVCM) is an emerging technology that provides minimally invasive, high resolution, steady-state assessment of the ocular surface at the cellular level. Several challenges still remain but, at present, IVCM may be considered a promising technique for clinical diagnosis and management. This mini-review summarizes some key findings in IVCM of the ocular surface, focusing on recent and promising attempts to move “from bench to bedside”. IVCM allows prompt diagnosis, disease course follow-up, and management of potentially blinding atypical forms of infectious processes, such as Acanthamoeba and fungal keratitis. This technology has improved our knowledge of corneal alterations and some of the processes that affect the visual outcome after lamellar keratoplasty and excimer keratorefractive surgery. In dry eye disease, IVCM has provided new information on the whole ocular-surface morphofunctional unit. It has also improved understanding of pathophysiologic mechanisms and helped in the assessment of prognosis and treatment. IVCM is particularly useful in the study of corneal nerves, enabling description of the morphology, density, and disease-induced or surgically induced alterations of nerves, particularly the subbasal nerve plexus. In glaucoma, IVCM constitutes an important aid in evaluating filtering blebs, better understanding the conjunctival wound healing process, and assessing corneal changes induced by topical antiglaucoma medications and their preservatives. IVCM has significantly enhanced our understanding of the ocular response to contact lens wear. It has provided new perspectives at a cellular level on a wide range of contact lens complications, revealing findings that it was not previously possible to image in the living human eye. The final section of this mini-review focuses on advances in confocal microscopy imaging, including 2D wide-field mapping, 3D reconstruction of the cornea, and automated image analysis.

Relevance: 60.00%

Abstract:

Organic light emitting diodes (OLEDs), as an emerging technology for display and solid-state lighting applications, have many advantages including self-emission, light weight, flexibility, low driving voltage, low power consumption, and low production cost. With advances in light-emitting materials and device architecture optimization, mobile phones and televisions based on OLED technology are already on the market. However, obtaining efficient, stable and pure blue emission remains more challenging than producing lower-energy colors. Full color displays and pure white light can be achieved only with stable blue-emitting materials. To address this issue, significant effort has been devoted over the past decade to developing novel blue light-emitting materials, aiming to further improve device efficiency, emission color quality, and device lifetime. This review focuses on recent efforts in the synthesis and device performance of small molecules, oligomers and polymers for blue emission in organic electroluminescent devices.

Relevance: 60.00%

Abstract:

In 2009, the National Research Council of the National Academies released a report on A New Biology for the 21st Century. The council preferred the term ‘New Biology’ to capture the convergence and integration of the various disciplines of biology. The National Research Council stressed: ‘The essence of the New Biology, as defined by the committee, is integration—re-integration of the many sub-disciplines of biology, and the integration into biology of physicists, chemists, computer scientists, engineers, and mathematicians to create a research community with the capacity to tackle a broad range of scientific and societal problems.’ They define the ‘New Biology’ as ‘integrating life science research with physical science, engineering, computational science, and mathematics’. The National Research Council reflected: 'Biology is at a point of inflection. Years of research have generated detailed information about the components of the complex systems that characterize life––genes, cells, organisms, ecosystems––and this knowledge has begun to fuse into greater understanding of how all those components work together as systems. Powerful tools are allowing biologists to probe complex systems in ever greater detail, from molecular events in individual cells to global biogeochemical cycles. Integration within biology and increasingly fruitful collaboration with physical, earth, and computational scientists, mathematicians, and engineers are making it possible to predict and control the activities of biological systems in ever greater detail.' The National Research Council contended that the New Biology could address a number of pressing challenges. First, it stressed that the New Biology could ‘generate food plants to adapt and grow sustainably in changing environments’. Second, the New Biology could ‘understand and sustain ecosystem function and biodiversity in the face of rapid change’. Third, the New Biology could ‘expand sustainable alternatives to fossil fuels’. 
Moreover, it was hoped that the New Biology could lead to a better understanding of individual health: ‘The New Biology can accelerate fundamental understanding of the systems that underlie health and the development of the tools and technologies that will in turn lead to more efficient approaches to developing therapeutics and enabling individualized, predictive medicine.’ Biological research has certainly been changing direction in response to changing societal problems. Over the last decade, increasing awareness of the impacts of climate change and dwindling supplies of fossil fuels can be seen to have generated investment in fields such as biofuels, climate-ready crops and storage of agricultural genetic resources. In considering biotechnology’s role in the twenty-first century, Carlson’s biotechnology forecasting firm Biodesic states: ‘The problems the world faces today – ecosystem responses to global warming, geriatric care in the developed world or infectious diseases in the developing world, the efficient production of more goods using less energy and fewer raw materials – all depend on understanding and then applying biology as a technology.’ This collection considers the roles of intellectual property law in regulating emerging technologies in the biological sciences. Stephen Hilgartner comments that patent law plays a significant part in social negotiations about the shape of emerging technological systems or artefacts: 'Emerging technology – especially in such hotbeds of change as the life sciences, information technology, biomedicine, and nanotechnology – became a site of contention where competing groups pursued incompatible normative visions. Indeed, as people recognized that questions about the shape of technological systems were nothing less than questions about the future shape of societies, science and technology achieved central significance in contemporary democracies.
In this context, states face ongoing difficulties trying to mediate these tensions and establish mechanisms for addressing problems of representation and participation in the sociopolitical process that shapes emerging technology.' The introduction to the collection will provide a thumbnail, comparative overview of recent developments in intellectual property and biotechnology – as a foundation to the collection. Section I of this introduction considers recent developments in United States patent law, policy and practice with respect to biotechnology – in particular, highlighting the Myriad Genetics dispute and the decision of the Supreme Court of the United States in Bilski v. Kappos. Section II considers the cross-currents in Canadian jurisprudence in intellectual property and biotechnology. Section III surveys developments in the European Union – and the interpretation of the European Biotechnology Directive. Section IV focuses upon Australia and New Zealand, and considers the policy responses to the controversy of Genetic Technologies Limited’s patents in respect of non-coding DNA and genomic mapping. Section V outlines the parts of the collection and the contents of the chapters.

Relevance: 60.00%

Abstract:

Hand-held ebook readers present many challenges for Australian libraries that want to integrate this emerging technology into their services. In 2001, both Toowoomba City Library and the Brisbane City Council Library Service embarked on such projects. This paper reports on the differing experiences of these two public library services, outlining difficulties encountered, customer reactions to the technology, and the central issues that acquiring and circulating these readers pose for public libraries in Australia.

Relevance: 60.00%

Abstract:

Nature, science and technology: the image of Finland through popular enlightenment texts, 1870-1920

This doctoral thesis looks at how Finnish popular enlightenment texts published between 1870 and 1920 took part in the process of forming a genuine Finnish national identity. The same process was occurring in other Nordic countries at the time, and the process in Finland was in many ways influenced by them, particularly Sweden. In Finland, the political realities under Russian rule, especially during the Russification years, and the fact that its history was considered short compared to that of other European countries, made this nation-building process unique. The undertaking was led by members of the national elite, influential in the cultural, academic and political arenas, who were keen to support the foundation of a modern Finnish identity. The political realities and the national philosophy of history necessitated a search for elements of identity in nature and the Finnish landscape, which were considered to have special national importance: Finland was very much defined as a political entity on the basis of its geography and nature. Nature was also used as a means of taking a cultural or political position in terms of, for example, geographical facts such as the nation's borders or the country's geographical connections to Western Europe. In the building of a proper national identity the concept of nature was not, however, static, but was affected by political and economic progress in society. This meant that nature, or the image of the national landscape, was no longer seen only as a visual image of the national identity, but also as a source of science, technology and a prosperous future. The role of technology in this process was very much connected to the ability to harness natural resources to serve national interests.
The major change in this respect had occurred by the early 20th century, when indisputable scientific progress altered the relationship between nature and technology. Concerning technology, the thesis is mainly interested in the large and, at the time, modern technological manifestations in Finland, such as railways, factories and industrial areas. Although the symbiosis between national nature and international but successfully localized technology was depicted in Finnish popular enlightenment literature mostly as a national success story, concerns began to arise as early as the last years of the 19th century. It was argued that the emerging technology would eventually destroy Finland's natural environment, and therefore the basis of its national identity. The question was not how to preserve nature through natural science, but rather how to conserve such natural resources and images as were considered the basis of national identity and thus of national history. National parks, isolated from technology and distant enough to have no economic value, were considered the solution to the problem. Methodologically the thesis belongs to the genre of science and technology studies, and it offers new viewpoints with regard to both the study of Finnish popular enlightenment literature and the national development process as a whole.

Relevance: 60.00%

Abstract:

Background: Optometry students are taught the process of subjective refraction through lectures and laboratory-based practicals before progressing to supervised clinical practice. Simulated learning environments (SLEs) are an emerging technology used in a range of health disciplines; however, there is limited evidence regarding the effectiveness of clinical simulators as an educational tool. Methods: Forty optometry students (20 fourth-year and 20 fifth-year) were assessed twice by a qualified optometrist (two examinations separated by 4-8 weeks) while completing a monocular non-cycloplegic subjective refraction on the same patient, with an unknown refractive error simulated using contact lenses. Half of the students were granted access to an online SLE, the Brien Holden Vision Institute (BHVI®) Virtual Refractor, and the remaining students formed a control group. The primary outcome measures at each visit were accuracy of the clinical refraction compared to a qualified optometrist and relative to the Optometry Council of Australia and New Zealand (OCANZ) subjective refraction examination criteria. Secondary measures of interest included descriptors of student SLE engagement, student self-reported confidence levels, and correlations between performance in the simulated and real-world clinical environments. Results: Eighty percent of students in the intervention group interacted with the SLE (for an average of 100 minutes); however, there was no correlation between measures of student engagement with the BHVI® Virtual Refractor and the speed or accuracy of clinical subjective refractions. Fifth-year students were typically more confident and refracted more accurately and quickly than fourth-year students.
A year group by experimental group interaction (p = 0.03) was observed for accuracy of the spherical component of refraction, and post hoc analysis revealed that less experienced students exhibited greater gains in clinical accuracy following exposure to the SLE intervention. Conclusions: Short-term exposure to a SLE can positively influence clinical subjective refraction outcomes for less experienced optometry students and may be of benefit in increasing the skills of novice refractionists to levels appropriate for commencing supervised clinical interactions.

Relevance: 60.00%

Abstract:

The Body Area Network (BAN) is an emerging technology that focuses on monitoring physiological data in, on and around the human body. BAN technology permits wearable and implanted sensors to collect vital data about the human body and transmit it to other nodes via low-energy communication. In this paper, we investigate interactions in terms of data flows between parties involved in BANs under four different scenarios targeting outdoor and indoor medical environments: hospital, home, emergency and open areas. Based on these scenarios, we identify data flow requirements between BAN elements such as sensors and control units (CUs) and parties involved in BANs such as the patient, doctors, nurses and relatives. Identified requirements are used to generate BAN data flow models. Petri Nets (PNs) are used as the formal modelling language. We check the validity of the models and compare them with the existing related work. Finally, using the models, we identify communication and security requirements based on the most common active and passive attack scenarios.
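A minimal illustration of the kind of Petri-net data-flow modelling described above (this is not the paper's actual model; the place and transition names are invented for the sketch): a marking assigns tokens to places, and a transition fires by consuming tokens from its input places and producing tokens in its output places.

```python
def enabled(marking, pre):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume tokens per `pre`, produce per `post`."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical BAN fragment: a sensor reading flows to the control unit (CU),
# which forwards it to the doctor and becomes ready for the next reading.
m0 = {"sensor_data": 1, "cu_ready": 1}
t_collect = ({"sensor_data": 1, "cu_ready": 1}, {"cu_buffer": 1})
m1 = fire(m0, *t_collect)
t_forward = ({"cu_buffer": 1}, {"doctor_inbox": 1, "cu_ready": 1})
m2 = fire(m1, *t_forward)
```

Token-game semantics like this is what makes the models checkable: reachable markings correspond to possible data-flow states, which is how validity and attack scenarios can be analysed.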

Relevance: 60.00%

Abstract:

Two-dimensional magnetic recording (TDMR) is an emerging technology that aims to achieve areal densities as high as 10 Tb/in² using sophisticated 2-D signal-processing algorithms. High areal densities are achieved by reducing the size of a bit to the order of the size of magnetic grains, resulting in severe 2-D intersymbol interference (ISI). Jitter noise due to irregular grain positions on the magnetic medium is more pronounced at these areal densities. Therefore, a viable read-channel architecture for TDMR requires 2-D signal-detection algorithms that can mitigate 2-D ISI and combat noise comprising jitter and electronic components. The partial-response maximum-likelihood (PRML) detection scheme allows controlled ISI as seen by the detector. With the controlled and reduced span of 2-D ISI, the PRML scheme overcomes practical difficulties such as the Nyquist-rate signaling required for full-response 2-D equalization. As in the case of 1-D magnetic recording, jitter noise can be handled using a data-dependent noise-prediction (DDNP) filter bank within a 2-D signal-detection engine. The contributions of this paper are threefold: 1) we empirically study the jitter noise characteristics in TDMR as a function of grain density using a Voronoi-based granular media model; 2) we develop a 2-D DDNP algorithm to handle the media noise seen in TDMR; and 3) we develop techniques to design 2-D separable and nonseparable targets for generalized partial-response equalization for TDMR, which can be used along with a 2-D signal-detection algorithm. The DDNP algorithm is observed to give a 2.5 dB gain in SNR over uncoded data compared with noise-predictive maximum-likelihood detection, for the same choice of channel model parameters, to achieve a channel bit density of 1.3 Tb/in² with a media grain center-to-center distance of 10 nm. The DDNP algorithm is also observed to give a gain of approximately 10% in areal density near 5 grains/bit.
The proposed signal-processing framework can broadly scale to various TDMR realizations and areal density points.
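As a rough sketch of the controlled-ISI idea (a toy model, not the paper's channel or detector; the target coefficients are purely illustrative): a separable 2-D partial-response target can be formed as the outer product of two 1-D targets, and each readback sample is a target-weighted sum of neighbouring bits, with optional Gaussian noise standing in for the jitter and electronic components.

```python
import numpy as np

def pr_target_2d(h_down, h_cross):
    """Separable 2-D partial-response target: outer product of 1-D targets."""
    return np.outer(np.asarray(h_down, float), np.asarray(h_cross, float))

def readback_2d(bits, target, noise_sigma=0.0, rng=None):
    """2-D ISI readback: each output sample is a target-weighted sum of the
    neighbouring bits ('valid' correlation), plus optional Gaussian noise."""
    bits = np.asarray(bits, float)
    th, tw = target.shape
    H = bits.shape[0] - th + 1
    W = bits.shape[1] - tw + 1
    out = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(bits[i:i + th, j:j + tw] * target)
    if noise_sigma > 0:
        rng = rng or np.random.default_rng(0)
        out += rng.normal(0.0, noise_sigma, out.shape)
    return out

target = pr_target_2d([1, 2, 1], [1, 2, 1])  # illustrative 3x3 separable target
bits = np.ones((4, 4))                       # all-ones data pattern
y = readback_2d(bits, target)                # every sample sees the full ISI span
```

The detector's job is then to recover `bits` from `y`; DDNP additionally predicts the data-dependent part of the noise before the maximum-likelihood search.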

Relevance: 60.00%

Abstract:

Imaging flow cytometry is an emerging technology that combines the statistical power of flow cytometry with the spatial and quantitative morphology of digital microscopy. It allows high-throughput imaging of cells with good spatial resolution while they are in flow. This paper proposes a general framework for the processing and classification of cells imaged using an imaging flow cytometer. Each cell is localized by finding an accurate cell contour. Then, features reflecting cell size, circularity and complexity are extracted for classification using an SVM. Unlike conventional iterative, semi-automatic segmentation algorithms such as active contours, we propose a non-iterative, fully automatic, graph-based cell localization. To evaluate the performance of the proposed framework, we have successfully classified unstained, label-free leukaemia cell lines MOLT, K562 and HL60 from video streams captured using a custom-fabricated, cost-effective microfluidics-based imaging flow cytometer. The proposed system is a significant step towards building a cost-effective cell analysis platform that would facilitate affordable mass screening camps examining cellular morphology for disease diagnosis. Lay description: In this article, we propose a novel framework for processing the raw data generated by microfluidics-based imaging flow cytometers. Microfluidics microscopy, or microfluidics-based imaging flow cytometry (mIFC), is a recent microscopy paradigm that combines the statistical power of flow cytometry with the spatial and quantitative morphology of digital microscopy, allowing us to image cells while they are in flow. In comparison to conventional slide-based imaging systems, mIFC is a nascent technology enabling high-throughput imaging of cells and is yet to take the form of a clinical diagnostic tool. The proposed framework processes the raw data generated by mIFC systems.
The framework incorporates several steps, beginning with pre-processing of the raw video frames to enhance the contents of the cell, followed by localizing the cell with a novel, fully automatic, non-iterative graph-based algorithm, extraction of different quantitative morphological parameters, and subsequent classification of the cells. In order to evaluate the performance of the proposed framework, we have successfully classified unstained, label-free leukaemia cell lines MOLT, K562 and HL60 from video streams captured using a cost-effective microfluidics-based imaging flow cytometer. The cell lines HL60, K562 and MOLT were obtained from the ATCC (American Type Culture Collection) and were cultured separately in the lab; each culture therefore contains cells from its own category alone and thereby provides the ground truth. Each cell is localized by finding a closed cell contour, obtained by defining a directed, weighted graph on the Canny edge image of the cell such that the closed contour lies along the shortest weighted path surrounding the centroid of the cell, from a starting point on a good curve segment to an immediate endpoint. Once the cell is localized, morphological features reflecting the size, shape and complexity of the cell are extracted and used to develop a support vector machine based classification system. We could classify the cell lines with good accuracy, and the results were quite consistent across different cross-validation experiments. We hope that imaging flow cytometers equipped with the proposed image-processing framework will enable cost-effective, automated and reliable disease screening in overloaded facilities which cannot afford to hire skilled personnel in large numbers. Such platforms could facilitate screening camps in low-income countries, thereby transforming current health care paradigms by enabling rapid, automated diagnosis of diseases like cancer.
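One of the morphological features mentioned, circularity, is commonly computed as 4πA/P², which equals 1 for a perfect circle and decreases as the contour becomes more complex. A minimal sketch (this exact definition is an assumption for illustration, not taken from the paper):

```python
import math

def shape_features(area, perimeter):
    """Size and circularity descriptors of a closed cell contour.

    Circularity = 4*pi*A / P**2: 1.0 for a perfect circle, lower for
    elongated or irregular shapes.
    """
    circularity = 4.0 * math.pi * area / (perimeter ** 2)
    return {"area": area, "perimeter": perimeter, "circularity": circularity}

# Sanity check with a circle of radius 10: area = pi*r^2, perimeter = 2*pi*r
r = 10.0
f = shape_features(math.pi * r * r, 2 * math.pi * r)
```

Feature vectors like this, one per localized cell, are what a downstream SVM classifier would be trained on.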

Relevance: 60.00%

Abstract:

With the near extinction of many spawning aggregations of large grouper and snapper throughout the Caribbean, Gulf of Mexico, and tropical Atlantic, we need to provide baselines for their conservation. Thus, there is a critical need to develop techniques for rapidly assessing the remaining known (and unknown) aggregations. To this end we used mobile hydroacoustic surveys to estimate the density, spatial extent, and total abundance of a Nassau grouper spawning aggregation at Little Cayman Island, Cayman Islands, BWI. Hydroacoustic estimates of abundance, density, and spatial extent were similar on two sampling occasions. The location and approximate spatial extent of the Nassau grouper spawning aggregation near the shelf break were corroborated by diver visual observations. Hydroacoustic density estimates were, overall, three times higher than the average density observed by divers; however, we note that in some instances diver-estimated densities in localized areas were similar to hydroacoustic density estimates. The resolution of the hydroacoustic transects and the geostatistical interpolation may have resulted in overestimates of fish abundance, but still provided reasonable estimates of the total spatial extent of the aggregation. Limitations in bottom time for scuba and in visibility resulted in poor coverage of the entire Nassau grouper aggregation and in low estimates of abundance when compared to hydroacoustic estimates. Although the majority of fish in the aggregation were well off the bottom, fish that were at times in close proximity to the seafloor were not detected by the hydroacoustic survey. We conclude that diver observations of fish spawning aggregations are critical to the interpretation of hydroacoustic surveys, and that hydroacoustic surveys provide a more accurate estimate of overall fish abundance and spatial extent than diver observations.
Thus, hydroacoustics is an emerging technology that, when coupled with diver observations, provides a comprehensive survey method for monitoring spawning aggregations of fish.
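The basic abundance calculation implied here is mean areal density multiplied by the aggregation's spatial extent. A toy sketch with entirely hypothetical numbers (not the study's data) shows how a three-fold density difference and partial diver coverage compound into a large gap between the two estimates:

```python
def abundance_estimate(mean_density, area):
    """Total abundance = mean areal fish density (fish/m^2) x extent (m^2)."""
    return mean_density * area

# Hypothetical survey numbers, for illustration only:
hydro = abundance_estimate(0.05, 60_000.0)          # full acoustic footprint
diver = abundance_estimate(0.05 / 3.0, 20_000.0)    # 1/3 density, 1/3 coverage
ratio = hydro / diver                                # compounded underestimate
```

With these made-up inputs the diver estimate is nine times lower, which is the kind of compounding the abstract attributes to limited bottom time and visibility.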

Relevance: 60.00%

Abstract:

This paper discusses the application of Geographic Information Systems (GIS) to fisheries management. It presents the importance of the emerging technology of GIS, how it can be used to greatly speed up and improve the efficiency of location-optimization processes, and how the technology allows for a thorough examination of the many spatially variable factors which might affect or control fish production, both from aquaculture and from inland fisheries in Nigeria.

Relevance: 60.00%

Abstract:

There has been increasing interest in the use of unconventional materials and morphologies in robotic systems, because the underlying mechanical properties (such as body shape, elasticity, viscosity, softness, density and stickiness) are crucial research topics for our in-depth understanding of embodied intelligence. Detailed investigations of physical system-environment interactions are particularly important for the systematic development of technologies and theories of emergent adaptive behaviors. Based on the presentations and discussion at the Future Emerging Technology (FET11) conference, this article introduces recent technological developments in the field of soft robotics and speculates about the implications and challenges for robotics and embodied intelligence research.

Relevance: 60.00%

Abstract:

Offshore seismic exploration involves high investment and high risk, and the data suffer from many problems, such as multiples. Technology for high-resolution, high-S/N-ratio processing of marine seismic data has therefore become an important research topic. In this paper, based on an analysis of marine seismic exploration, a survey of the literature, and an integration of current mainstream and emerging technology, we propose multi-scale decomposition of both prestack and poststack seismic data using the wavelet and Hilbert-Huang transforms, together with a theory of phase deconvolution, and study the related algorithms. The pyramid algorithm for decomposition and reconstruction is given by the Mallat algorithm of the discrete wavelet transform; here it is introduced into seismic data processing, and its validity is demonstrated with field data. The main idea of the Hilbert-Huang transform is empirical mode decomposition, with which any complicated data set can be decomposed into a finite and often small number of intrinsic mode functions that admit a well-behaved Hilbert transform. After the decomposition, an analytic signal is constructed by the Hilbert transform, from which the instantaneous frequency and amplitude can be obtained; the Hilbert spectrum then follows. This decomposition method is adaptive and highly efficient. Since the decomposition is based on the local characteristic time scale of the data, it is applicable to nonlinear and non-stationary processes. The phenomena of overshoot, undershoot and end swings in the Hilbert-Huang transform are analyzed, and effective methods for eliminating them are studied in the paper. Multi-scale decomposition of both prestack and poststack seismic data enables amplitude-preserving processing, greatly enhances seismic data resolution, and overcomes the problem that conventional methods cannot restore the amplitudes of different frequency components uniformly.
The phase deconvolution method, which overcomes the minimum-phase limitation of traditional deconvolution, better reflects the fact that seismic wavelets are mixed-phase in practical applications, and therefore gives a more reliable result. In the applied research, high-resolution, relative-amplitude-preserving processing results were obtained through careful analysis and application of the above methods to seismic data from four different target areas of the China Sea. Finally, a processing flow and method system was established, which has been applied in actual production and has achieved good progress and substantial economic benefit.
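The analytic-signal step of the Hilbert-Huang workflow can be sketched as follows (a generic FFT-based construction equivalent to the standard Hilbert transform, not the authors' implementation): negative frequencies are zeroed, positive frequencies doubled, and the instantaneous frequency comes from the derivative of the unwrapped phase.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (same construction as scipy.signal.hilbert):
    zero out negative frequencies and double the positive ones."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    if N % 2 == 0:
        h[N // 2] = 1.0
        h[1:N // 2] = 2.0
    else:
        h[1:(N + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def instantaneous_frequency(x, fs):
    """Instantaneous frequency (Hz) from the unwrapped analytic phase."""
    phase = np.unwrap(np.angle(analytic_signal(x)))
    return np.diff(phase) * fs / (2.0 * np.pi)

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.cos(2 * np.pi * 50.0 * t)        # a pure 50 Hz tone
f_inst = instantaneous_frequency(x, fs)  # should sit near 50 Hz throughout
```

In the full Hilbert-Huang transform this step is applied to each intrinsic mode function after empirical mode decomposition, giving the time-frequency (Hilbert) spectrum.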

Relevance: 60.00%

Abstract:

This PhD thesis investigates the potential use of science communication models to engage a broader swathe of actors in decision making in relation to scientific and technological innovation, in order to address possible democratic deficits in science and technology policy-making. A four-pronged research approach has been employed to examine different representations of the public(s) and different modes of engagement. The first case study investigates whether patient groups could represent an alternative, needs-driven approach to biomedical and health sciences R&D. This is followed by an enquiry into the potential for Science Shops to represent a bottom-up approach to promoting research and development of local relevance. The barriers and opportunities for the involvement of scientific researchers in science communication are next investigated via a national survey, comparable to a similar survey conducted in the UK. The final case study investigates to what extent opposition to or support for nanotechnology (as an emerging technology) is reflected amongst the YouTube user community, and the findings are considered in the context of how support for or opposition to new or emerging technologies can be addressed using conflict-resolution-based approaches to manage potential conflict trajectories. The research indicates that the majority of communication exercises of relevance to science policy and planning take the form of a one-way flow of information, with little or no facility for public feedback. This thesis proposes that a more bottom-up approach to research and technology would help broaden acceptability of, and accountability for, decisions made relating to new or existing technological trajectories. This approach could be better integrated with, and complementary to, the activities of government, institutions (e.g. universities) and research funding agencies, and would help ensure that public needs and issues are addressed directly by the research community.
Such approaches could also facilitate empowerment of societal stakeholders regarding scientific literacy and agenda-setting. One-way information relays could be adapted to facilitate feedback from representative groups e.g. Non-governmental organisations or Civil Society Organisations (such as patient groups) in order to enhance the functioning and socio-economic relevance of knowledge-based societies to the betterment of human livelihoods.